What If Donald Trump Resigned?

Two-thirds of the American public (give or take a little) now believe that it is time for our President to stop being President. Trump should no longer have the power to take us to war on a whim or to ruin the careers of our leaders.

 

There has been endless, somewhat idle discussion of “Impeachment” in Congress. It hasn’t proven, so far at least, to be the answer to our dilemma. Considerable agreement has developed that a case for an Exodus needs to be made—and soon. While that case can be made (by lawyers, by partisans, by the impatient, and by those who take our foreign affairs exceptionally seriously), there is a plain truth: we’re getting nowhere.

 

Tempers have risen as the convoluted months have passed. Countless speeches have been made urging change—and not just in favor of immediate action.  There are among us political party members who pause, consider, maybe show some sadness, and dwell a bit drearily on the theme:  “Yes, I know he really has to go.  But we’re getting nowhere.”

 

I have slowly arrived at a point of view. Oh, I’ve done what I can: I’ve written three substantial articles that unreservedly attack President Donald J. Trump’s performance in office. They were a pleasure to write and then to read—if a frustrating one. To the extent there has been a reaction, it has been favorable enough, but mostly ineffective. “Yes,” vast numbers say, “he does have to go.”

 

If we agree pretty much on the need for Trump’s departure, the time is very much at hand to ask, essentially: What does he think about it? What does he want, mid-term in the White House? Does he think there has been enough roughhousing, yelling, defiance, and repudiation of important leaders at times and for reasons that are bound to be embarrassing? Enough persecution of, really rudeness to, the Press? Could it be that our peerless leader is agreeable to returning himself to a variety of estates and golf courses?

 

Thinking about his “situation” and the unpleasant circumstances that are slowly developing for us and for him, it does seem to this observer that a moment of crisis is approaching. What, then, is the Path I see to some kind of solution?

 

Since writing the initial draft of this article, our good Nation has sent an aircraft carrier squadron to the Persian Gulf as an all-too-obvious threat to the Iranian government. This aggressive action has been taken entirely on the initiative of the one who has other choices! Military engagement is not the option that will bring him a true and lasting sense of well-being. He need not suffer legal confrontations, speech and rebuttal, partisan challenges, and never-ending indignities to family members (deserved or not). As the days drag on, it is so very apparent that there is a tenable solution:

 

The Honorable leader of the executive branch of the United States should RESIGN at a very early opportunity. The President should not drag his feet until the Situation gets too hot to handle.

 

Yes, the owner of “the Trump estate,” that husband of a lovely lady, parent of stalwart children, and regular commuter to Mar-a-Lago and traveler to random places worldwide in government airplanes, should once and for all  take the terrible pressure off his mind and his health by JUST DEPARTING.

 

When President Richard Nixon finally decided the time had come, he wrote a one-line notification of what he was doing. It sufficed then. But noticeably more than that is needed now. The President will want to offer his point of view to Posterity! Believe it or not, we the Public will be receptive to considering and weighing his final point of view.

 

 I have thought about it.  Here is a tentative draft resignation that I think might serve presidential needs and history as well:

 

“I am today resigning the position of President of the United States, effective at the time of transmitting this letter to the Congress.  The never ending turmoil surrounding daily and weekly events is beginning to be a considerable strain on my  well-being.  I fear that it will affect my physical condition before too long. 

 

“The position I have been occupying is one of never ending, constant responsibility. It has had its rewards, for me and members of my family. I feel I have served my Country well.

 

“I could continue—waging the never-ending political battles that so entrance those for whom politics is a lifetime pursuit. But I am increasingly aware that Life has other rewards in store for me—provided I treat it with careful regard.

 

“As I say goodbye, I trust that observers will weigh with proper regard the several aspects of my presidency—partisan or not—and arrive at a balanced verdict on my shortened career as President.

 

“I wish my successors well.  Overall,  I am quite certain that my impact on the Presidency of the United States has been positive.”

 

DONALD J. TRUMP                                   

 

The letter above, drafted cautiously and respectfully from clear across the Nation (yet still a Draft), is the best I can offer for consideration at this point. It should not bear my name. “Draft Letter for Consideration” is intended as a title and should suffice.

 

I am suggesting this avenue as a possible way—sometime in the near future—to bring an end to the several crises into which our beloved Country has gradually worked itself, and to avoid any and all wars which may ominously be waiting out there! Our Leader will write his own letter, of course—and by no means do I expect it will be more than a tiny bit influenced by my ordinary citizen’s prose—if indeed that. (I have no illusions that my prose will be the words finally chosen!)

 

Do be of good faith, fellow citizens of whatever persuasion. We must avoid additional unpleasantness—and far worse! Keep calm on the domestic front, and by all means be patient. Rise above partisanship. Let’s meet our Leader halfway on the course I suggest, which, if taken, may just point the way to a better future for all Americans.

Cassius Marcellus Clay and Muhammad Ali: What’s in a Name?

 

A new documentary on Muhammad Ali, What’s My Name?, is debuting on HBO, depicting the life and career of the man once known as Cassius Marcellus Clay Jr.

 

What is in a name? 

 

To Ali his name meant everything. 

 

As a newly converted Ali told the media:

 

“Cassius Clay is a slave name. I didn’t choose it and I don’t want it. I am Muhammad Ali, a free name – it means beloved of God, and I insist people use it when people speak to me.”

 

Ali was not joking. During a pre-fight interview with Ernie Terrell before their February 1967 fight at the Astrodome, Ali, as ABC’s Howard Cosell called him, regaled viewers with one of his patented poems to taunt Terrell, who responded by calling him Clay. Ali was not amused, demanding to know why Terrell insisted on calling him Clay and warning that he was going to pay.

 

“What’s my name?” yelled Ali in the eighth round, as he pounded Terrell with jab after jab, breaking his eye socket, on the way to a 15-round decision, one of the more merciless beatings in boxing history not to end in a knockout. He had only one more fight in his career: he lost his boxing license for refusing to be drafted into the United States Army on religious grounds.

 

When asked what his new Muslim name meant, the man heretofore known as Cassius Clay responded, “Worthy of praise, the most-high.”

 

Ali is more than an icon of sport. Ali’s life was emblematic of so many social themes: race, capitalism, war and peace, civil disobedience, freedom of religion, ostracization and redemption. He transcended sport; he was overtly political. Ali became a cultural touchstone and symbol of change during a time when race and religion, then as now, were defining paradigms of national discourse. Most importantly, he spoke his truth.

 

But what about the name Cassius Marcellus Clay? Said a young Ali when interviewed before the Olympic trials: 

 

“I am Cassius Marcellus Clay VI; my great grandfather was a slave and was named after some great Kentuckian…Cassius Marcellus Clay is [a] great name in Kentucky and really where he was from, I couldn't tell you. Now that [I've] obtained a little fame people want to know where I am from now, I am going to or have to look it up and see what it's all about now that I am getting a [few] interviews.”

 

The man for whom Ali was named, Cassius Marcellus Clay, also risked his livelihood and even his life to stand up for what he believed. 

 

Cassius Marcellus Clay turned his back on his own culture, put himself at the fore of social change and became one of the leading Southern Abolitionists of the 19th century. Like Ali, he was born in Kentucky and like Ali, it was American racial inequality and social unrest that changed Clay’s life and sent him on a course of political activism. Like Ali he was steadfast in his beliefs and had the force of personality to match.

 

He was a relative of the famed Whig politician Henry Clay, who espoused antislavery ideas but owned slaves throughout his life. His father was the largest slaveholder in Kentucky, and it was in that milieu that his conscience was first awakened to the evils of slavery. Abolitionism became the defining theme of Clay’s political career and life.

 

As a Yale student with political connections, he had the fortune to encounter many of the leading Northern Abolitionists, first meeting Daniel Webster and then William Lloyd Garrison, whom he heard speak. Garrison’s rhetoric and unrelenting political action served as a catalyst to inspire the young Clay: “the good seed which Garrison had watered, and which my own bitter experience had sown, aroused my whole soul.”

 

When he went back to Kentucky he continued to fight for the cause of abolition. Kentucky was at the epicenter of the debate over slavery and union. 

 

Clay was elected to the Kentucky legislature for three terms as a Whig beginning in 1836, but eventually followed in the footsteps of Garrison and started the True American abolitionist newspaper. The newspaper was repeatedly threatened and denounced by decree. Clay wrote in his 1885 memoir:

 

My object was to use a State and National Constitutional right—the Freedom of the Press — to change our National and State laws, so as, by a legal majority, to abolish slavery. There was danger, of course, of mob-violence…and I determined to defend my rights by force, if need be. 

 

In the 1850s he joined the newly formed Republican Party, though he didn’t always see eye to eye with it. He eventually aligned himself with Abraham Lincoln, with whom he shared many of the same views. Clay vigorously campaigned for Lincoln, rousing audiences with speeches and shouting down those who wanted to silence him. In one of the hotbeds of political unrest, on the precipice of Civil War, Clay stood for what he believed in: republicanism and the abolition of slavery.

 

Clay, as one can tell from his memoirs, was like Ali never one for humility; he notes that his name was bandied about for vice president and that, had he been present at the Republican Convention of 1860, he might have been chosen over Hannibal Hamlin of Maine.

 

As it was, he was promised a position in what Doris Kearns Goodwin would later call the “Team of Rivals,” but the cabinet was full.

 

Eventually, he was given the position of Ambassador to the Empire of Russia, where he was instrumental in gaining recognition for the Union and preventing countries like Britain from recognizing the Confederacy for economic gain. Though seldom spoken of, his contribution was essential to the war effort.

 

Clay also advocated for Emancipation as an act of war as early as 1856. Clay writes that he urged Lincoln to issue the Emancipation Proclamation in 1862. He did object, however, that the Proclamation did not apply to areas already under Union control. Although he was in Russia and not there to see it, Clay received many a laudatory letter from men like Garrison and Wendell Phillips when Emancipation became a reality.

 

For Ali, his stand against the Vietnam War nearly ended his sporting career; for Clay, his political stance was a matter of life and death. While he was debating the merits of abolitionism – he had opposed the annexation of Texas because of slavery, despite fighting in the Mexican War – what began as a peaceful engagement became violent. Clay was shot by a mob planning to kill him. He had to defend his life with his knife, killing one of his assailants in self-defense.

 

It is ironic that Ali, who made his living as a pugilist, took a peaceful political stance, while his namesake, who made his living as a political figure on the soapbox, almost had his life and career cut short by violence. Yet they share a common bond, each willing to risk ostracization for what he believed.

 

For Ali, that meant standing up for his religious beliefs, and for a time becoming something of a national pariah among many who didn’t understand his conversion or agree with his refusal to fight in the war. His refusal to be drafted and inducted into the military led to the loss of his boxing license and a prolonged legal battle. Eventually Ali would be vindicated by the law of the land, an 8-0 Supreme Court vote overturning his conviction on the grounds of conscientious objection. He became one of the most beloved and recognized men on earth, and many see him as a symbol of greatness and national pride. Ali lit the torch at the 1996 Olympic Games and received the Presidential Medal of Freedom.

 

Like Ali, Clay would not be silenced.

 

Said Pulitzer’s New World: 

 

Cassius M. Clay won another victory for free speech, and struck a good blow in behalf of Republicanism…Mr. Clay had publicly announced, through both the papers issued at Richmond, that he intended to speak on this occasion, and the subject was much canvassed in the streets. The more violent portion of the Revolutionary Committee, we learn, were for silencing him.

 

Each felt a call to action that changed his life. Each eschewed public opinion and mounting vitriol to assert his ideals and stand for what he believed, using his gift of rhetoric to let people know just what he thought. Each man has markedly impacted some of the pervading narratives of American history: race, social equality and national identity.

 

The two men, born Cassius Marcellus Clay, have a lot in common, showing that name, birth and background don’t necessarily dictate one’s impact; rather, it is acculturation and moral courage that do. Both Ali and his namesake are connected by one moniker, and while one man eschewed the name Cassius Clay, the abolitionist and the athlete alike are synonymous with courage and social change.

Roundup Top 10!

 

It’s time to stop viewing pregnant women as threats to their babies

by Kathleen Crowther

How Georgia is continuing a centuries-long tradition, and why it must stop.

 

Why we can — and must — create a fairer system of traffic enforcement

by Sarah A. Seo

Its discretionary nature has left it ripe for abuse.

 

 

If judicial nominees don’t support ‘Brown v. Board,’ they don’t support the rule of law

by Sherrilyn Ifill

Few of us — no matter our race, color or creed — would recognize our democracy or legal system without the changes touched off by this momentous civil rights case.

 

 

How anti-immigrant policies thwart scientific discovery

by Violet Moller

By hindering international collaboration, the Trump administration has triggered a “brain drain.”

 

 

Why We Still Care About America’s Founders

by Rick Atkinson

Despite their flaws, their struggle continues to speak to the nation we want to become.

 

 

Rashida Tlaib Has Her History Wrong

by Benny Morris

The representative’s account of the Arab-Israeli conflict relies on origin myths about the birth of Israel.

 

 

A Whitewashed Monument to Women’s Suffrage

by Brent Staples

A sculpture that’s expected to be unveiled in Central Park next year ignores the important contributions of black women.

 

 

Redacting Democracy

by Karen J. Greenberg

What You Can’t See Can Hurt You

 

 

Men Invented ‘Likability.’ Guess Who Benefits.

by Claire Bond Potter

It was pushed by Madison Avenue and preached by self-help gurus. Then it entered politics.

 

 

 

Special Focus: Impeachment

What Democrats Can Learn About Impeachment From the Civil War

by Jamelle Bouie

Lesson One: Don’t let Trump take the initiative.

 

How the Mueller report could end the Trump presidency without impeachment

by Jasmin Bath

Democrats should run on a message from 1860: You need a president you can trust.

 

 

An Open Memo: Comparison of Clinton Impeachment, Nixon Impeachment and Trump Pre-Impeachment

by Sidney Blumenthal

The facts and history indicate that the Clinton case bears little if any relevance to the Trump one, while the Nixon case shows great similarity to Trump’s.

 

 

The Precedent for Impeachment: Nixon, Not Clinton

by Kevin Kruse and Julian Zelizer

"Blumenthal, who had a front row seat to the Clinton drama, understands that there are major differences between these two instances."

 

The D-Day Warriors Who Led The Way to Victory in World War II

 

From THE FIRST WAVE: The D-Day Warriors Who Led The Way to Victory in World War II by Alex Kershaw, published by Dutton, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2019 by Alex Kershaw.

 

The clock in the war room at Southwick House showed 4 a.m. The nine men gathered in the twenty‐five‐by‐fifty‐foot former library, its walls lined with empty bookshelves, were anxiously sipping cups of coffee, their minds dwelling on the Allies’ most important decision of World War II. Outside in the darkness, a gale was blowing, angry rain lashing against the windows. “The weather was terrible,” recalled fifty‐three‐year‐old Supreme Allied Commander Dwight Eisenhower. “Southwick House was shaking. Oh, it was really storming.” Given the atrocious conditions, would Eisenhower give the final go‐ahead or postpone? He had left it until now, the very last possible moment, to decide whether or not to launch the greatest invasion in history.

Seated before Eisenhower in upholstered chairs at a long table covered in a green cloth were the commanders of Overlord: the no‐nonsense Missourian, General Omar Bradley, commander of US ground forces; the British General Bernard Law Montgomery, commander of the 21st Army Group, casually attired in his trademark roll‐top sweater and corduroy slacks; Admiral Sir Bertram Ramsay, the naval commander who had orchestrated the “miracle of Dunkirk”—the evacuation of more than 300,000 troops from France in May 1940; the pipe‐smoking Air Chief Arthur Tedder, also British; Air Chief Marshal Sir Trafford Leigh‐Mallory, whose blunt pessimism had caused Eisenhower considerable anguish; and Major General Walter Bedell Smith, Eisenhower’s chief of staff.

A dour and tall Scotsman, forty‐three‐year‐old Group Captain James Stagg, Eisenhower’s chief meteorologist, entered the library and stood on the polished wood floor before Overlord’s commanders. He had briefed Eisenhower and his generals every twelve hours, predicting the storm that was now rattling the windows of the library, which had already led Eisenhower to postpone the invasion from June 5 to June 6. Then, to Eisenhower’s great relief, he had forecast that there would, as he had put it with a slight smile, “be rather fair conditions” beginning that afternoon and lasting for thirty‐six hours.

Once more, Stagg gave an update. The storm would indeed start to abate later that day.

Eisenhower got to his feet and began to pace back and forth, hands clasped behind him, chin resting on his chest, tension etched on his face.

 

 

What if Stagg was wrong? The consequences were beyond bearable. But to postpone again would mean that secrecy would be lost. Furthermore, the logistics of men and supplies, as well as the tides, dictated that another attempt could not be made for weeks, giving the Germans more time to prepare their already formidable coastal defenses.

Since January, when he had arrived in England to command Overlord, Eisenhower had been under crushing, ever greater strain. Now it had all boiled down to this decision. Eisenhower alone—not Roosevelt, not Churchill—had the authority to give the final command to go, to “enter the continent of Europe,” as his orders from on high had stated, and “undertake operations aimed at the heart of Germany and the destruction of her armed forces.” He alone could pull the trigger.

Marshaling the greatest invasion in the history of war had been, at times, as terrifying as the very real prospect of failure. The last time there had been a successful cross-Channel attack was 1066, almost a millennium ago. The scale of this operation had been almost too much to grasp. More than 700,000 separate items had formed the inventory of what was required to launch the assault. Dismissed by some British officers as merely a “coordinator, a good mixer,” the blue-eyed Eisenhower, celebrated for his broad grin and easy charm, had nevertheless imposed his will, working eighteen-hour days, reviewing and tweaking plans to launch some seven thousand vessels, twelve thousand planes, and 160,000 troops to hostile shores.

Eisenhower had overseen vital changes to the Overlord plan. A third more troops had been added to the invasion forces, of whom fewer than 15 percent had actually experienced combat. Heeding General Montgomery’s concerns, Eisenhower had ensured that the front was broadened to almost sixty miles of coast, with a beach code‐named Utah added at the base of the Cotentin Peninsula, farthest to the west. It had been agreed, after Eisenhower had carefully managed the “bunch of prima donnas,” most of them British, who made up his high command—the men gathered now before him—that the attack by night should benefit from the rays of a late‐rising moon.

In addition, it was decided that the first wave of seaborne troops would land at low tide to avoid being ripped apart by beach obstacles. An elaborate campaign of counterintelligence and outright deception, Operation Fortitude, had hopefully kept the Germans guessing as to where and when the Allies would land, providing the critical element of surprise. Hopefully, Erwin Rommel, the field marshal in charge of German forces in Normandy, had not succeeded in fortifying the coast to the extent that he had demanded. Hopefully, the Allies’ greatest advantage—their overwhelming superiority in air power— would make all the difference. Hopefully.

Not even Eisenhower was confident of success. “We are not merely risking a tactical defeat,” he had recently confided to an old friend back in Washington. “We are putting the whole works on one number.” Among Eisenhower’s most senior generals, even now, at the eleventh hour, there was precious little optimism.

Still pacing, Eisenhower thrust his chin in the direction of Montgomery. He was all for going. So was Tedder. Leigh‐Mallory, ever cautious, thought the heavy cloud cover might prove disastrous.

Stagg left the library and its cloud of pipe and cigarette smoke. There was an intense silence; each man knew how immense this moment was in history. The stakes could not be higher. There was no plan B. Nazism and its attendant evils— barbarism, unprecedented genocide, the enslavement of tens of millions of Europeans—might yet prevail. The one man in the room whom Eisenhower genuinely liked, Omar Bradley, believed that Overlord was Hitler’s “greatest danger and his greatest opportunity. If the Overlord forces could be repulsed and trounced decisively on the beaches, Hitler knew it would be a very long time indeed before the Allies tried again—if ever.”

Six weeks before, V Corps commander General Leonard Gerow had written to Eisenhower outlining grave doubts, even though it was too late to do much to alter the overall Overlord plan. It was distressingly clear, after the 4th Division had lost an incredible 749 men—killed in a single practice exercise on April 28 on Slapton Sands—that the Royal Navy and American troops were not working well together. Apart from the appallingly chaotic practice landings—the woeful yet final dress rehearsals—the defensive obstacles sown all along the beaches in Normandy were especially concerning.

Eisenhower had chided Gerow for his skepticism. Gerow had shot back that he was not being “pessimistic” but simply “realistic.” And what of the ten missing officers from the disaster at Slapton Sands who had detailed knowledge of the D‐Day operations, the most important secret in modern history? They knew about “Hobart’s Funnies,” the assortment of tanks specially designed to cut through Rommel’s defenses—including flail tanks that cleared mines with chains, and DUKWs, the six‐wheeled amphibious trucks that would take Rangers to within yards of the steep Norman cliffs—and they knew exactly where and when the Allies were landing. Was it really credible to assume that the Germans had not been tipped off, that so many thousands of planes and ships had gone unseen? 

Even Winston Churchill, usually so ebullient and optimistic, was filled with misgivings, having cautioned Eisenhower to “take care that the waves do not become red with the blood of American and British youth.” The prime minister had recently told a senior Pentagon official, John J. McCloy, that it would have been best to have had “Turkey on our side, the Danube under threat as well as Norway cleaned up before we undertook [Overlord].” The British Field Marshal Sir Alan Brooke, chief of the Imperial General Staff, had fought in Normandy in 1940 before the British Expeditionary Force’s narrow escape at Dunkirk. Just a few hours earlier, he had written in his diary that he was “very uneasy about the whole operation. At the best it will fall so very, very far short of the expectation of the bulk of the people, namely all those who know nothing about its difficulties. At the worst it may well be the most ghastly disaster of the whole war!”

No wonder Eisenhower had complained of a constant ringing in his right ear. He was almost frantic with nervous exhaustion, but he dared not show it as he continued now to pace back and forth, lost in thought, listening to the crackle and hiss of logs burning in the fireplace. He could not betray his true feelings, his dread and anxiety.

The minute hand on the clock moved slowly, for as long as five minutes according to one account. Walter Bedell Smith recalled, “I never realized before the loneliness and isolation of a commander at a time when such a momentous decision has to be taken, with the full knowledge that failure or success rests on his judgment alone.”

Eisenhower finally stopped pacing and then looked calmly at his lieutenants.

“OK. We’ll go.”

‘What’s Up, Doc?’ Bugs Bunny Takes on the New York Philharmonic, Carrots and All

 

That wascally wabbit, Bugs Bunny, the notorious carrot-chomping, sarcastic cartoon rabbit who first leaped onto the nation’s movie screens in 1940 and has been the star of 800 cartoons, four movies and 21 television specials, is back again, this time as the star of a special concert, Bugs Bunny at the Symphony II, in which the New York Philharmonic, live, plays the music of a dozen full-length cartoons from the Looney Tunes and Merrie Melodies series, most starring Bugs, while the audience watches the cartoons themselves on a large movie screen. The production is at David Geffen Hall, the Philharmonic’s home at Lincoln Center, New York. The show is this coming weekend as part of its national tour.

The concert, created by conductor George Daugherty and David Ka Lik Wong and co-sponsored by Warner Bros., has been traveling through the United States under different names for about 20 years and has been seen by 2.5 million Bugs enthusiasts. In addition to the show, patrons at Lincoln Center will get to meet a number of furry and colorful Looney Tunes characters who will be roaming through the lobby before the curtain. If Wile E. Coyote is there, watch out for him!

Among the cartoons to be screened will be Baton Bunny, Show Biz Bugs, Rhapsody Rabbit, Tom and Jerry at the Hollywood Bowl, The Rabbit of Seville, Rabid Rider, Coyote Falls, Robin Hood Daffy and What’s Opera, Doc?.

Conductor Daugherty was a Bugs fan as a kid, but it was not because of the rabbit’s zany onscreen antics. No, it was because the Bugs Bunny cartoons, and most of the work from the Looney Tunes and Merrie Melodies cartoon factories, used the music of the great classical composers, such as Wagner, Rossini, Liszt and Donizetti. “I was a classical music fan as a boy and I reveled in listening to this great music used as the backdrop for these cartoons. I also appreciated the fact that millions of American kids were being introduced to classical music through Bugs Bunny,” he said.

The Bugs Bunny shows are like no other.

Fans at the Bugs concerts go wild. They cheer the good guys and jeer the bad guys. They applaud. They whoop. The juxtaposition of one of the world’s great orchestras playing the music of Richard Wagner as patrons of all ages shout and scream is both puzzling and wonderful.

“You go to a typical classic music concert and everybody is very quiet and respectful of the music. You go to a Bugs Bunny cartoon concert, though, and you lose all abandon. That’s what happens at these performances,” said conductor Daugherty with a big smile. “The same thing happened in the 1950s and it will happen forever.” 

He adds that most older people saw Bugs and Looney Tunes cartoons in a movie theater, and kids on a small-screen television set. “The chance to see the cartoons in a movie-like setting, the Philharmonic concert hall, repeats that old feeling for adults and is all new for kids,” he said.

Daugherty and Wong started the production in 1990 and called it Bugs Bunny on Broadway. Since then the show, also called Bugs Bunny at the Symphony and Bugs Bunny at the Symphony II, has been staged by more than 100 major orchestras, including the Boston Pops, the Los Angeles Philharmonic and the Philadelphia Orchestra. It has been shown at the Hollywood Bowl and the Sydney Opera House.

In 1990, of course, Bugs was a huge Hollywood star. He began his career as a character in the Merrie Melodies cartoon series, making his starring debut in A Wild Hare in 1940. He was an instant hit, along with dopey Elmer Fudd, wily Daffy Duck and others. His popularity soared during World War II, when millions flocked to the movies and the cartoons, which served as an escape from wartime pressures. Bugs Bunny was turned into a flag-waving patriotic character during the war, even appearing in a dress blue U.S. Marine uniform in one cartoon. His popularity grew after the war and he remained the number one cartoon character in America for years, chomping on carrots in movie theaters all across the country.

The really big advantage of Philharmonic performances, Daugherty said, is the sound of the orchestra in the concert hall.

“Back in the 1940s and ‘50s, when these cartoons first came out, the sound equipment in places where the cartoons were made, and in movie theaters, was limited. At the Philharmonic at Lincoln Center, and other halls where we stage the concerts, the sound is beautiful. That’s why people go to these shows,” said Daugherty.

He is always amazed at the people he meets at his productions. “I meet very old and very young people and music lovers, and cartoon lovers, from every walk of life,” he said. He once met a couple who had met at a Bugs Bunny concert eight years earlier, fallen in love and married.

People are getting used to these types of movie/performance shows. The Philharmonic has staged a number of them, among them Fantasia and Star Wars. The Philharmonic will stage movie/concerts of Close Encounters of the Third Kind and Psycho in September, Harry Potter and the Sorcerer’s Stone in December, and Singin’ in the Rain and Mary Poppins in May 2020. The idea of a movie and a live orchestra is gaining ground in America – fast.

Surprisingly, the audience for the Bugs Bunny productions is neither kids nor parents with kids – but individual adults. “I’d say 90% of our audience are adults without kids,” said Daugherty. “They are all coming back to see the cartoons they loved as children.”

And Bugs? Those behind Warner Bros.’ Looney Tunes productions think that the hyperactive gray and white rabbit, getting on a little over the years, would love it.

When I ended my interview with the conductor, I was tempted to assume my very best Bugs Bunny voice and ask him “What’s up, Doc?” I could not do that, though, because the New York Philharmonic is so distinguished...

Really? Wait until this weekend, when Bugs fans pour into the Geffen concert hall at Lincoln Center and roar for Bugs and the cartoon pals who starred with him in all those wonderful old Looney Tunes and Merrie Melodies cartoons. The roar will be louder than the traffic in Times Square.

 

PRODUCTION: The Lincoln Center shows are Friday at 8 p.m. and Saturday at 2 p.m. and 8 p.m.

That’s All, Folks!

Citizenship and the Census

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

Citizenship is becoming an ever bigger political issue. After some years of heated arguments about undocumented immigrants and whether they ought to be allowed to become citizens, a new front in the citizenship war has broken out over the census. The Trump administration wants to include the following question on the 2020 census form: “Is this person a citizen of the United States?” Possible answers include: born in the US, born abroad of US parents, naturalized citizen, and “not a US citizen”. 

 

It certainly is useful to have accurate data on the citizenship status of our population. But political calculation lurks behind this question, based on the following chain of reasoning. In the midst of a Republican campaign against immigrants and immigration, a citizenship question might frighten immigrants, both legal and not, from responding to the census, thus lowering total population counts. The census results are used to apportion Congressional seats and Electoral College votes, including everyone counted, whether citizen, legal or unauthorized resident. Many federal spending programs distribute funds to states based on population. Places with large numbers of immigrants tend to be Democratic-leaning big cities, so there could be long-range political power implications if the count is skewed. Counting citizens and non-citizens connects to counting votes, the most important constitutional issue of our time.

 

The biggest impact could be in Democratic California, one of Trump’s most persistent adversaries: 27% of Californians are immigrants and 34% of adults are Latino. Studies have already shown that Latinos were undercounted in the 2010 census and non-Hispanic whites were overcounted, according to the Census Bureau itself. The amount of federal funds that California could lose if a citizenship question causes even larger undercounting could reach billions of dollars.

 

Commerce Secretary Wilbur Ross, Steve Bannon (then a White House advisor), Kris Kobach (then Kansas Secretary of State), and others decided in early 2018 to put in the citizenship question, last asked in the 1950 census. Ross claimed the impetus came from a concern in the Department of Justice about protecting voting rights, but journalists uncovered an email trail proving he lied. The chief data scientist of the Census Bureau, John Abowd, opposed the addition of a citizenship question, which he said “is very costly” and “harms the quality of the census count”, and would result in “substantially less accurate citizenship status data than are available” from existing government records.

 

Nevertheless, Ross decided to include the question. Democratic attorneys general for 17 states, the District of Columbia, and many cities and counties have mounted a legal challenge in federal courts across the country. Judges in three federal courts in California, New York, and Maryland have already ruled that there should be no citizenship question. One judge described the argument by Commerce Secretary Ross as “an effort to concoct a rationale bearing no plausible relation to the real reason.” Another judge called the Republican case a “veritable smorgasbord of classic, clear-cut” violations of the Administrative Procedures Act, a 73-year-old law which makes the simple demands that decisions by federal agencies must be grounded in reality and make logical sense.

 

The Supreme Court has agreed to take the case on an expedited basis. So the census absorbs considerable political weight and becomes itself a constitutional issue, pitting Democrats and Republicans on the stage of the Supreme Court. A lawyer for the Democratic-controlled US House of Representatives will be one of the four attorneys arguing against the citizenship question. He will repeat the political power argument on which the local Democratic authorities based their case: they have standing to sue, because they would lose House seats and federal funds due to deliberately skewed results. 

 

The pure political weight of each seat on the Supreme Court has never been made so clear as in the past three years, where one seat in 2016 became the prize in a naked display of Republican Senatorial political power: we can do this, so we will. Now 5 Republican-appointed justices and 4 Democratic-appointed justices will decide the case. The decision will soon have consequences, when the 2020 Census results are used to allocate state and federal representation by Republican and Democratic legislatures for the next election, and even before that, to allocate federal dollars.

 

If you are interested in a fuller discussion of the significance of this case, go to the website of the National Constitution Center. It is rare to find a detailed, logical, clear and unbiased description of the facts on such a politically charged issue.

 

While technical legal issues determine who is a citizen, each party has been proclaiming their version of a good citizen. Republicans have been clear about their version of how a good citizen should act. Hate the free press, because they only tell lies. Physically attacking journalists is okay for a Republican citizen, and elected Republicans will defend your right to do that. The government elected by the citizens is evil, not a democratic institution, but one run by an unelected hidden “deep state”. Nothing is wrong with manipulating the tax system, because taxes are bad, the government wastes the money it collects, and the IRS is an ideological ally of the deep state, anyway. Citizens not only have the constitutional right to resist an oppressive government, but a good citizen treats our federal government as oppressive, and ought to resist it now, with the exception of everything the current President does.

 

It’s not necessary to be a violent white supremacist to be a good Republican citizen, but that’s not a disqualification. Disqualifications have to do with paperwork, with color, with where one was born, and with ideological viewpoints. Liberals are traitors to America, the worst kind of a citizen. People who believe in the right of a pregnant woman to control her own body are murderers, still citizens, but belonging in jail. Various other crimes of the mind disqualify Americans as good Republican citizens: advocating gun control, believing in climate change, and demanding that we protect the endangered environment.

 

Democrats need to tell Americans how we think about citizenship, not just the paperwork and the legalities, but the ethics and good behavior. I think a good American citizen:

1) Prizes the diversity of viewpoints that an ethnically and religiously diverse society produces;

2) Believes in the power of government to make people’s lives better;

3) Believes that government should act in the interests of all citizens, especially those who have the least resources;

4) Wants the government to protect the rights of minorities;

5) Believes that personal religion should be a free choice, but that the religious beliefs of no particular group should determine government policy.

 

If that is not a winning argument about what it means to be an American, then there will be no progress toward creating an equal and just democracy.

 

Carolyn Forché: Bearing Witness to the Wounds of History

(Photo by Don Usner)

 

 

“You want to know what is revolutionary, Papu? To tell the truth. That is what you will do when you return to your country. From the beginning this has been your journey, your coming to consciousness.”

Carolyn Forché in What You Have Heard is True, quoting Leonel Gomez Vides (emphasis original)

 

From 1980 until 1992, more than 75,000 people died in the bloody civil war that raged in El Salvador. Most of the dead were civilians who died at the hands of government forces supported by the United States. The war also left 550,000 internally displaced people and 500,000 refugees who fled the country, as well as more than eight thousand civilians who were “disappeared” and never found. 

The 1980 assassination of beloved Archbishop (and now Saint) Oscar Romero—the voice of the poor—sparked the conflict, and the ensuing 12 years were marked by countless atrocities: the military’s complete destruction of villages and massacres of civilians such as the 1981 massacre at El Mozote that left more than 700 men, women and children dead; rampant kidnappings and gruesome torture; murders of labor leaders and workers; the rape and murder of four American churchwomen; and the 1989 massacre of Jesuits that led to international intervention.

Acclaimed poet, translator and human rights activist Carolyn Forché made seven extended trips to El Salvador in the two years preceding the outbreak of the war, from 1978 to 1980, during the violent “time of the death squads.” She traveled at the behest of her impassioned and brilliant guide and mentor, Leonel Gomez Vides, who desperately hoped to prevent a war in his home country. He also wanted a poet—not a journalist—to accompany him and to share with the American people the reality of life in a land of injustice, atrocity, and extreme poverty. He chose the already acclaimed poet Ms. Forché, then age 27, for this daunting task, and she eventually accepted the challenge despite Leonel’s enigmatic background.

In her powerful and lyrical new memoir What You Have Heard is True: A Memoir of Witness and Resistance (Penguin Press), Ms. Forché recounts in vivid detail her experience in those two turbulent years in El Salvador. Unlike most memoirs, she tells this story in the voice of her younger poet self and conveys the surprise and wonder and shock she felt with each new experience. At the same time, she tells the story of Leonel who introduced her to political and military leaders, American officials, and the wealthy, as well as to workers, teachers, campesinos [peasant farmers], and religious leaders including Archbishop Romero. Leonel constantly reminds her to remember what she sees and to note details because she must tell the American people the unvarnished truth about the situation in El Salvador.

Ms. Forché spent 15 years writing her new memoir. She began the book in 2003 and referred to her notes and diaries, as well as photographs, reminiscences of friends, and other contemporaneous documents from her years in El Salvador. She tossed away three early drafts and finished her memoir last year.

Ms. Forché vividly describes in her book what she saw and what she learned four decades ago in El Salvador, a nation on the brink of war. She learned about living in a state of constant tension and fear. She learned about extreme economic inequality and abject poverty. She learned about the beauty of the verdant countryside and the vibrant life in the cities and villages of El Salvador. She learned about brutal torture conducted by US-trained military officials. She learned about a network of safe houses for those who opposed the vicious military regime. She learned about the excruciating pain of prisoners who were locked in wooden boxes the size of washing machines. She learned to see in a new way thanks to the questions and insights shared by the elusive and brilliant Leonel. She learned about tranquility in the face of fear from a future saint, Archbishop Oscar Romero. She learned about the body dumps where the remains of the mutilated dead were scattered. One day, she learned “that a human head weighs about two and a half kilos.” And she learned much more.   

Ms. Forché’s memoir is especially timely as the current US president disregards the law of asylum and fixates on a wall to keep refugees from crossing our borders. She explores the foundations of today’s surge in refugees who flee persecution and violence in El Salvador and other Central American nations.

The title of her memoir, What You Have Heard is True, is from the opening line of perhaps her best-known poem, “The Colonel.” In this work, she describes the cruel wall that surrounded the home of this Salvadoran army officer:

Broken bottles were embedded in the walls around the house to 

scoop the kneecaps from a man’s legs or cut his hands to lace.

She continues the poem with images from a 1978 dinner at the colonel’s home. The evening concluded when the colonel emptied from a grocery bag the gruesome trophies of many kills, “many human ears,” at the table where Ms. Forché sat. And he said, “As for the rights of anyone, tell your people they can go fuck themselves.”

This haunting poem encapsulates a moment in history. The image of the colonel’s ghastly wall resonates today. A wall to separate, to intimidate, to divide, to maim, to mutilate. This wall seems a metaphor for the brutality of the oppressive, US-backed Salvadoran government of 40 years ago, and it presages the persecution suffered by refugees who flee Central America today for safety while our president promotes his wall that, regardless of human rights law, will instill fear and despair as it discourages any hope of compassion or sanctuary.

 

“The Colonel” serves as a historical document, a record of what Ms. Forché observed first-hand as violence grew toward war in El Salvador. The poem also stands as an example of the “poetry of witness,” a term she coined to describe poetry that concerns social and historical experiences of extremity such as war, genocide, torture, imprisonment, political persecution, and exile.

 

Poetic works of witness preserve moments of atrocity and trauma and serve as reminders from history for generations to come. Ms. Forché wrote: “We are writing what in the future will be the irrevocable past.” A few examples of poets of witness—and history—include Federico Garcia Lorca, Claribel Alegria, Terrence des Pres, Anna Akhmatova, Bertolt Brecht, and Denise Levertov. Ms. Forché collected these writers and dozens of other poets from around the world in two widely acclaimed anthologies that she edited: Against Forgetting: Twentieth-Century Poetry of Witness and Poetry of Witness: The Tradition in English, 1500-2001.

 

Ms. Forché served until last year as the Director of the Lannan Center for Poetics and Social Practice at Georgetown University. Her books of poetry include Gathering the Tribes, recipient of the Yale Younger Poets Award; The Country Between Us, the Lamont Selection of the Academy of American Poets; The Angel of History, winner of the Los Angeles Times Book Award; and Blue Hour, a finalist for the National Book Critics Circle Award. She has received numerous awards for her distinguished writing and teaching, including three fellowships from the National Endowment for the Arts, a Guggenheim Fellowship, a Lannan Foundation Fellowship, as well as the Robert Creeley Award, the Denise Levertov Award, and James Laughlin Award for poetry. In 2004 she became a trustee of the Griffin Trust for Excellence in Poetry, Canada’s premier poetry award. 

 

Ms. Forché also has been a devoted human rights activist since her return from El Salvador in 1980, when she began speaking out in virtually every American state about the crimes against humanity there and US involvement in the civil war. In the 1980s, she also reported from Beirut for National Public Radio about the civil war in Lebanon, and she worked with human rights groups in South Africa. In 1998, she was presented the Edita and Ira Morris Hiroshima Foundation Award for Peace and Culture in Stockholm for her work on behalf of human rights and the preservation of memory and culture. She continues to advocate for a more just and peaceful world. She lives in Maryland with her husband, photographer Harry Mattison.

 

 

Ms. Forché generously spoke recently by telephone about her new memoir and about issues of history, remembrance, and atrocity as well as on the plight of Central American refugees today.

 

Robin Lindley: Congratulations Ms. Forché on your moving new memoir, What You Have Heard is True. Before I get to the memoir, I wanted to ask first about the “poetry of witness,” a term that you coined about poems that respond to conditions of extremity and challenge the denial of history. I think that's important now as our current president ignores and distorts history for his political advantage. What is “poetry of witness” and how do you see its role in remembering the past?

 

Carolyn Forché: Many poets have written in the aftermath of extremity, having lived through wars as soldiers or civilians, and endured incarceration, exile, censorship, house arrest, banning orders and other forms of state-imposed repression.  As they passed through these experiences, their language also passed, and was marked by suffering and brutality. Poems written in the aftermath of these horrors might be read as “witness” to experience: personal, social, and historical. I began using this term to distinguish such works from the more polemical poems written in service to a political movement, which are sometimes attacked for being “political.”

 

Robin Lindley: It seems you were on the road to becoming a poet of witness even by 1977, before your El Salvador experience. You had written the award-winning book Gathering the Tribes with now celebrated poems on history and relationships. In the summer of 1977, you stayed in Mallorca with Central American poet Claribel Alegría and translated her poetry of witness and you met other renowned writers such as World War I veteran Robert Graves.

 

Carolyn Forché: There were quite a few writers who visited Claribel in that house in Deia, and listening to them was the beginning of my education, not only in the political realities of Latin America at the time, but in Latin American literature as well. I also met Robert Graves and in fact we gave him his 83rd birthday party. 

 

Robin Lindley: So, by the time your memoir begins in the fall of 1977, you had been exposed to the poetry of witness and you had been honored for your own poetry.

 

Carolyn Forché: I was very young. My first book had won the Yale Series of Younger Poets Award, which is a wonderful thing to happen to a first book. It thrust me into a more prominent position in the world of literature than I might otherwise have been able to achieve, so it made me nervous, and it made me feel a little like retreating. I wanted to keep writing but I didn't want to be such a public person. I spent time after that traveling and backpacking.

 

And I had my first teaching job, which was at San Diego State University, where I'd befriended Maya Flakoll, who was the daughter of Claribel Alegria. We decided together that we were going to translate Claribel’s poetry to English for the first time. She had been translated into other languages, but never into English. And that's what led us to spend the summer in Mallorca in 1977. We worked on the translations together and we finished with a book called Flowers from the Volcano, and it was Claribel’s first book in English.

 

During that summer I had heard mentioned a man named Leonel Gomez Vides who was Claribel’s cousin. He still lived in El Salvador and he was a rather mysterious person. No one seemed to know clearly who he was and what he was doing. He was known to be a champion motorcycle racer and also a world champion marksman. He had given some of his land to campesinos and he may have been working with the guerillas. Or he possibly worked with the CIA. Nobody knew. But he was apparently a brilliant person. Whenever I would ask a question about him, everyone would be very quiet. They didn't encourage my curiosity at all.

 

Robin Lindley: And in the fall of 1977, Leonel Gomez Vides showed up at your house in Southern California with his two young daughters. He was a stranger but you let him in and he stayed with you for a few days. Why did you trust him enough to invite him into your home?

 

Carolyn Forché: When he was first at the door, I showed him photographs from the summer in Mallorca and I asked him to identify the people in the photographs who were Claribel, his cousin, and Maya. And he did, and then he immediately and very warmly began to talk about them and to reminisce about times with Claribel with a kind of love, and I realized that yes, he was a family member and he was very warm, very friendly. And he was very nice with his daughters. 

 

And so, it seemed all right. I didn't ever question hosting them at that time. He was pretty intent on talking to me during those three days. All he did was talk and he drew illustrations of everything that he talked about. He had covered my dining room table with white paper and, by the time he was finished, there was a mural of El Salvador's history and in fact of the entire history of Central America.

 

Robin Lindley: He was very concerned that a war like ours in Vietnam was imminent in El Salvador.

 

Carolyn Forché: Yes. He was building to a revelation that he suspected his country was on the verge of war and that this would begin in three to five years. And he was interested in having an American poet come to El Salvador and try to learn as much as possible about the situation so that when this war did begin, this poet could come back and talk to the American people about the situation, about what was giving rise to the war and all of that, because he believed that if the United States entered the war on the side of the military in any serious way, it would be a very different kind of war. He was hoping to avoid that.

 

He came to visit me as an American poet. And of course, I tried to dissuade him from imagining that a poet could accomplish the task he imagined, explaining to him that poets didn't have a great deal of exposure or credibility in the United States, and that we weren't consulted on matters of foreign policy. We were considered a subculture or a fringe element. He was surprised by that because, of course, in Latin America poetry is very important and taken very seriously, so he decided that one of my tasks was to change the role of poets in the United States, which I thought was very quixotic and probably more impossible than anything else he was asking me to do. 

 

I was touched by his faith in poetry and by his regard for it, and by his command and his knowledge of history going way back to before the European conquest. And he did go that far back. He was really intent on my understanding the situation with the deep historical roots. He didn't just start in 1970 or some other recent year. He went all the way back before European contact. 

 

At the end of those three days, he invited me to El Salvador and I still wasn't sure about his invitation. He said you're going to improve your Spanish and it'll be like a Peace Corps experience and I will open doors for you and show you the whole country and all of the different social groupings. He said he would introduce me to campesinos and to wealthy coffee planters and to the military. And he said, you'll get a full picture and then, when the war begins, you'll be in an excellent position to talk about it in your own country.

 

Robin Lindley: And you heeded Leonel’s call to come to El Salvador.

 

Carolyn Forché: At that time, I was having a hard time with my own poetry, which was one of the reasons I had started to translate Claribel Alegria. And I had just received the Guggenheim fellowship and I had no real plan because I didn't think I would get one. I had this fellowship and I had the opportunity and a door was opening and I knew it. And I knew that this offer didn't come along very often.

 

So, most of my friends disapproved of the idea on the grounds that I didn't know Leonel very well and, in their opinion, Central America was a dangerous place. El Salvador was at peace at the time, meaning not yet at war. Then one friend, at the very end, said I think you should do it. I think you want to do it and I think you should go. And that's all I needed: one person to approve, and I landed in El Salvador on January 4th, 1978 for the first time.

 

Robin Lindley: Thank you for that context. As you write, Leonel went to great lengths to explain the traumatic history of El Salvador when he spoke with you at your home. He talked about the Spanish conquest and the atrocities against the Indians and the history of military rule and oppression. Did you have a sense of how violent the country was before you left for El Salvador?

 

Carolyn Forché: Yes. I knew there was a great deal of poverty and inequality and maldistribution of wealth. The gap between rich and poor quite resembles the one in our own country now. But at the time, we had a large middle class here, so it was shocking to me that two percent of the population could own 60 percent of the resources. 

 

And there is a violence to poverty as well. And there was a lack of willingness to reform, and a lack of willingness to do small things that led to the desperation of an armed struggle. There were peace marches and labor unions were organizing with people who were trying to get slightly better wages and slightly better conditions. And this was always refused and suppressed by armed force. All of the demonstrations were fired upon. 

 

Leonel also pointed out that El Salvador had one of the highest murder rates in the world at that time. Violence penetrated the society, so it was a dangerous place. I didn't realize how dangerous until I got there. 

 

The period when I was in El Salvador has been called “the time of the death squads.” It was the time before the war. There was no armed uprising yet, but there were organized paramilitary civilian and military death squads operating not only in the countryside but in the cities. By the time I was leaving in 1980, they were killing up to a thousand people a month in the capital city or disappearing them. Bodies were left everywhere or taken to body dumps, essentially dumping grounds for the dead.

 

They were killing anyone suspected of having anything to do with championing the rights of the poor or working on behalf of the poor in any way. So, teachers, priests, nuns, doctors, students, union organizers—all of these people—were subject to being suddenly pulled out of their houses or pulled off the street and never seen again. 

 

So, it was dangerous, but it wasn't yet war. And the history that Leonel shared with me prior to my trip was one of violence and land confiscation and of altering the living conditions for the indigenous people of Central America who had held land in common and previously developed very sophisticated methods of growing food. And then the lands were confiscated systematically, first for the cultivation of indigo and ultimately for coffee. 

 

When it was realized that the highlands were perfectly suited for growing excellent coffee, the violence took the form of poverty, land confiscation, suppression, and rigidity. This is a lesson for us too in the United States. If you refuse small reforms and refuse an increase in the minimum wage and refuse constantly to do anything at all to improve the lives of the poor, eventually you're going to have a big, big problem. This rigidity does not lead to anything good, and I'm seeing that rigidity now in our government. I've seen this before so I know what I'm looking at.

 

Robin Lindley: You have much to teach Americans. You became aware also that the United States was supporting an oppressive military government and funding the Salvadoran military and even training troops in skills such as torture.

 

Carolyn Forché: Yes. For a while aid had been cut off because the government had to be certified by the U.S. State Department as respecting human rights in order to receive economic and military aid.  This was the human rights policy that was put in place by President Carter. It was primarily designed for use against the former Soviet Union and its client states, but it wound up being applied to our allies who were very busy keeping order by violent means in their own countries.

 

In the case of El Salvador, the Salvadoran military was very confused and angry about this certification of human rights. Eventually El Salvador somehow was certified, even though it wasn't respecting human rights. It was said that the deaths were being caused by “unknown elements” who had nothing to do with the government, which wasn't true of course. So, the economic and military aid was restored and the first $5.5 million in military aid was allocated, which doesn't sound like much now, but it was at least symbolically significant then. That happened on the day after Monsignor Oscar Romero, the recently canonized St. Romero, was murdered on March 24, 1980. The U.S. Congress held hearings. I was present at those hearings. They voted to approve the military aid and the sending of 12 American advisors who they called “trainers” because they didn't want to echo the language of the American War in Vietnam at that time. The 12 soldiers were to go to El Salvador to advise with $5.5 million in military aid. And of course, that amount increased exponentially over the course of the ensuing 12-year civil war.

 

Robin Lindley: That vote for aid had to be a disappointment for you.        

      

Carolyn Forché: Yes. Congress supported the military. And we also trained the Salvadoran military on our own bases in our country and sent them back to El Salvador. But because the American public was not in favor of direct military intervention, the United States never sent our own soldiers to deploy in El Salvador and engage militarily in combat. That was largely because the American public turned against intervention. 

 

Instead, you had this vast movement in the United States of people who supported sanctuary for fleeing refugees. They were people from established organizations like Witness for Peace and the Sanctuary movement. There was a network of U.S. residents and citizens in solidarity with the Salvadoran people. This organizing was very effective. I think we would have gone into El Salvador militarily but for that movement, and also but for certain Democrats in the Congress at the time who were vigilant about the situation in El Salvador. Certain congressmen and senators were very knowledgeable about Central America and they kept much worse things from happening.

 

Robin Lindley: It’s an appalling history. That brings me back to your memoir and your initial impressions. In terms of history, it’s interesting where Leonel arranged for you to stay when you first arrived in El Salvador. What happened on your arrival?

 

Carolyn Forché: I should say first that, in this book, I take the reader on the journey that I took. In other words, the reader never knows more than I knew at the time. So it unfolds a bit like a mystery or maybe a thriller.

 

Robin Lindley: Yes. I think the memoir reads like a thriller.

 

Carolyn Forché: In the book, I start off the journey with the arrival at the airport [in El Salvador]. I didn't know anything about anything yet and I'm 27 years old. Leonel wasn't there to pick me up.  I looked around and thought, oh my gosh, what am I going to do? And then a Peace Corps volunteer came towards me and said Leonel had sent him to get me. He said, I'll take you to him. We're going to have dinner at the Benihana of Tokyo restaurant in San Salvador.

 

After that dinner, which had a lot of interesting people at it, Leonel took me to stay at a house that was occupied by the sister of a Catholic priest he knew. This house once belonged to General Martinez, who was the dictator or so-called president of El Salvador during the 1930s, when he presided over the 1932 massacre called the Matanza [“the killing”] of perhaps 30,000 to 80,000 people [mostly indigenous peasants], depending on your source. 

 

And so there I am, sleeping in the dictator’s bed my first night. That was Leonel’s way: start in the dictator’s house because we have been living under military dictatorship, and that dictatorship was unbroken for 50 years.

 

The military candidates always won the presidential elections. There was never a question about that. The ballot boxes were fixed. They called it “sugaring the ballots” so that if the military candidate wasn't winning in any particular region, they would just stuff the ballot box with favorable ballots.

 

The military always won and they always appointed their own ministers and those ministers were always their fellow officers and that's how things worked. The jobs of the military weren't particularly well paid. The job was to maintain order. Their other job in their own minds was corruption and trying to get as much money as possible while they held power. And some of that pocketing of money had to do with siphoning American aid money. 

 

That's why the military was so upset when the aid was cut off. It wasn’t because they were trying to benefit the people of El Salvador with this aid. It was because some of that aid was going into their own pockets and that's how they were becoming rich enough to retire comfortably in Miami or Houston after they left power. They didn't really have to worry too much about coups in El Salvador because every generation of officers would keep power for four years and then they would cede to the next generation. And this is how it worked in the military. And there was never any question that you had four years to steal money. So a cut in US aid was very threatening to the military. 

 

And so, I was at the seat of power when I arrived and slept at a dictator’s house then occupied by the sister of a priest. 

 

Robin Lindley: Leonel had a sense of irony. Your impressions of the poverty you saw are also instructive. You saw poverty in the cities and traveled out in the countryside and met campesinos.

 

Carolyn Forché: I had not been in an underdeveloped country, what they used to call a “third-world” country. There's no real name for a country that has not been industrialized. 

I was seeing this poverty for the first time, although poverty in industrialized countries such as our own is also very brutal and harsh. It just takes a different form in El Salvador. In the countryside, they didn't have running water and they didn't have potable water. They didn't have electricity in most of the villages. They were living in very primitive conditions. And they didn't have much to eat. They lived on beans and corn. That was it. If they had anything else, they sold it. It was a very meager diet.

         

The life expectancy at that time was 47 years and one out of every five children died before the age of five of curable diseases like measles. I saw malnourished children. Conditions were harsh and workdays long. There was no such thing as time off.

 

I was startled when I first got there, and saw the world much as a visitor or tourist would see it. I write about the women carrying large jugs on their heads. Some were two feet high, and the women balanced them on their heads and could turn their heads without spilling a drop. These urns contained water because people didn't live near drinkable water so they had to carry water to where they lived. These women walked so gracefully. At the beginning I saw them walking beautifully and it was just something that would be appreciated for its beauty by a tourist. Later, I discovered that those jugs were incredibly heavy and that those women suffered damage to their cervical spines from compression caused by carrying these jugs for so many years. And so, you start to see the world differently. 

 

Leonel taught me a new way of seeing the world—of looking and thinking. I'm hoping that, through reading the book, people will also have that experience. That education is what I tried to replicate in the memoir. I tried to, step by step, show the reader what happened, how I was shown a different reality.

 

It was a serious challenge to write in such a way. The book is really about Leonel.  I try to capture his incredible personality on the page, his humor and his brilliance. He was complex, and I was young and rebelling, pushing back and arguing with him, not accepting everything that he said and did. We grew as friends and we found ourselves in incredible situations.

 

Robin Lindley: Leonel was a master of the Socratic method. He posed questions and you had to supply the answers. You had to figure out new and often perilous situations yourself. 

 

Carolyn Forché: I would sometimes ask a simple question, and instead of answering me, he would put me in a situation where I would find the answer myself. He always felt that experience was a better teacher. Some things just had to be personally felt and seen in order for learning to take place, in order for consciousness to change. He was interested in consciousness and the formation of it, and what makes us think and feel the way we do. Where do we get our ideas about the world and how are those ideas formed and what, if anything, challenges them?

 In the United States we tend to think that we don't really have any ideology. We're the default position. We are the normal way of things and our way of life is the right way. We don't have ideologies. We’re not communist. We're not this, we're not that. But actually, we do. We all do. Every human does. And one of the things that would be important for us is to begin examining that ideology a little more closely. Asking questions. Why do we think this is right? And why is that wrong? How did our attitudes about the world develop over time?

 

Take, for example, our faith in capitalism. I'd love to know how that developed. It's an economic system. We treat it almost as a god, as sacrosanct and unchallenged, but it's just a particular economic system we've adopted. Leonel was very, very good at raising questions and showing you that things are a little more complex than you might think.

 

Robin Lindley: Your courage struck me. You were actually chased by death squads. The tension and the violence are so vivid in your book. There must have been many fearful times for you.

 

Carolyn Forché: Everyone in El Salvador at that time lived in fear and it was intense. It was a very, very scary place, and so adrenaline was always high and people were always hypervigilant and on edge. One never could relax or be comfortable. If you're sleeping in your bed at night, at any moment, something could happen to you. So that doesn't feel safe.

I experienced what everyone else was experiencing. You couldn’t avoid it if you were there. That's what life was like for the people there and also for me. I was pursued by death squads because of the people I was with, and I was very, very lucky on those occasions. I tell those stories in the book. I talk about what happened and I tried to describe them as clearly and precisely as I could. 

 

I did witness one abduction, and I also describe it.  It's been quite a few years now, but it was years more before I lost the hypervigilance. It didn't calm down within me for quite a long time after I left El Salvador.

 

Robin Lindley: After the violence and horror you witnessed, it’s understandable if you had some symptoms of post-traumatic stress disorder.

 

Carolyn Forché: I didn't think of it that way in terms of myself because I associated post-traumatic stress disorder with combat veterans. You had to have been a soldier to have that happen to you. Now, of course, I know that's not true at all. You can have post traumatic stress disorder from domestic violence. You can have it from all sorts of extreme experiences—anything that frightens you deeply, any blow to the psyche, any wound to the heart or the soul can cause [PTSD] to happen. 

 

I think everyone in El Salvador at that time who was awake and thinking probably suffered from post-traumatic stress for years. 

 

These young parents who are fleeing now are the children of the people that I was with then. And life, for Salvadorans, is even more dangerous now. We really should be bringing them into our cities and getting them settled and then somehow working inside El Salvador and Honduras to help protect people from extortion, rape, violence and murder.

 

Robin Lindley: The tension in El Salvador is palpable in your writing. Yet Leonel, in the course of your two years, took you into the nests of vipers, to the homes of right-wing military officials and politicians. And he also introduced you to guerillas, doctors, nuns, priests, campesinos, and his friend Archbishop Romero.

 

Carolyn Forché: He knew everyone. He cultivated friendships in all sectors. Of course, when he was young, he came to a military academy in the United States for a while. And he had friends who were in the military, and friends and relatives in the officer corps. He went to grade school with men who later were officers. So he knew everyone. And he was from a prominent family and they were coffee farmers. 

 

He had a very small coffee farm that didn't make very much money anymore, so he wasn't a rich man— but he knew the wealthy. And because he'd worked for so many years with campesinos, he knew the poor. He knew workers and labor organizers. He really did have these friendships in all these sectors and that's why he was able to bring me into the offices of the military, the homes of the wealthy, and the villages of the poor. It was because they knew him.

 

Robin Lindley: You described one of your meetings so vividly in your haunting poem, “The Colonel.” You actually met this military officer who was very angry and threatening. Was Leonel with you then?  

 

Carolyn Forché: That incident happened in 1978. It was fairly early on in my time in El Salvador in the period when the military was angry with the US government for imposing the requirement for human rights compliance to re-start any US aid. And so, when the colonel says, tell your people the equivalent of go to hell, the people he was referring to was the US government. He thought I could just go tell President Carter that. He was angry and probably a little bit inebriated. 

 

He was in possession of body parts taken as a bounty, as proof of kills, as was common in Vietnam. It has been common through the ages and in all parts of the world. He spilled some body parts in front of me and that was his answer to the State Department requirement to comply with human rights.  He thought I was from the US government, and no matter how much I denied that, he was convinced. So that was the origin of the poem. 

 

Leonel was with me and he had set up the meeting but he did not know that that's what was going to happen.

 

Robin Lindley: Was this the extremely brutal Colonel Chacon?

 

Carolyn Forché: I have not identified the colonel from the poem because of his family, and I never will. I met several colonels very much from that mold. I'm hoping that the memoir will illuminate the culture in which that colonel was formed and why the colonel was the way he was. 

 

Colonel Chacon was probably the worst of them at the time. I talk about what finally happens to him in the book. He was truly a butcher of human beings. And he was at the helm of a fairly large network of paramilitary killers for hire that operated across different countries. He was creating his own small army, some of whom were Cuban exiles. There were various people involved in it, and that little army that he was creating scared everybody. The US had no control over it. He was forming a shadow regime, an army of killers.

 

Robin Lindley: Your portrait of Chacon and his atrocities is chilling. Leonel told of his horrible torture, of cutting off fingers of those he interrogated or even disemboweling living victims. Wasn’t the right-wing politician Roberto d’Aubuisson also running death squads in El Salvador?

 

Carolyn Forché: D’Aubuisson was a colonel who was cashiered in the 1979 coup. Later, he became a civilian politician and was a member of the right-wing political party. There's ample evidence of his connections to death squads, but I don't know enough to speak about it. I will say that there's a lot of evidence that he was involved in the murder of Monsignor Romero and that he was a member of a network of death squads. He died of lung cancer at a fairly young age. His name became synonymous with death squads when I was there. When they referred to the death squads, they referred to d’Aubuisson. 

 

But most people wouldn't know who Colonel Chacon was. He wasn't a name on the street because he was operating internationally and clearly under the radar.

 

Robin Lindley: Did you meet Roberto d’Aubuisson?

 

Carolyn Forché: I did not meet him. He was well off, living in the open, and he wasn't a shadowy figure. I saw him in public several times but never talked to him. 

 

Robin Lindley: Your writing about Leonel’s friend Archbishop Romero and his tireless advocacy for the poor is very moving. You met him several times. What was your impression of the Archbishop?

 

Carolyn Forché: Leonel was a good friend of Monsignor Romero and he was also a good friend of Madre Luz, who was the mother superior of the Carmelite Order of Nuns who ran the hospital where Monsignor Romero lived. He had a little house there. Leonel kept that place going for a long time. They were all close friends and Leonel would take me there to meet with Madre Luz at the convent. Leonel introduced me to Monsignor Romero. 

 

Monsignor Romero was very kind. He was a bit shy, very studious, and deeply thoughtful. He had studied in Rome. 

 

As things started to deteriorate and as the killing escalated, one of his close friends, Father Rutilio Grande, a Jesuit, was murdered. Monsignor Romero went to keep vigil with his body and then began to publicly denounce the military regime. He became the only institutional voice against the oppression in the country. He was a very visible public figure, and he saw himself as a shepherd, as a bishop of his people, as someone to stay with his people and keep watch with them and take care of them. Every Sunday he would say mass in the cathedral and his homily would be broadcast all over the country on radio.

 

The right hated Monsignor Romero. He was number one on the death squad hit lists, some of which were printed in the newspapers. Yet he stood up and he denounced the oppression every Sunday. And he read out the names of the dead. He was very compelling. He said yes to the call of that moment.

 

The last time I talked to him, he told me I had to leave the country the next day. I asked if he would leave the country. He said, “No, my place is with my people and your place with yours now.” That was difficult for me to accept, but Monsignor Romero knew what was coming. He knew his time was short. 

 

I also thought he was a saint long before the Vatican acknowledged his sainthood. There was a kind of tranquility about him, even though he felt fear. He talked about feeling fear like any other human being. But he gave his life for his people. He didn't abandon them. I have utmost regard and also love for him, and his loss was a grave one for humanity.

But now we have him among us in spirit.  The people of El Salvador venerated his sanctity long before the Vatican acknowledged it.

 

Robin Lindley: He told you that you should leave the country to be with your own people. I believe you left the next day and, a week later, he was assassinated.

 

Carolyn Forché: On March 16, 1980, he told me that it was important for me to leave. He was assassinated a week later, and I was back in the United States because he asked me to leave. I received a phone call from El Salvador and I was told he had been shot. At first, I didn't know if he was dead, but yes, he was.

And then I went to Washington DC to attend the hearings I mentioned with the House Subcommittee on Inter-American Relations. They were holding hearings on whether or not to support the Salvadoran government economically and militarily. It was the day after the assassination of Monsignor Romero. They couldn't delay the vote even a day. And they voted yes, to support the military.

 

Robin Lindley: What a harrowing time. I'm sure you would have been at the funeral for Archbishop Romero a few days after his assassination if you had stayed in El Salvador. More than one hundred thousand mourners gathered on the cathedral plaza. The funeral turned into a bloodbath when right-wing attackers threw bombs and shot into the huge crowd. Dozens of people were killed or wounded.

 

Carolyn Forché: My husband [photographer Harry Mattison] was there and he took the photographs of the funeral that are now iconic. He has talked to me about it. It was a bloodbath. Just horrific. Interestingly, there were no heads of state or other officials there. It was a poor people's funeral, and it was brutally attacked. 

 

I was not at the funeral but I would've been there if I had stayed in El Salvador. There were a number of people killed. And there’s a really haunting image: there's a kind of plaza outside the cathedral and in the aftermath, after everyone had gone, there were shoes all over the plaza. People had literally run out of their own shoes to get away from the gunfire, and the shoes were strewn all over the plaza. I remember seeing that image. 

 

My husband was taking photographs as the people were struggling to get into the cathedral to escape the gunfire. Finally, he put his camera down and just started lifting people over the barricades to protect them.

 

Robin Lindley: Your husband was a hero, risking his life to save others. I noticed also that your son Sean is a documentary maker. It seems he takes after both mom and dad by combining his own form of witness and photography.

 

Carolyn Forché: You know the expression the apple doesn't fall far from the tree? In the case of Sean Mattison, our son, we joke that he didn’t fall far from the tree. He fell on the tree.

 

Robin Lindley: You must be very proud of Sean. 

 

Our present immigration and refugee issues are rooted in the history you detail of civil war and US intervention in Central America. After your two years with Leonel in El Salvador, you went to virtually every US state and used your voice to describe the oppressive military dictatorship and human rights abuses in El Salvador.

 

Carolyn Forché: Yes. When I came back, I went to 49 states, all except Hawaii, and talked in churches and synagogues and even in Rotary clubs. I was invited to speak because of my book of poems about El Salvador, The Country Between Us. The poems became very well-known because of two newspaper columnists, Nicholas von Hoffman and Pete Hamill, who had written about the book in their syndicated columns. And as a result, my book was known more than a poetry book normally would be. 

 

And I’m glad you brought up the US intervention and the situation of the refugees at the border because that's crucial right now. 

 

We did so much wrong in Central America, including supporting military dictatorships that we knew were brutally oppressing the people. And we knew also that they were stealing from American economic and military aid and that they were also stealing loans that they had received through the Inter-American Development Bank and through other resources that were intended for hydroelectric plants and projects like that.

 

In El Salvador, we were dealing with a corrupt government, and we supported their suppression of an uprising and the deployment of military forces against it. There was a 12-year civil war with us on the side of the military. The military could not win that war because they did not have the popular support to win. It was fought to a draw and it was settled by peace negotiations that were in fact initially arranged by the man I write about, Leonel Gomez Vides. He visited me in the United States [during the war], and he was the one who arranged the first meetings to bring the war to an end. 

 

As part of the negotiated settlements, there were promises made about what was going to happen after the war, and those promises were broken. In the aftermath of the war, the judiciary was not functional. The society began to fall apart. Then extensive money laundering and narco-trafficking took over in El Salvador. There was corruption at every level of society and that created a situation of extreme violence that exists even now. The violence is brutal. It's gruesome. It involves extortion, rape, torture, killing, and mutilation. Ordinary people are being preyed upon and they see terrible things happen to neighbors and those they love.

 

 The refugees are lifting their children into their arms, taking a little bit with them in a little rucksack or something, and running north as fast as they can and with no resources. They don't care what the desert or the border have in store for them. They flee. When you're really afraid, anything you can imagine is better than what you're running away from.

 

 The people who are coming to our border are not migrants looking for some better job. They have no illusions about what awaits them. They are refugees fleeing violence that we in great part created with our support of corrupt, dictatorial regimes. 

 

We are the authors of the chaos that you're seeing now. And we have done nothing to abate it or mitigate it. What we have now are collapsed countries, failed states, not only in El Salvador but in Honduras and Guatemala, and Nicaragua is now becoming a different kind of failed state because of Daniel Ortega's dictatorship. 

 

The whole of Central America is in turmoil and these people are running for their lives. When they get to our now militarized border, they are treated with extreme coldness and hostility. 

We have broken international law by detaining people who are seeking asylum. People used to present themselves at our border, ask for asylum, fill out paperwork, and then they were free to live and work in the United States while their asylum claims went through our court system, and that could take two or three years. If they were denied asylum, they could appeal and that would take a little more time. Now, we've criminalized them counter to international law. We're detaining them and separating their families. That separation was temporarily halted, but is about to resume under the orders of the Trump regime. 

 

And so, we have exacerbated the so-called crisis at our border by our unwillingness to offer needed hospitality and care and comfort to people who we endangered with our policies. It's a refusal of compassion, a refusal of empathy, and a refusal of common decency.

 

Robin Lindley: It seems many Americans do not understand the difference between refugees who are fleeing persecution and other types of migrants. 

 

Carolyn Forché: The media are not helping with that because they call them all migrants or immigrants. Well, no, they are not voluntarily emigrating to another country in order to get a job. We have to understand what refugees are and what they're fleeing, why they're terrified. They're refugees of war and its aftermath and they're asking for asylum. They have the right to ask for asylum and the right to have their claims considered. They have the right to request asylum inside our border and to be allowed to live freely while their claim is being considered.  

Robin Lindley: There seems to be little understanding of our obligation to protect refugees under domestic and international law. That's appalling today.

 

Carolyn Forché: Yes, I agree.

 

Robin Lindley: As you said, these Central American refugees are fleeing from brutal violence. I was just reading that El Salvador’s homicide rate is one of the highest in the world. I think there are a couple of dozen murders a day in San Salvador. 

 

Carolyn Forché: Yes, it is very dangerous now. It's more dangerous now in many ways than it was during the war. It’s beyond chaotic. A person will be asked for money. If they don't pay the money, they are brutally killed and their body parts strewn everywhere so that the next person asked for money will pay. It's horrible, and our policy at the border is making everything worse.

 

We're a nation of immigrants. We should be welcoming immigrants, especially those who are fleeing danger in their own countries. Most especially them. 

 

And we need people here. It's not true, as President Trump said, that America is full, like it's some kind of building with a certain number of hotel rooms. No, we're not a hotel, but a vast part of a vast continent, and we are not full. 

 

People who come here from Central America tend to work in the jobs that no one wants here. They work in agriculture doing stoop labor, like picking strawberries. They also work in the restaurant industry by washing the dishes and bussing the tables. They're all over American suburbs doing the landscaping, the housecleaning, and the babysitting. This is what they're doing.

 

I don't understand the aversion to these refugees, and I don't understand the lack of awareness about how much we need immigrants to come here and establish themselves and also keep this country a little younger demographically. We’re becoming an elderly nation. Why not let in the youth of other countries now?

 

The Administration talks about drugs, but most drugs are not carried by hand through ports of entry over our borders. Drugs are transported in containers, on ships entering our harbors. For some reason, it's all set up so that those containers never get opened or inspected. They come by air, they come by sea, and they come in large quantities. This is an international business that operates like any other corporation, so you're not going to find a lot of drugs on people coming across the border to seek asylum. That's not who's coming. You're going to find single mothers with little children in tow. These are families who are fleeing violence. They don't know what's ahead for them. They don't know what's going to happen to them, but anything is better than going home.

 

Robin Lindley: Thank you for those powerful words. You mentioned that Leonel was involved in the peace process that ended the civil war in El Salvador. What happened to him after you left El Salvador in 1980?

 

Carolyn Forché: He remained in El Salvador for a time, and then was granted asylum in the United States, where he worked tirelessly to influence U.S. policy and to gather the people who would eventually bring the war to an end.

 

Robin Lindley: Have you maintained contacts with people in El Salvador? 

 

Carolyn Forché: Yes, my friends are still there. Many have died, but I am in touch with those who are alive.

 

Robin Lindley: I appreciate your comments on your experiences in El Salvador for the two years with Leonel. Is there anything you’d like to add about how that experience changed you and your writing?

 

Carolyn Forché: I think my experience there changed my life, and therefore my writing.

 

Robin Lindley: What projects are you working on now?

Carolyn Forché: I’m finishing my fifth book of poetry, In The Lateness of the World, which will be published by Penguin Press in 2020.

 

Robin Lindley: What lessons do you hope readers take from your new memoir?

 

Carolyn Forché: I’m hoping readers will be moved by it.

 

Robin Lindley: Thank you very much for your thoughtfulness and generous comments, Ms. Forché. I know that readers will appreciate your insights. And congratulations on your powerful new memoir.

 

 

 

 

When War Was a Family Affair

 

 

There was a time before World War II when family members serving together aboard warships was judged a good thing. An older brother in uniform was a persuasive recruiting poster for his younger siblings, especially when the uniform came with a steady paycheck. During the gloom of the Great Depression, many a young man joined the United States Navy not out of patriotic pride or a desire to see the world, but to put food on his family’s table. One less mouth to feed and five or ten dollars sent home from a monthly pay of thirty-some dollars made a big difference.

 

Harvey Becker left the family farm in Kansas in 1938 when he was almost twenty-two. Younger brother Marvin enlisted a year later, and nineteen-year-old Wesley followed suit a year after that. Wesley dreamed of becoming an artist and had hoped his path would lead to Kansas State, but money was tight. All three Becker brothers requested service together aboard the battleship Arizona and became gunner’s mates assigned to Turret No. 2.

 

Brothers Gordon and Malcolm Shive shared a childhood playing in the sands of Laguna Beach, California, but times turned tough after their father’s early death and the arrival of a cantankerous stepfather. Gordon left home first and joined the Marines. He qualified for a slot in the prestigious Marine Detachment aboard the Arizona and rowed on its competitive whaleboat team. Younger brother Malcolm had a special interest in radios and took that talent into the Navy. By December 7, 1941, Malcolm was a radioman serving on the Arizona with his big brother.

 

The Free family connection aboard the battleship was a father-son affair. Thomas Augusta Free, known throughout his Navy career as “Gussie,” came to the Arizona looking for one last good ship to round out his twenty-year career as a machinist’s mate. It was icing on the cake that his eighteen-year-old son, William, was aboard as a new seaman. Gussie had been absent at sea for much of William’s childhood growing up in Texas and they both relished their time together. 

 

Wesley Heidt was the younger brother of Edward “Bud” Heidt. Wesley had just been promoted ahead of his older sibling and shipmate but Bud didn’t care. He was focused on his girlfriend, Donna. There was nothing official, but her mother surmised there might be an engagement ring coming Donna’s way for Christmas. Bud and Wesley’s own mother had been urging Wesley to write more often. “Don’t worry,” Wesley assured her when he did, “I am safer on this battleboat than I would be driving back and forth to work if I was home.”      

 

Of course, he wasn’t. On the morning of December 7, 1941, there were thirty-eight sets of brothers, including the three Becker lads, serving on the Arizona. Harvey Becker and a few other married men had liberty ashore with their wives. They were the lucky ones. When bombs began to fall a few minutes before 8:00 am, the destruction was horrific and almost instantaneous. Turret No. 2 was at the center of the destruction. Out of seventy-eight brothers, only fifteen survived the attack. Among the dead were Marvin and Wesley Becker, Gordon and Malcolm Shive, and Bud Heidt. Gussie Free and his son also perished. 

 

Masten Ball of Iowa was one of those who survived. He was blown off the Arizona’s deck but somehow escaped the fiery waters largely unscathed. His younger brother, Bill, a promising baseball prospect, was never seen again. Back home in Iowa, the five Sullivan brothers, who were family friends, promptly enlisted to avenge Bill’s death. The Sullivans wanted to serve together and died aboard the cruiser Juneau when it sank off Guadalcanal with only a handful of survivors. The Shives’ younger brother Robert, not yet twelve, took it very personally and tried to enlist. His grieving mother didn’t try to stop him, knowing that kind but firm recruiters would.

 

After these tragedies, the US Navy never absolutely forbade such family service despite a perception among the general public to the contrary. There was no “Sullivan Law” and commanding officers did not separate brothers already serving together. Later in World War II, the Navy, Marine Corps, and Coast Guard permitted the transfer of “sons of war-depleted families” out of combat zones—essentially a sole survivor policy—unless they were engaged in nonhazardous duties. Transfers were not automatic, however, and applications had to be filed by the serviceman or his immediate family. Out of a sense of service, many never took advantage of these provisions.

 

None of this brought solace to those who had made the Navy a family affair and lost brothers on the same ship. For the Beckers, Shives, and so many others, service at sea during World War II really was a family affair. 

 

 

To read more about these families, check out Brothers Down.

Jane Manning James and African American Women in the Mormon Church

 

The Church of Jesus Christ of Latter-day Saints (the LDS, or Mormon, Church) has long faced criticism for its treatment of its black members.  For over a century, men of African descent were not allowed to hold the LDS priesthood, even though that office was conferred on virtually all other male members of the church.  As a corollary, black men and women were not allowed to perform the temple ceremonies that Latter-day Saints considered crucial for reaching the highest degree of exaltation in the afterlife.  In June 1978, LDS Church leaders announced a revelation that ended these race-based restrictions.

 

Understandably, much of the discussion about race in the LDS Church centers on this history of exclusion and the ultimate reversal of the church’s discriminatory policies.  Yet despite its history of racial discrimination, African Americans have been members of the LDS Church since its beginning. How these Mormons navigated life in the church before the 1978 revelation is far more rarely discussed.  Jane Elizabeth Manning James, a free black woman who converted to Mormonism in the early 1840s, provides a little-known vantage point from which to tell a story of Mormonism that takes the church’s racial history into account. A relatively small number of African Americans joined the Latter-day Saints during James’s lifetime, and even fewer made the treks to Nauvoo, Illinois and to Utah’s Salt Lake Valley. Tracing Jane James’s story reveals some of the less-frequently trodden paths sometimes open to nineteenth-century African American women and men and reveals how African American Mormons constructed rich, satisfying religious lives despite the LDS Church’s discriminatory policies.

 

Jane Elizabeth Manning was born in Wilton, Connecticut in the early 1820s.  Her mother had grown up in slavery, but was emancipated before Jane was born. Her father died when she was a young girl, and Manning then went to work for a wealthy, elderly white couple in New Canaan, Connecticut, about six miles from her family’s home. She joined the New Canaan Congregational Church in 1841, but converted to Mormonism a short time later when she heard an LDS missionary preach, and she seems to have brought the rest of her family into the church as well.  

 

In 1843, the Manning family joined an interracial group of converts from southwest Connecticut and headed to Nauvoo, Illinois, where the church was then based.  Jane James later remembered that when they got to Buffalo, New York, the black members of the group were refused passage on the boat that was to take them to Cleveland.  Instead, they walked the seven hundred and twenty-eight miles to Nauvoo. When they got there, Jane Manning worked as a servant in the home of Joseph Smith, the religion’s founder; when he was killed in 1844, she went to work for Brigham Young, Smith’s successor.  She married Isaac James, another black convert, and they moved to Utah with the church. They were in one of the first companies to reach the Salt Lake Valley in 1847.  

 

Jane James lived in Salt Lake for the rest of her life.  She was active in the LDS church and spent a great deal of energy requesting permission to perform the temple ceremonies that she believed were necessary for her salvation and that of her family, but because she was black, her temple access was restricted.  She could be—and was—baptized for her dead family members, one of the three main rituals performed in temples.  But the other two rituals were closed to her: she was not allowed to receive her endowment—the LDS language for participating in the initiation ceremony that all Latter-day Saints are supposed to perform—and she was not allowed to carry out sealing rituals.  Latter-day Saints believe that temple sealings—marriages and adoptions—create family relationships that last for eternity.  According to LDS theology, without these ceremonies, James’s connections to her family members were severed when she died in 1908 and she was unable to reach the highest degree of glory in the afterlife. 

 

Because the temple was mostly unavailable to her and her family, James constructed her religious identity, at least in part, around direct encounters with the divine and through the sense that flowed from these encounters that God was on her side.  James had experiences like this throughout her life. Supernatural healings were one form in which Jane interacted with the divine.  For example, in 1896, James told a gathering of LDS women about healing herself. The secretary who reported on the meeting put it this way: “Sister Jane James bore a faithful testimony and said she had been terribly afflicted in her head, and she took her consecrated oil and anointed herself and she was healed.  Felt that that was faith, and praised the Lord for her blessings.”

 

James also received visions from the Holy Spirit.  The most dramatic episode was her experience of doing the Smith family’s laundry shortly after being hired as a domestic servant in Nauvoo. “Among the clothes I found brother Joseph’s Robes,” James recalled in her autobiography. “I looked at them and wondered, I had never seen any before, and I pondered over them and thought about them so earnestly that the spirit made manifest to me that they pertained to the new name that is given the saints that the world knows not of.  I didn’t know when I washed them or when I put them out to dry.”  The “new name” that James mentioned was a reference to the temple endowment ritual, suggesting that although temple ceremonies were supposed to be secret, she received information about them directly from God.

 

Perhaps the most frequent charismatic experience in James’s life was speaking in tongues, a practice that was very familiar to early Mormons.  James’s first recorded instance of speaking in tongues was shortly after her conversion.  In her autobiography, James recalled, “About three weeks after [baptism] while kneeling at prayer the Gift of Tongues came upon me, and frightened the whole family who were in the next room.”  For James, this experience confirmed her decision to join the LDS Church.  Apart from this first one, James’s recorded experiences of speaking in tongues occurred in social settings where their value in encouraging and comforting the Saints was clear.  

 

In seeking and valuing charismatic experiences like these, James was very similar to many members of the LDS Church she joined in the 1840s.  James’s encounters with the divine allowed her to fit in with other early Mormons and to construct a religious identity that affirmed the proposition that God interacted actively with humans in ways that echoed Joseph Smith’s own experiences.  But although she found validation for her religious experiences in the LDS Church, racism still constrained James’s religious life: what blackness meant, theologically, socially, and politically, was a moving target during James’s lifetime. Attending to Jane James’s religious experience in the LDS Church helps us see that questions of race vexed the institution and its members throughout the nineteenth century and beyond.    

 

 

For more on Jane Manning James and African Americans in the LDS Church, check out Dr. Newell's book.

 

 

 

The Activist Origins of Mother's Day

Murray Polner, who writes book reviews for HNN, is the author of “No Victory Parades: The Return of the Vietnam Veteran” and “When Can We Come Home? A Debate on Amnesty for Exiles, Anti-War Prisoners and Others.”

 

 

After the carnage of the Second World War, the members of the now defunct Victory Chapter of the American Gold Star Mothers in St. Petersburg, Florida, knew better than most what it was to lose their sons, daughters, husbands and other near relatives in war. “We’d rather not talk about it,” one mother, whose son was killed in WWII, told the St. Petersburg Times fifteen years after the war ended. “It’s a terrible scar that never heals. We hope there will never be another war so no other mothers will have to go through this ordeal.” But thanks to our wars in Korea, Vietnam, Grenada, Panama, the Gulf War, Iraq and Afghanistan, not to mention our proxy wars around the globe, too many Moms (and Dads too) now have to mourn family members badly scarred or lost to wars dreamed up by the demagogic, ideological, and myopic. 

 

But every year brings our wonderful Mother’s Day. Few Americans know that Mother’s Day was initially suggested by two peace-minded mothers, Julia Ward Howe, a nineteenth century anti-slavery activist and suffragette who wrote the “Battle Hymn of the Republic,” and Anna Reeves Jarvis, mother of eleven, who influenced Howe and once asked her fellow Appalachian townspeople, badly polarized by the carnage of the American Civil War, to remain neutral and help nurse the wounded on both sides.  

 

Howe had lived through the Civil War, which led her to ask a question that’s as relevant today as it was in her time: “Why do not the mothers of mankind interfere in these matters, to prevent the waste of that human life of which they alone bear and know the costs?” Mother’s Day, she insisted, “should be devoted to the advocacy of peace doctrines.” Howe soon moved beyond her unquestioned support for the Union armies and became a pacifist, opposed to all wars. “The sword of murder is not the balance of justice,” she memorably wrote. “Blood does not wipe out dishonor, nor violence indicate possession.”

 

Though not a mother, my favorite female opponent of war and imperialism was the undeservedly forgotten poet and feminist Katharine Lee Bates who wrote “America the Beautiful” as a poem in 1895, which is now virtually our second national anthem for all Americans, left, right and center. The poem I love best is her “Glory,” in which an officer heading for the front says goodbye to his tearful mother.

      

       Again he raged in that lurid hell

       Where the country he loved had thrown him.

       “You are promoted!” shrieked a shell.

       His mother would not have known him.

 

More recently there was Lenore Breslauer, a mother of two, who helped found Another Mother for Peace during the Vietnam War and also helped coin their memorable slogan: “War is not healthy for children and other living things.” Years later I came to know three mothers named Carol (Adams, Miller and Cohen, plus my wife Louise) who formed Mothers and Others Against War to protest President Jimmy Carter’s absurd resurrection of draft registration. They stayed on to battle Ronald Reagan’s toxic proxy wars in Central America.

 

On this Mother’s Day we could use more anger and dissenting voices of many more women of all political stripes to protest the needless and cruel sacrifice of their sons, daughters, wives and husbands as cannon fodder, as Russian mothers did in protesting Moscow’s invasions of Afghanistan and Chechnya. In Argentina and Chile, mothers and grandmothers marched against U.S.-supported torturers and murderers during the late seventies and early eighties. And in this country, the anti-war movement has often been led by women who no longer believe “War is a glorious golden thing…invoking honor and Praise and Valor and Love of Country”—as a bitter, disillusioned and cynical Roland Leighton, a WWI British combat soldier, wrote to his fiancée, Vera Brittain, the great British anti-war writer.

 

Sadly, on Mother’s Day yesterday, today, and in the years ahead, peace and justice seem further away than ever. How many more war widows and grieving families do we need? Do we need yet another war memorial to the dead in Washington? More bodies to fill our military cemeteries? More crippled and murdered soldiers and civilians so our weapons manufacturers' stock prices can rise? Do we really need to continue disseminating the myth, and lie, that an idealistic America always fights for freedom and democracy? 

 

In Vietnam, Korea, the Middle East, and elsewhere, more than one hundred thousand American men and women have been killed or grievously harmed in our endless wars, not to mention several million Asians and Middle Easterners, including Israelis and Palestinians. Do enough Americans care? They all had mothers.

Mothers and Food Aid from World War One to Today

 

 

Before Mother’s Day in 1918 you might have heard this chant if you were walking in New York City: “Ten cents for a Belgian baby. Forget-me-nots for Belgian babies.” Volunteers were selling Forget-me-not flowers to raise funds for the Belgian Babies Fund. The Girl Scouts invaded the financial district, according to the New York Tribune, to get people to buy the flowers. Just by buying a Forget-me-not you were giving a Belgian mother the best Mother’s Day gift: food to save the life of her baby.   

Little children were starving to death in Belgium because of World War One. Germany occupied Belgium and the fighting had caused food shortages. The fundraising for Belgian babies was nation-wide throughout 1918. The Cincinnati Enquirer urged sales of the flower, writing, “It is appropriate and fitting that the fund for Belgian babies is to be swollen by the sale of forget-me-nots. That dainty little flower is perennial, conveying the thought that our loving remembrance of the waifs of Belgium will not be sporadic and uncertain, but, like the flower itself, recurrent and constant.” Every flower purchased was food for a hungry Belgian child.

 

In St. Louis, 12-year-old Josephine Windy had a special reason for volunteering. She was born in Belgium. Josephine told the St. Louis Post-Dispatch, “I want to volunteer my services to aid the Belgian babies. Two of my cousins live in Belgium and their parents have died since the war began. I have an uncle now fighting in France.” Josephine waited at the Mayor’s office to get his donation and kick off a day of fundraising in St. Louis to save more babies from malnutrition. Donations from America rescued many from starvation in Belgium and Europe. A whole system of food aid, led by Herbert Hoover, was developed to fight the famine caused by World War One.   

We must remember that the horror Belgian mothers felt, seeing their babies dying from malnutrition, still exists today. The scenes of despair have moved to other war-stricken nations like Yemen, South Sudan, Syria, Afghanistan, and Mali. As you read this, the World Food Program, UNICEF, Save the Children, Mercy Corps, World Vision and other relief agencies are trying to get life-saving food to children. In Yemen 85,000 kids have died of hunger and disease caused by the civil war there, according to Save the Children. There is a race against time to save millions of others from this fate.   

Tragically, funding is not able to keep pace with the massive hunger emergencies. We can and must do better. The Yemen hunger crisis is so large that 20 million people are food insecure in the impoverished country. That is about 70 percent of Yemen's population. 

 

In Yemen, Afghanistan, or anywhere else, how can we expect peace to emerge when children are starving and malnourished? 

The Rhode Island non-profit Edesia has been producing special foods for infants in Yemen called Plumpy’Nut and Plumpy’Sup. This enriched peanut paste saves infants from deadly malnutrition. Edesia works around the clock to produce this food so it can be sent to relief agencies in Yemen, South Sudan, Sierra Leone and other desperate nations. 

 

A child’s treatment with Plumpy’Nut costs about 50 dollars. For that small amount you can save a life, just as people did when they bought flowers for the Belgian Babies Fund. 

There are millions of mothers across the world right now desperate to save their children from malnutrition. On Mother’s Day, and every day, we should take action to help them. We should feed every hungry child in the world. Like the Forget-me-not flower, our feeding of starving children everywhere should be "recurrent and constant."

 

The Remarkable History of the Union League Club

Edward Lamson Henry's Presentation of Colors, 1864, depicts the outfitting of two African-American regiments  at the Union League Club of New York's first clubhouse on 17th Street, facing Union Square.

 

Today, as we observe the dismay of Princeton students at their university’s legacy of slavery, and the Trump administration’s increasingly hostile attitude toward Mexico, it is important to recognize that there were moments in our history when both African-American freedom and Mexico’s independence were addressed in a positive way. Careful readers of history are familiar with Henry David Thoreau coupling the two issues in his essay Civil Disobedience: “This country must cease to hold slaves and wage war on Mexico.” But few remember the pivotal role played by a small social club in New York City at a critical moment in US history. 

The modest building on the corner of Park Avenue and 37th Street, with its red brick and limestone façade, houses an impressive library, dining and meeting rooms, a traditional bar, and a wide array of art and collectibles. For many years it was considered a gentlemen’s club with a unique group of movers and shakers in the city of New York, but women have been part of the scene since the Sixties. It is still exclusive, however, by member nomination and invitation only.

Founded in 1863, at the height of Confederate advances against the North, the club set out to unite a group of influential leaders in the community in support of President Lincoln and the Union. It consisted of businessmen, newspaper editors, brokers, and professionals who were committed “to resist to the uttermost every attempt against the territorial integrity of the Nation.”

Frederick Law Olmsted, the landscape architect who co-designed New York’s Central Park, was one of the founders. Some of the early members included J.P. Morgan, William Cullen Bryant, and Ulysses S. Grant. In addition to believing that a strong federal government was a necessity for a prosperous nation, they were abolitionists and felt that slavery could no longer be tolerated in America.

The club could not have been formed at a more critical time. A month before its founding, Lincoln had signed the Emancipation Proclamation and there was a widespread belief among immigrant workers that the newly-freed slaves would take away their jobs. Five months later, the infamous Draft Riots broke out in New York City and among the targets of the vandals were the Colored Children’s Orphanage and the newly-founded Union League Club.

The club members were steadfast, however. Undismayed by the attacks, they ran off the rioters, then organized and funded a regiment of US Colored Troops. The regiment joined the many white regiments that New York had already enlisted in the Union cause. These new soldiers would be professionally trained in tactics, weaponry, and marksmanship on Rikers Island. 

By February of 1864, the fully equipped and trained 20th Regiment of the US Colored Troops marched south. Composed of free blacks, they were no rag-tag group of reluctant enlistees. They were a highly motivated group of volunteers who heeded the call to make the emancipation of their brothers in the South a practical reality. Later, two more regiments were formed, the 26th and the 30th. The regimental steward later became the first black physician in Athens, Ohio. Benjamin Randolph, the first black officer in the regiment, went on to serve with the Freedmen’s Bureau and became a state senator.

But the story, as interesting as it is, doesn’t end there. After the Civil War was over, the US was faced with a situation that could no longer be ignored: French and Austrian troops occupied Mexico, and an Austrian archduke called himself the Emperor of Mexico. The US had withheld active support for the exiled President of Mexico, Benito Juárez, during the Civil War for fear France might join the Confederacy and the US would be fighting a war on two fronts. 

When the Union Leaguers invited General Joe Hooker, hero of Antietam and Williamsburg, to accept a gold presentation sword in June of 1865, Mexican envoy Matías Romero persuaded him to speak to the club about this crisis. In his acceptance speech, reported in the newspapers of the day, General Hooker assured the members that, while the resources of our country would likely deter any incursion on US soil, “we must take care, however, that a continent designed to vindicate the wisdom of republican institutions is not encroached upon.”

Despite the flourishes of 19th-century prose, the message was clear to the businessmen and potential supporters of the Mexican cause whom Romero hoped to convince. In the days that followed, members of the Union League Club would invest heavily in Mexican bonds and would find other practical ways to help that country rid the Americas of occupying European armies. By 1867, the young Romero would raise over $18 million, the Juárez army would triumph over the French and Austrians, and the Mexican Republic would be restored.

George Mason: Lost Founder

 

America was woven together by three revered pieces of political paper: the Declaration of Independence, the Constitution, and the Bill of Rights. George Mason’s intellectual potency played a decisive role in shaping and producing all three documents, which leads to an inevitable conclusion: George Mason deserves careful and renewed focus. 

 

With war already underway in the spring of 1776, Mason assumed a leadership role in the Virginia Convention as Fairfax County's delegate to Williamsburg. Five days after convening in May 1776, the Virginia Convention formed a committee to draft a bill of rights and a new constitution. Mason was appointed to the committee on his first day at the convention. It was in this committee that Mason drafted his most famous contribution to American history: the Virginia Declaration of Rights, the antecedent to the Declaration of Independence and the Bill of Rights. In fact, Mason’s prolific words flowed first, before those of Jefferson, Madison, Hamilton, or Washington, and famously read:

That all men are created equally free and independent and have certain inherent natural rights among which are the enjoyment of life and liberty…and obtaining happiness and safety.

 

Mason’s draft was approved by the Convention on June 12, 1776, and quickly printed in The Virginia Gazette, reprinted in The Pennsylvania Gazette and in Williamsburg newspapers, nearly a full month before Jefferson’s draft of the Declaration of Independence. Mason’s Declaration of Rights was disseminated and reprinted all over America and beyond the seas. While Jefferson resided in Philadelphia, he had an obvious preoccupation with the ongoing events at the Virginia convention. Throughout late May and early June 1776, couriers moved back and forth between Williamsburg and Philadelphia, carrying drafts for the bill of rights to the convention. 

 

Mason had already completed his Declaration of Rights when John Adams, Benjamin Franklin, Thomas Jefferson, Richard Henry Lee and Edmund Pendleton were all in Philadelphia struggling to compose the “original” Declaration of Independence. No doubt they read Mason’s completed manuscript, which was in the hands of Richard Henry Lee, and later saw it reprinted in the newspapers. Mason’s Declaration of Rights was personally handed to Jefferson in manuscript form by Lee, who had received it from his brother, T.L. Lee, in late May 1776. Jefferson's first draft of the Declaration, which has never been found, seems to have been extremely similar to Mason's Declaration of Rights. Both Franklin and Adams, who were on the committee with Jefferson in Philadelphia, later prepared bills of rights for their respective states. Yet neither of them adopted Jefferson's version of the Declaration. Pennsylvania's Bill of Rights of September 28, 1776, also used Mason’s language:

All men are born equally free and independent, and have certain natural, inherent and inalienable rights, amongst which are, the enjoying and defending life and liberty, acquiring and possessing and protecting property, and pursuing and obtaining happiness and safety. 

 

Thus, the historical case can be made that George Mason should be fully credited with the original draft of what ultimately became the famed Declaration of Independence. One could argue that, for the most part, Jefferson smoothed, edited, and pruned Mason’s language from the Virginia Declaration of Rights, written three weeks before Jefferson’s final product. Did Jefferson plagiarize Mason’s work? Absolutely not. Did he borrow heavily in constructing his own document to famous effect? Yes. Jefferson condensed and eloquently expressed Mason’s language, ultimately making the document his own while retaining its essential content. Both documents boldly proclaimed that all men were born free and that the duty of government was to protect their safety, liberty and happiness. Jefferson would later write in his autobiography that Mason’s “elocution was neither flowing nor smooth, but his language was strong, his manner most impressive, and strengthened by a dash of biting cynicism when provocation made it seasonable."

 

Mason’s Declaration of Rights combined a succinct statement of the republican principles that underlay the Revolution with a smattering of constitutional doctrine designed to protect individual civil liberties. Its opening paragraphs throb with a richer emotion than those of any other public document Mason ever wrote. Mason’s second article confirmed that magistrates derived their powers from the people, and in a third article Mason asserted the people’s “indubitable, inalienable and indefeasible Right to reform, alter or abolish” any government that failed to provide “for the common Benefit and Security of the People, Nation, or Community.” Mason’s fourth article repudiated the notion of a hereditary aristocracy. “The Idea,” he wrote, “of a Man born a Magistrate, a Legislator, or a Judge is unnatural and absurd.”

 

History has consigned Mason to the second tier of historical significance. Quite simply, given the length and breadth of Mason’s political writings and influence, his name should be more recognizable in the public domain and mentioned in the same conversation with Thomas Jefferson, James Madison and Alexander Hamilton. Although some have largely dismissed Mason as a man who simply refused to sign the Constitution (he was one of three men who refused), this is not a historically accurate portrayal of the man. He refused to sign the final document because he believed it sanctioned human slavery and omitted the rights of individuals. He outlined his refusal in his “Objections,” written at the Convention and later read in every town and village: "There is no Declaration of Rights." He carried his struggle for a federal bill of rights to the people and lived barely long enough to see his efforts crowned with congressional victory: the monumental Bill of Rights.

 

While few Americans know Mason today, in his own time and place his contemporaries grasped at superlatives to describe the Virginian. Madison declared that “Mason possessed the greatest talents for debate of any man I have ever seen or heard speak.” Patrick Henry pronounced him “the greatest statesman I ever knew.” Jefferson complimented his mind as “great and powerful.” Philip Mazzei, the Florentine physician and world traveler, wrote, “he is not well enough known. He is one of those brave, rare, talented men who cause nature a great effort to produce.” The Italian ranked Mason as one of the intellectual giants: “[Mason] is one of those strong, very rare intellects, which are created only by a special effort of nature, like that of a…Machiavelli, a Galileo, a Newton…and so forth.”

 

Although George Mason lived and wrote almost 250 years ago, his ideas are especially relevant to present-day America. Mason’s true importance in the 21st century is to be found in the cauldron of his ideas: the rights of the human spirit, life and liberty. His constitution-making shattered the old myth of divine right, replacing it with vox populi, vox dei, and proved that people could manage their own affairs. Mason understood more than most that necessary powers must be given to a government, but that the price of increased power was decreased liberty. Mason’s political works were the creation of a democratic genius of mind and heart. He was the instrument for expressing, in one brief document, the concentrated resolution of a nation: the causes, the motives, and the justification of individual liberty against tyranny. Mason speaks to us now because he spoke so powerfully in his time. His life and writings were directed by the epic issues of the Revolution, issues similar to those that confront us today: individual rights, governmental power and warfare.

 

The historical case can thus be made for Mason's elevated public fame, placing him among the more famous American Founders for both civil rights and freedom of religion. George Mason rightly deserves to be considered one of the fathers of our national government. 

 

 

In Dalmatia, Distant Pasts Influence the Present

 

Danijel Dzino is a Lecturer in the Departments of Ancient History and International Studies (Croatian Studies) and a member of the Ancient Cultures Research Centre at Macquarie University. Danijel was born within the ancient borders of the Roman civitas Daesitiatum, in provincia Dalmatia. He received his PhD in Classics at the University of Adelaide. He was a Visiting Research Fellow at the University of Adelaide until moving to Macquarie University to start work on his ARC postdoctoral grant. He is the author of three books, among them Becoming Slav, Becoming Croat: Identity Transformations in Post-Roman Dalmatia (2010) and Illyricum and Roman Politics 229 BC - AD 68 (2010). 

 

 

What books are you reading now?

 

It is very difficult to find the time for reading. Do not get me wrong – I read all the time – but my reading consists of mining for useful information or interesting opinions which could be used in research, rather than reading the book from cover to cover. The last book I can remember that I read from cover to cover was Chris Wickham’s Framing the Early Middle Ages: Europe and the Mediterranean, 400-800.

 

What is your favorite history book?

 

There are a few. Your readers would probably be familiar with the Mémoires d'Hadrien by Marguerite Yourcenar. However, my favorite history books were, as far as I know, never translated from Croatian into English. These are Ivan Aralica’s Morlak Trilogy (Travel Without Dream, Souls of the Slaves, The Builder of Inns) from the 1980s. These three books describe fictional individual destinies of people living in the Early Modern frontier-zone between the Venetian Republic and the Ottoman and Habsburg Empires in modern Dalmatia, Herzegovina and Bosnia. I also like pseudo-historical fantasy like The Lord of the Rings or A Song of Ice and Fire.

 

Why did you choose history as your career?

 

I wanted to be a botanist first, but later in life I found a way to connect my obsession with Dalmatia, where I spent summer holidays as a kid and teenager, with my professional interests. Where I was born (Sarajevo in Bosnia and Herzegovina), the ancient past is usually well hidden underground. The medieval past rarely makes an appearance, and then only in places that are usually difficult to access, while the earliest period visible to the observer is the Ottoman era. However, in Dalmatia, the ancient, medieval and early modern pasts are inextricably intertwined, and visible on every corner. Walking through Diocletian’s Palace in Split, or the Old Town of Zadar, for example, provides a unique experience of the past as an integrated multi-dimensional entity which still impacts the present in very particular ways. 

 

What qualities do you need to be a historian?

 

Patience, persistence, focus, and the capability to process huge amounts of information. The historian must be conservative and innovative at the same time by respecting the work of past scholars but also daring to see the things that predecessors were not able to see.

 

What is your most memorable or rewarding teaching experience?

 

The most rewarding experience was the development of my undergraduate unit Archaeology of Dalmatia, which dealt with the Dalmatian past (in the sense of the Roman province) and its material record from the Iron Age to the High Medieval era. 

 

What are your hopes for history as a discipline?

 

My hope is that history preserves its dignity – as a university subject but also as a field of research. Modern Western universities in Anglophone countries connect everything with money. The imperative is to have more students and win more research grants. For that reason, it is thought necessary to keep student numbers up at any cost, simplify the curriculum, and keep the students happy and well entertained, which in my opinion underestimates the intellectual capabilities of younger generations and degrades the profession. Another of my hopes is that history as a field of research moves away from the postmodern deconstruction of historical grand narratives toward the building of new historical narratives informed by postmodern criticism. Postmodernism in history was necessary, but it has played its role and now is the time for historians to move on to something else.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

Unfortunately, I do not have many collectible books – my real hobby is collecting postage stamps, which are sometimes indeed historical artifacts. This does not mean that I do not have plenty of scanned rare history books on my laptop, my favorites being the volumes of Daniel Farlati’s monumental 18th-century masterpiece Illyricum Sacrum, which I read quite often. I often buy small copies of Croatian medieval inscriptions, like the Tablet from Baška inscribed in the Glagolitic script, or models of early medieval churches, such as my favorite building, St. Donatus in Zadar. 

 

What have you found most rewarding and most frustrating about your career? 

 

There are many rewarding experiences in my career. The most rewarding is certainly research, including the archaeological excavations at the Bribirska glavica site near Skradin in Croatia, where I have participated as one of the co-directors for the last six years. This is a multi-period site with habitation stretching from ca. 1000 BC to the 18th century, extremely rich in research potential.

 

There are also a few frustrating things, especially the increasing bureaucratization of academic jobs and the need of universities to regulate every minor thing related to research and teaching. There is even a whole new bureaucratic language to learn – we are no longer writing research publications but ‘producing research outcomes’, journals and books have become ‘publication outlets’, etc. Administration at universities is multiplying like bacteria, and an immense amount of administrative work is eating up energy that could be used for research and teaching. This creates a paradox: successful bureaucrats in academia, at least in the humanities, can today climb the academic ladder much faster than successful lecturers or innovative and productive researchers. 

 

How has the study of history changed in the course of your career?

 

It has changed quite a lot, actually, especially in the last ten years. Today, through digitalization, we have an immense quantity of old and new literature and sources available in a matter of seconds. This provides unlimited opportunities for the researcher or student of history to gain a much deeper understanding of the field of study and produce work much faster.

 

What is your favorite history-related saying? Have you come up with your own?

 

It is not technically a history-related saying but rather a historiography-related one: Slavica non leguntur (“The Slavic languages are not read [world-wide]”), which symbolizes the legitimized ignorance of local Slavophone historiographies by Anglophone, Francophone, and Germanophone scholarship in the past and (unfortunately) also in the present.

 

What are you doing next?

 

I am currently working on the manuscript of a book that will discuss the making of the Middle Ages in Dalmatia, from Justinian’s reconquest in the sixth century to the rule of the Croat duke Branimir in the late ninth century. 

Protesting with her Feet: The World’s Fastest Middle-Distance Woman versus Sports Governing Bodies

 

 

Last Friday, May 3rd, the South African middle-distance runner Caster Semenya won the 800m at the Diamond League competition held in Doha, Qatar. Her thirtieth straight win in the 800m came only forty-eight hours after the Court of Arbitration for Sport (CAS) – a quasi-judicial body headquartered in Switzerland – rejected her appeal alleging unfair discrimination by the International Association of Athletics Federations (IAAF). The IAAF had ruled that female runners with elevated testosterone levels be required to take testosterone blockers in order to compete in races between 400m and the mile. These new rules took effect this Wednesday, May 8th. While acknowledging that the ruling discriminated against female athletes with naturally elevated levels of the hormone, the CAS thought it necessary to ensure “fairness” and the creation of a level playing field so that females with normal testosterone levels would not be competitively disadvantaged.

 

Some senior athletics officials are gratified at this legal vindication of their position. IAAF president Lord Sebastian Coe – the British former 1500m and 800m Olympic medalist whose unforgettable middle-distance running contrasts with his soporific performance as a Conservative parliamentarian – embraced the ruling. It was “straightforward,” he said, confirming the governing body’s traditional gender categorization between men and women athletes. In contrast, Semenya’s lawyer Patrick Bracher argued that the ruling was unfair because its consequences remain unclear and because the body that made the rule is the same body deciding how it will be applied.

 

This controversy stretches back over a decade. In 2012, Dutee Chand – the daughter of weavers, born in Odisha, India – triumphed in the 100m for under-eighteens. In 2014, she medaled at the Asian Junior Athletics Championships. Despite the success of this wonder teenager, she was dropped from the Indian team out of concern that possible hyperandrogenism flouted the rules of sports governing bodies against naturally elevated hormone levels giving some female athletes an advantage over others. After she appealed her case, the IAAF policy on naturally elevated hormone levels among female athletes was suspended. After returning to the track, Chand competed in the women’s 100m at the 2016 Rio Olympic Games, although she did not progress to the medal rounds.

 

Dutee Chand’s fellow athlete from the Global South has faced a similarly arduous battle with sports governing bodies. Born in Ga-Masehlong, South Africa, in 1991, and educated at the University of Pretoria, Caster Semenya exploded onto the world athletics scene at the 2009 World Championships in Berlin, where she won the 800m at eighteen years old. Since then, she has won gold at the 2011 and 2017 World Championships, the 2012 and 2016 Olympic Games, the 2015 African Games, and the 2018 Commonwealth Games. During a decade of remarkable track success, Semenya has been under investigation by the IAAF as well as the frequent target of hostile media commentary.

 

Much of this negativity stems from ignorance of her physical condition, known as hyperandrogenism, in which the body naturally produces elevated levels of testosterone. This can increase endurance as well as muscle mass. Some 5-10 percent of women are reputedly affected by naturally elevated hormone levels. But the jury remains out on the impact of such levels on athletic performance. It is not at all clear that Caster Semenya’s performance is the consequence of her hyperandrogenism. Her advantage could derive from other factors such as intense training, total commitment, and superior coaching rather than biological difference. It is these latter factors, for instance, that Dutee Chand cites as explanations for her athletic prowess.

 

The argument for regulation is clear, according to Lord Coe. But the issue is obviously more complex. The CAS ruling was by majority, not unanimous, and some members are clearly uncomfortable with the decision. Moreover, medical science remains unclear on the precise nature of the impact that raised levels of testosterone have on athletic performance. It is clear that steroids do enhance athletic performance. But such supplements are very different from the natural process by which the body produces its own hormones, as opposed to changes induced by external drug use. 

 

Another argument in support of this ruling is that Semenya is a great athlete and that, after taking the testosterone blocker, she will have an opportunity to prove how great she is by continuing to be a world-class athlete. This argument, however, overlooks the vital point that she should not be obligated to modify her natural body, since her condition is not drug-induced but natural. Indeed, it is not up to her to prove her doubters wrong, but rather for her doubters to arrive at a more sensible, fairer, and judicious way of treating her and other athletes with naturally elevated hormone levels. It is the responsibility of the IAAF to draw on real science to distinguish between those who take performance-enhancing drugs and those whose physiologies produce elevated hormone levels naturally, through no fault of their own. 

 

The CAS decision that she can compete only if she lowers her testosterone levels is also an attack on the middle-distance runner’s dignity as well as her human rights. Semenya has the right to earn a living at something she obviously excels at without being regulated by an official body that wants to deny her the right to earn her living as well as to be who she wants to be. The irony is that the IAAF wants to regulate the human body when sport is supposed to test the body’s natural abilities. Indeed, this gifted sporting woman has played by the rules up until this point only to be told that she can no longer do so. 

 

There is an important ethical component to this argument against the IAAF’s ruling and its upholding by the CAS. The ruling asks an athlete at the top of her game to no longer be at the top of her game. Caster Semenya has been tremendously successful for over a decade. She is earning a good living, is content, and has a powerful impact on young girls and women around the world.

 

What this ruling does is to effectively deny her the right to earn her living as well as the right to pursue her happiness in what she loves to do. 

 

One English pundit claimed the ruling “protect[ed] the integrity of the sport.” The gatekeepers of the CAS share traditional notions of competition, fairness, sexual classification, hormonal balance, and so forth that are now being challenged in a number of important ways. Science tells us that a small but important number of women naturally retain elevated levels of testosterone. This challenges traditional hormonal definitions of male and female. Fairness remains an important ideal in the athletic arena, but so does the fight against discrimination against individuals. Semenya, much like Chand, is discriminated against in this ruling.

 

There are several preferable alternatives to this misguided ruling and the attitudes upon which it is based. First, athletes with naturally elevated hormone levels should not be required to take drugs that alter their natural bodies; the May 8th ruling should be withdrawn immediately. Second, we should figure out the science before we assume a direct correlation between naturally elevated hormone levels (as opposed to drug enhancement) and improved performance. Third, we should apply new rules at the beginning, not the middle, of world-class athletes’ careers. Fourth, we should diversify these regulatory bodies to reflect modern changes in sporting competition, physiology, gender, sexuality, and global origin. Finally, sports governing bodies should celebrate the natural abilities of Chand, Semenya, and other women athletes rather than bewail and prosecute them for their natural advantages.

 

Caster Semenya is an inspiration to young female athletes around the world. She follows in the history-making footsteps of thousands who protested injustice: fugitive slaves who voted with their feet by escaping Southern US slavery; Civil Rights Catholic protesters who marched against Unionist persecution in Northern Ireland; and, fellow South Africans whose long walk to freedom overthrew racist apartheid. She should attract the attention and admiration of all of us who feel compelled to shout out whenever acts of discrimination are being legislated in the name of so-called fair play. Long live Caster’s protesting feet.

The Loss of Republican Principle

 

In the mid-1970s, America faced an impeachment crisis under President Richard Nixon. A lawless President who had abused power and obstructed justice was creating a constitutional crisis that presented his party, the Republican Party, with a dilemma: how should it react?

 

In 1974, the Republican Party was led by three men: Senator Barry Goldwater of Arizona, the 1964 GOP Presidential nominee; Senate Minority Leader Hugh Scott of Pennsylvania; and House Minority Leader John Rhodes of Arizona. All three had been supportive of much of Richard Nixon’s domestic and international agenda and all three wanted to support their party and its principles. 

 

But when it became clear that Richard Nixon had abused power and obstructed justice, the three men consulted with their fellow Republicans in both houses of Congress and decided Nixon had gone too far and was a threat to constitutional order and the rule of law. The tipping point for these three leaders was the move by the House Judiciary Committee, beginning on July 27, 1974, to adopt three articles of impeachment. Seven House Republicans joined the majority Democrats in charging Nixon with obstruction of justice, abuse of power, and contempt of Congress. This came three days after the Supreme Court unanimously ordered the President to hand over the Watergate tapes sought by Special Prosecutor Leon Jaworski. The 8-0 vote included three Justices appointed by Nixon. In response, Nixon released the so-called “smoking gun” tapes days later, on August 5.

 

At noon on August 7, Goldwater, Scott, and Rhodes went to the White House and informed the President that he had lost support among his own Republican colleagues. He would be unlikely to gain more than 15 Republican votes against conviction in an impeachment trial. It was time for him to resign and allow Vice President Gerald Ford to assume the Presidency. With the loss of the support of these leaders, especially Goldwater, whom Nixon had always highly regarded for his strong principles and ethics, Nixon saw no way out other than to resign.

 

No one in their right mind would have thought that Nixon, with his combative personality, would ever consider resigning. Nonetheless, Nixon fully understood he had to do what was proper for the nation and for the institution of the Presidency. Nixon resigned and delivered his farewell speech on August 9, 1974. 

 

Now, 45 years later, some believe history is repeating itself. After nearly two years of investigation and the release of Robert Mueller’s redacted report, many believe Donald Trump has besmirched the office of the Presidency. Many worry Trump threatens the dignity, prestige, and respect that the American Presidency has commanded for over 230 years. Yet, unlike 45 years ago, Republicans as a party seem unwilling to abandon the President. 

 

Senator Mitt Romney of Utah, the 2012 GOP Presidential nominee, made a strong public statement condemning the behavior and actions of Donald Trump, but he did not indicate any willingness to go beyond the statement.  He issued a sharp rebuke of Trump after the release of the Mueller Report, saying he was appalled by the “extent and pervasiveness of dishonesty and misdirection” of people around the President and Trump himself. Romney was also alarmed by the extent of the Trump campaign’s willingness to accept help from Russia and called the scandal an abandonment of the goals of the Founding Fathers. However, he expressed relief that the evidence against Trump was not substantive enough to justify charges of obstruction of justice or any other crimes.

 

For his statement, he has been bitterly attacked by Mike Huckabee, the former Arkansas Governor, Presidential candidate, and father of White House Press Secretary Sarah Huckabee Sanders. Huckabee said he was sickened that Romney might have been President and reminded the public that Romney had once sought a cabinet appointment from Trump. 

 

Few others have reacted to Romney’s criticism of Trump or to the allegations of the Mueller Report. Senate Majority Leader Mitch McConnell of Kentucky continues to support Trump and his agenda. House Minority Leader Kevin McCarthy of California is also unwilling to hold Trump accountable in any fashion. If anything, he is more subservient to Trump than former Republican leader and Speaker of the House Paul Ryan was in the first two years of the Trump Presidency. Sadly, Romney, McConnell, and McCarthy, in positions similar to those of Goldwater, Scott, and Rhodes in 1974, have prioritized their party over the public interest. 

 

In so doing, they are destroying the Republican Party’s historical tradition of great Congressional leadership in favor of their conservative agenda. Just as the earlier generation of Goldwater, Scott, and Rhodes is paid tribute in American history, the new generation of Romney, McConnell, and McCarthy will be condemned in the long run of history.

Senator Grassley says the New Deal "didn't work"; historians have other ideas
The Political Uses of the Past Project collects statements by elected and appointed officials and sends a select few of those out to historians for comment. Additional checks and more about the effort can be found on the project's home page.

Sen. Charles Grassley: "The New Deal in the 1930s didn't work. It didn't get us out of the Great Depression"

I would like to make a point about the so-called Green New Deal. It is very obvious it is a reference to Franklin Roosevelt's New Deal in the 1930s. The implication is that what the New Deal did for the Depression should be a model for the environment. There is just one great big problem: The New Deal in the 1930s didn't work. It didn't get us out of the Great Depression. The Depression didn't end until we entered World War II. Just like the original, the Green New Deal sounds like really bold action, but it is really a jumble of half-cocked policies that will dampen economic growth and will hurt jobs.

—Sen. Charles Grassley, The Green New Deal, Senate Floor, March 5, 2019

Historians say...

 

 

Once the Democrats decided to reference FDR’s New Deal in their latest attempt to combat global warming, it was only a matter of time before their opponents resurfaced the charge that the wide-ranging response to the Great Depression didn’t work. As Robert S. McElvaine points out in The Great Depression: America, 1929-1941, the Great Depression has become akin to the Holy Grail among economists. The need to claim or disclaim the unprecedented set of policies that comprise the New Deal is similarly urgent among politicians for obvious reasons: If it worked, maybe we should think big about public programs. If it didn’t, maybe the government should stay far away from the economy.

Senator Grassley’s statement about the New Deal is stark and definitive. Quite simply, in his mind, it did not work. The historians who responded to our request for input disagreed strongly, as a glance at their ratings will show, but they were not hesitant to discuss how the New Deal occasionally fell short.

We received responses from six historians and ratings from four of them. Their full responses appear below the summary. We’ve also included below two additional comments from Senator Grassley regarding the New Deal in order to reveal more of his argument and his thinking.

Browse and download sources recommended by the historians below from our Zotero library, or try our in-browser library.

Steel Industry by Howard Cook, fresco, 1936, Pittsburgh US Post Office and Courthouse

Summary

There’s a history to this history. The New Deal has long been a battleground and the source of broad, ahistorical thinking. Robert McElvaine quotes Senator Mitch McConnell in 2009, who held forth on how he was “reading history” and learning that “for sure” the “big spending programs” of the 1930s “did not work.” Eric Rauchway details the history of the debate in “New Deal Denialism,” published in 2010. The idea that the New Deal was a failure is one of the most pervasive and persistent historical beliefs on the political right.

But instead of arguing directly from the data or focusing on particular failures, many critics of the New Deal very strangely pivot to the assertion that the depression ended because of the war, not because of FDR’s economic, monetary, and social policies. In other words, massive government spending didn’t end the depression; it was really, really, really massive government spending that did it. This is baffling in its self-defeating logic. Several historians who responded took this up. Read more...

Robert F. Himmelberg, Professor of History, Emeritus, Fordham University

Senator Grassley wants to deflate the proponents of the “Green New Deal” who take advantage of the popular idea that the New Deal was a bold and effective counter to a grave national emergency. Both are generalizing too much, for the New Deal was neither a complete failure nor a roaring success.

The Senator is correct in saying heavy unemployment lingered until the war came, but neglects to note that GNP had returned to the 1929 level by early 1937.  Read more...

Anya Jabour, Regents Professor History, University of Montana, author of Sophonisba Breckinridge: Championing Women's Activism in Modern America (University of Illinois Press, 2019)

The problem with Senator Grassley’s comment is that his view is short-sighted. While it is admittedly difficult to credit the New Deal with “ending” the Great Depression, it is equally undeniable that the policies implemented then, in particular those legislated by the Social Security Act of 1935 and the Fair Labor Standards Act of 1938, profoundly reshaped the American economy and U.S. society by creating federally-funded programs to provide essential aid to the young, the elderly, and the disabled as well as by establishing groundbreaking workplace regulations, including a federal prohibition on child labor and a national minimum wage. Read more...

David M. Kennedy, Donald J. McLachlan Professor of History Emeritus, Stanford University

Rating: 1.6
Three thoughts:

1. The FDR administration managed to knock the unemployment rate down from 25% in 1932 to about 14% in 1936—a pretty impressive counter-punch to the greatest economic shock in modern history.

2. Counter-cyclical policy was poorly understood in the 1930s; the New Deal faced the task of inventing policy tools to cope with what history still regards as an unprecedentedly huge “Black Swan,” the sources and dynamics of which were and still are something of a mystery. Read more...

Robert S. McElvaine, Professor of History, Millsaps College, author or editor of five books on the era of the Great Depression and New Deal

Rating: 0.9

This ... is a gross misreading of history. What the fact that the Depression did not end until World War II shows is the exact opposite of what McConnell and Grassley argue: It proved that big spending does work, but FDR was unwilling to spend enough, until forced to do so by the war, to stimulate the economy sufficiently to end the Depression. It wasn’t that the policies of the New Deal didn’t work; it was that they were not taken far enough. New Deal policies did not dampen economic growth or hurt jobs. Trickle-down economics does that. Read more...

Kathryn Olmsted, University of California, Davis, author of Right Out of California: The 1930s and the Big Business Roots of Modern Conservatism

Rating: 0.3
The economic growth rates during the New Deal were phenomenal: about 9 percent a year, with the one exception of 1937. The reason 1937 is an exception is that Roosevelt cut back on spending that year. In other words, the recession of 1937 proved that the New Deal policies worked, and the president quickly returned to them. It’s true that unemployment rates did not return to pre-Depression levels until the war. But that’s only because the economy had shrunk so much under President Hoover. Read more...

Eric Rauchway, Professor of History, University of California, Davis; author of Winter War: Hoover, Roosevelt, and the First Clash over the New Deal (Basic Books, 2018)

Rating: 0.1

This statement combines one near-truth (while there’s no official way of marking an end to the Depression, unemployment did not return to pre-1929 lows until the U.S. entered World War II) with a number of major untruths.

The New Deal did work; economic recovery was rapid and effective by the measures we ordinarily use. During Franklin Roosevelt’s first two terms in office (excluding the recession of 1937-1938) GDP growth averaged around 8 or 9 percent per year, rates that are (the economist Christina Romer says) “spectacular, even for an economy pulling out of a severe depression.”  Read more...

 

 

 

Roundup Top 10!  

 

Why we need to address the demands of striking ride-hailing service drivers

by Mary Angelica Painter

History tells us that ignoring these grievances could lead to catastrophic consequences.

 

The St. Louis roots of 'Make America Great Again'

by Steven P. Miller and Warren Rosenblum

In pronouncing this version of Americanism, the Legion drew upon the worst of the nation’s wartime tendencies: rising xenophobia.

 

 

The key to lowering America’s high rates of maternal mortality

by Melissa Reynolds

Health-care providers have forgotten the central lesson of two millennia of gynecology.

 

 

Revise, revise, revise. That’s how history works

by Jeff Kolnick

Revisionism is not something to be feared or rejected, nor is it something to be celebrated or revered. It is what historians do, and we do it all the time.

 

 

How John and John Quincy Adams predicted the Age of Trump

by Carol Berkin

“The Problem of Democracy” offers a final warning to its readers who live in an era of “alternate truths” and blind devotion to charismatic leaders.

 

 

The Coming Generation War

by Niall Ferguson and Eyck Freymann

The Democrats are rapidly becoming the party of the young—and the consequences could be profound.

 

 

Sandra Bland Did Not Kill Herself

by Crystal A. deGregory

I did not watch the video. I do not need to. I know that Sandra Bland did not kill herself—a morally corrupt justice system did.

 

 

May Fourth, the Day That Changed China

by Jeffrey N. Wasserstrom

Protests in 1919 propelled the country toward modernity. One hundred years later, the warlord spirit is back in Beijing.

 

 

MLK's prescription for healing hate was embracing 'agape'

by Eli Merritt

King spoke about the Greek concept of agape, or brotherly love and compassion, a social concept he defined as “understanding, creative, redemptive good will for all men.”

 

 

Hamburgers Have Been Conscripted Into the Fight Over the Green New Deal. The History of American Beef Shows Why

by Joshua Specht

Hamburgers are the newest front in the culture wars.

 

 

The myths behind the push to resurrect child labor

by Oenone Kubie

Why is there a significant push to resurrect child labor?

 

 

Preventing an Israeli-Iran War

by Alon Ben-Meir

The EU is in a unique position to prevent the outbreak of a war between Israel and Iran that could engulf the Middle East in a conflict no one can win.

 

How Chinese History Restarted 100 Years Ago

 

 

On Sunday, May 4, 1919, some 3,000 students assembled at Tiananmen Square in Beijing to protest the Versailles Treaty that ended World War I. Most were from China’s premier institution of higher education, Peking University, but some 13 colleges were represented in all. It was a sunny spring day, and students chanted slogans, held up banners written in English, French and Chinese (some were written in the students’ blood) and handed out fliers. They marched to the legation quarters of the foreign ambassadors, where they were allowed to leave letters but not enter the district. A group of especially radical students marched on to the house of the “traitor,” cabinet minister Cao Rulin, and burned it down. (Cao escaped, but the Beijing regime’s ambassador to Japan was severely beaten.)

 

The students were angry that the victorious Allies would hand territory first seized by Germany to Japan instead of returning it to China. When China declared war on Germany in 1917, it sent 140,000 workers to Britain and France to keep the Allies’ factories open and supplies moving. Japan had declared war on Germany in 1914 and occupied the German concessions in Shandong. Cao Rulin was a logical target of the students’ ire, well known for his pro-Japanese activities. The protestors of May 4 condemned their own government, which they learned had made secret agreements with the Japanese, and pleaded for sympathy from the international community. Above all, they claimed to represent “educational circles” that would arouse China’s “industrial and commercial sectors” to take political action.

 

Though initially tolerant, the police ended the day by arresting dozens of students.  The arrests naturally provoked further demonstrations, and street demonstrations and class boycotts—already called the “May Fourth movement”—spread across China.  The whole summer was marked by furious student meetings, petition drives, and even boycotts of foreign goods.  By this time the students had won considerable support from professional associations, business groups, and workers.  Many merchants enthusiastically supported the anti-Japanese boycott; others were pressured to join.  Shanghai was virtually shut down in early June when 60,000 workers went on strike. The movement inspired the patriotism of Chinese communities abroad as well.  

 

China’s weak government, led by military men, had little legitimacy, and it sought to appease the students by firing the “three traitorous officials” whom the students had first targeted. In the end, China’s delegation in Paris refused to sign the Versailles Treaty. The Allies, however, never seriously considered returning Shandong to China. For them, the issue was a minor blip on the way to settling the Balkans, punishing Germany, establishing the League of Nations—and using Germany’s old territories across the Pacific to buy off the Japanese, whose request for a statement acknowledging the principle of racial equality they firmly refused.

 

In what sense did “May Fourth” restart Chinese history?  After all, the ideals that the students were preaching were hardly new.  Belief in democracy and science, hopes for human rights and national self-determination, and criticisms of the traditions of autocracy, patriarchy, and Confucianism had stirred educated youth (and some not so youthful) for a generation or more. Nor were street actions and boycotts new—American goods were the target of a boycott campaign in 1905 that was provoked by the anti-Chinese immigration laws and policies of the US.   

 

Nonetheless, it was the May Fourth movement that revived Chinese politics, which had been left moribund in the wake of the 1911 Revolution. “Politics” in this sense refers to a public realm of discourse and action created by people coming together—this was precisely how the May Fourth students pictured themselves, as opposed to the closed and stagnant world of China’s military-backed bureaucrats and assemblymen. The May Fourth movement inspired political action and made it possible. While the vast majority of the population in rural China was barely affected, at least immediately, China’s rapidly growing cities buzzed with new associations, journals, and social experiments such as communal living and work-study programs. A new sense of dedication and even self-sacrifice was palpable. Out of this ferment grew the two great disciplined, militarized, and revolutionary political parties of the twentieth century: the Nationalist Party (Guomindang) and the Chinese Communist Party (CCP).  

 

1919 marked a paradoxical moment, combining great hopes with enormous disillusion.  The first disillusion was the fact of the Great War itself.  Up until this point, several generations of Chinese had looked to the West as a model for China’s own reforms.  They combined a hatred of the foreign incursions against China since the Opium Wars of the 1840s with growing admiration of Western civilization. By the early years of the twentieth century, thousands of Chinese had studied in and traveled through America and Europe, as well as Japan, which seemed to offer a model of Western-style modernization close at hand.  But the “Europe War” that broke out in 1914 dragged on and on.  Chinese readers kept up with the latest developments in weaponry: machine guns, airplanes, submarines, and poison gas.  All this suggested that Western civilization was morally bankrupt.  Few politically aware Chinese at this point thought that China should support either side in a contest between nations that had forced the “unequal treaties” on China.

 

At the same time, China was descending into a downward spiral of political breakdown and violence. In 1915 Japan issued its “Twenty-one Demands,” insisting on greater privileges just as the European powers were unable to counter them. President Yuan Shikai tried to make himself emperor, at which point many of his military supporters slunk away. Yuan died in the midst of the brouhaha that he had created, and regional military commanders disowned any fealty to Beijing. There still remained a kind of rump central government, and the economy continued to function, a working class emerged, new schools proliferated, universities grew, and at least in the foreign concessions order was maintained. But rural banditry flourished and the “warlord era,” with its endless, inconclusive battles, had dawned.

 

No wonder Chinese considered that the 1911 Revolution had failed. Some said that the political failure stemmed from a deeper cultural backwardness: people debated whether the Chinese people needed a long period of education and how much of the past should be discarded. Meanwhile, “politics” had come to refer to the machinations of small groups of officials and military men. Into this malaise came the stirring promises of Woodrow Wilson.  

 

For Chinese, the ideal of “democracy” was at least as important as Wilson’s talk of “national self-determination.”  To join the war against Germany became a righteous cause. And Chinese greeted the Allied victory of 1918 joyously.  They saw a victory of light over dark, of civilization over militarism, of cosmopolitanism and open-mindedness over nationalism and racism.  Some thought that the Bolshevik revolution in Russia and a coming Communist Revolution in Germany represented the logical culmination of popular democracy.  Such language was not propaganda but reflected a genuine sense that history had shifted.  Even more sober observers thought that the defeat of Germany at least represented a triumph of international law and a shot across the bow of imperialism.

 

Such hopes were not limited to China, and reflected a utopian moment partly rooted in the ever-wilder promises of President Wilson, and partly rooted in local conditions.  In China’s case, the old ways had been under challenge for a generation or more.  By 1919 the cosmology of Heaven and cosmic forces explicated in Confucian texts had clearly collapsed; the foundations of the emperorship had crumbled beyond repair; and many young people had concluded that their fathers’ power over their fates was intolerable.  Culture, society, and the political realm were all in enormous flux; morality had to be rethought.  This opened the way for the May Fourth generation to turn to ideals of intellectual freedom and individualism, a national vernacular and new literature, and democratic institutions to strengthen and unify the country.  

 

The impact of the racism and colonialism enshrined in the Versailles Treaty can thus be imagined. The “West” could no longer serve as a model for China’s future development. True, a few intellectuals kept their faith in moderate reform, but more turned in other directions. The revival of Confucianism had appeal, but most of the younger generation turned toward more radical routes. Looking at the student demonstrators of 1919, we can see a combination of heightened anti-imperialism and genuine cosmopolitanism. From the vantage points of Beijing and Shanghai in particular, the world now offered new revolutionary hopes. Politics had returned: an open, tumultuous politics of studies and the streets. Students positioned themselves as devoted patriots, not seeking advantages for themselves but simply to strengthen China and awaken its mass of citizens. At the same time, they were citizens of the world familiar with Dickens, Gogol, Rousseau, John Stuart Mill, Rabindranath Tagore, and John Dewey, not to mention Arthur Conan Doyle and Alexandre Dumas. They believed in universal values—freedom, equality, justice, and socialism. The meaning of the war, then, lay not in a vacuous Wilsonianism or the effect of the Versailles Treaty, but in the German and Russian Revolutions. With the Bolsheviks’ victory in Russia, Leninism offered an anti-colonial revolutionary alternative to Wilson’s empty promises. The new Soviet Union offered to abandon the old Czarist claims on Chinese territory, and the re-formed Communist International prepared to send its missionaries and organizers to China. 

 

With allowances for Communist jargon, Mao Zedong was right when, in 1939 on May Fourth’s twentieth anniversary and in the midst of Japan’s invasion of China, he looked back from his isolated, precarious perch far to the northwest, to call the May Fourth movement “a new stage in China’s bourgeois-democratic revolution against imperialism and feudalism.” Mao’s formula long shaped Chinese understandings of the origins of Chinese modernity. Historians today do not accept Mao’s tendentious equation of May Fourth with Communism, but few historians would deny that May Fourth marked a new stage of some kind. It did not end warlordism, nor did it provide new standards to judge legitimacy—democratic norms had been developing since the turn of the twentieth century. Nor was May Fourth a watershed in Chinese history, if only because we can now see how it was embedded in a longer set of revolutions across the twentieth century. But it was much more than a simple reaction against the racist imperialism embodied in the Versailles Treaty. It was a culmination of intellectual, social, and institutional changes developing in China since the 1890s, and it led to a new politics. If it did not magically create the CCP, it did reflect the new muscle of the working class as well as student power. And it did define the new political norms that allowed China’s first Communists to find a foothold amid the flourishing utopianisms of the day.

 

The present leader of China, Xi Jinping, described “May Fourth” as a student movement based on patriotism and revolutionary fervor in his celebratory speech on April 30. He was not wrong, but he neglected the key slogan that became associated with May Fourth: “science and democracy.” This meant a commitment to rational, secular thinking combined with a commitment to popular sovereignty and an open society. To this ideal, the May Fourth movement added the energies of an aroused and angry nationalism—a nationalism that for some justified violence even as it remained open to progressive currents from around the world and opposed to oppression in all its forms. Much of the “May Fourth spirit,” then, is seen by China’s current leaders as a threat to their authority. This year, as China also marks the seventieth anniversary of the founding of the People’s Republic, some Chinese will also mark the thirtieth anniversary of the Tiananmen Square democracy movement. The political restart of May Fourth in 1919 is a direct ancestor to both these later events, and even today May Fourth still stands as a model of youth-led social movements.

A Bloody Mary Bar and a Barroom Full of Fun  

I am a theater critic. In my 45 years of criticism, I have enjoyed the plays of Eugene O’Neill, William Shakespeare and Tennessee Williams, but none of them can match the enjoyment, the sheer fun, that I had last Saturday when I finally caught up with the most hilarious and raucous play in New York, Imbible: Day Drinking, a heady Off Broadway show about the history of drinking at brunch, now celebrating its second anniversary.

The musical, at the New World Stages theater complex at Eighth Avenue and W. 50th Street, New York, is the tuneful history of brunch, a time-honored American dining tradition that isn’t American at all. The story includes the history of the Bloody Mary, Irish Coffee, Champagne and the Bellini, brunch drinks, told in spirited, lighthearted songs presented by a seasoned and deliciously giddy cast – Bobby Eddy, Nick Barakos, Devon Meddock, Emily Ott, Megan Callahan and Devyn White. They cavort from one end of the barroom to another. They sing, they dance, they tell jokes, they adopt foreign accents, they don one wild costume after another and through it all tell you everything there is to know about brunch and its drinks, a history that is just fascinating, ice or no ice.

Their story goes all the way back to the ancient Greeks, the Turks of the Ottoman Empire, Napoleon Bonaparte, tropical Caribbean islands and every bar in San Francisco (and there are a lot of bars in San Francisco).

Imbible: Day Drinking is a serious (well…), nicely structured musical that runs along on sharp dialogue, witty jokes and marvelous human caricatures, and has as many sight gags as a Marx Brothers movie. There is never a moment without a good laugh. It is as much fun as finding the key to a good wine cellar late on a Saturday night.

The play is produced in an actual bar, the Green Room, which serves customers at the theater complex on Sundays. It is stocked with every bottle of liquor you could think of, and some you could not think of. The actual bar is the length of the room and faces a room full of several dozen tables. Each patron at the play is entitled to three drinks served in the show and all the pastries you can eat off trays on top of the bar (you’d better rush for these; they go fast).

The purpose of the writer, Anthony Caporale, also the co-director, is to tell the history of brunch and its drinks, and he never tries to push the idea of excess drinking (a warning from me – if you must drink, drink moderately). You do not see drunks, men and women passed out or anyone asking for a cold one. It is a straightforward musical written to please, and it certainly does.

The brunch, a time-honored staple of American dining, was invented sometime in the 1890s by British hunters who chased terrified foxes with their hounds across the English countryside all morning. Done, with hours in the saddle behind them, they all sat down for a late breakfast around 10 or 11 a.m. They called it brunch and the name stuck. Brunch quickly made its way to America, where restaurants began to serve it, particularly on Sundays. You, me, everybody: we all go.

The play starts with the history of the Bloody Mary and then, in chronological order, moves to Irish Coffee, Champagne and the Bellini. The show’s actors explain to you how to make them and invite you to come up to the bar and make your own Bloody Mary.

The actual story begins with a man sound asleep on the bar, wrapped in blankets, who arises at 9 a.m., a big smile on his face, ready for his weekly Sunday brunch. The story, with humorous music and lyrics by Josh Ehrlich, flies through the barroom from there, with actors dressed as French generals, cave women, doctors, bartenders and a slinking, seductive woman or two. They pop up and down behind the bar, dance through the audience and jump out from behind pillars, all with a big wink.

Everyone is aghast when an actor says that sparkling wine is the same thing as champagne, and you get a very funny lesson on how the French invented Champagne, and Irish coffee, too, on the island of Martinique. The Americans took up coffee drinking after the fabled Boston Tea Party prior to the American Revolution and made it famous (today, an actor says with great satisfaction, Americans drink 2.4 billion cups of coffee each day, and New Yorkers drink at seven times the rate of the average American. New York is the city that never sleeps? That’s why).

There are some funny stories about the invention of drinks and the show winds up with the story of the Bellini, the smooth, peachy drink, that was invented by the Cipriani family and first served at the famous Harry’s Bar in Venice, Italy.

The show ends with the ensemble singing the delightful Let’s Do Brunch, while everyone hoists their Bellinis high into the air. The play runs in conjunction with the theater group’s other show, The Spirited History of Drinking.

You want to have a good time in the theater, learn a lot about the history of brunch and drinking, and enjoy a wonderful Irish Coffee, too? This is the place. Raise your glasses high!

PRODUCTION: Imbible: Day Drinking is directed by Anthony Caporale and Nicole DiMattei. Choreography is by Ms. DiMattei. Musical Director: Robbie Cowan. It has an open-ended run.

Run, Hide, Fight If You Must…Now What?

 

I’m on the faculty of the University of North Carolina Charlotte, a semi-retired historian teaching two courses that meet two days a week. It’s a doable schedule for an old guy.

Tuesday was my last day on campus for the spring semester, and it was supposed to be an uneventful day. I wrapped up one class in the morning and gave an exam in another during the late afternoon. Then I headed home on the light rail.

At home, I made myself a martini and sat down on my balcony to enjoy the evening and anticipate the coming weeks of summer inactivity. That’s when my phone lit up with alerts. Emergency messages from the university told people still on campus to “run, hide and, as a last resort, fight.”

One building away from where I gave that last exam, a student entered a classroom and began shooting.  He killed two and seriously wounded four more.  The unthinkable, the thing that always happens somewhere else, had happened here.

For the next several hours, I sat glued to the television, moving among various internet news sites and double-checking Twitter for more information. Students, faculty and staff still on campus were locked down. The light rail train, which I had just taken home, was halted several stops away from campus. My colleagues and students were hiding in darkened rooms, awaiting the arrival of police to clear the campus, classroom by classroom. Others were running for safety.

Media pictures depicted where I had walked just a few hours before.  I watched people being marched away from the scene, hands in the air.  Would I see my students?  Would I see my colleagues?

As is common during these horrible events, there were immediate calls for “thoughts and prayers,” but, as a Vietnam Era Marine and a former police officer, I have a very cynical view of such comments. I understand the destructive power of firearms.

On the TV screen, I watched as the alleged shooter was taken into custody.  I waited for the names of the victims.  Were they students?  Were they my students?  Were they faculty or staff, people I knew?

I could not help but remember the people I have known who were victims of violent and senseless crimes such as this. A former business colleague was killed by the Unabomber. A high school girlfriend and her family were murdered because they opened the door to a stranger one evening. Like many Americans, I am sickened by the never-ending slaughter that permeates our society and ask myself why we are unable to stop it.

As the evening wore on, I felt numb, as if the events of the day had not happened, as if they were somewhere far away, not part of my world. 

When it was reported that the shooter was a student, a history major, I immediately checked the rolls of my past courses, glad to discover that he had not been in one of my classes.  And, as each victim’s name was released, I checked again. 

None of the people directly involved were my students or colleagues, but in a university community of approximately 30,000 the odds were that I wouldn’t know them.  I understood, however, that we had all been touched in some way by this evil deed.

On social media, I began to see comments from colleagues. They expressed the full spectrum of emotion, from deep sadness to intense anger. As for myself, I was confronted with the reality that, as so many times in the past, nothing will change. This was not the first university shooting, and it likely will not be the last.

Trauma such as this changes the lives of everyone connected, even those of us who watched from a safe distance.  Perhaps, as the number of individuals touched by such violence grows, we will ultimately be able to build the necessary mass to challenge the special interests that fight all attempts at reasonable compromise and pretend that the only way to fight violence is with more violence.

Universities should be a safe haven, providing a transition from childhood to adulthood. Universities should not be blood soaked killing fields.

James Byrd, Jr., John William King, and the History of American Lynching

A group of African Americans marching near the Capitol building in Washington DC, to protest against the lynching of four African Americans in Georgia.

 

 

In February 1999, John William King – who was executed in Huntsville, Texas, on April 24, 2019 – became the first white man in modern Texas history to be sentenced to death for killing a black person. How that black person, James Byrd, Jr., died was no mystery. Three self-proclaimed white supremacists had drawn up a plan to start a race war while they were in prison. These men chained Byrd to the back of their pickup truck and dragged him for a mile and a half until his head and right arm were torn from his body by a concrete culvert on Huff Creek Road in Jasper County.

 

What proved to be a mystery in the aftermath of this gruesome event was what to call this particular crime. Most mainstream reporters and columnists insisted that it was not a “lynching.” Many would not accept that an event that clearly met Congress’ definition of lynching could happen at the end of the twentieth century. The books published between 2002 and 2004 on what happened in Jasper, Texas, called it “a hate crime,” “the dragging,” and “the murder,” but none would call it a lynching. Columnists in black newspapers and African American intellectuals who wrote for or were quoted in the mainstream press were nearly the only people to insist James Byrd’s death was indeed a lynching.

 

The weekend James Byrd, Jr., was lynched was also the first anniversary of President Bill Clinton’s “dialogue on race.”  It appeared that this dialogue was scripted from the same debates that had roiled the nation a half-century earlier when white anti-lynching advocates had begun to yearn for “the end of lynching,” and regularly declared something to be “the last American lynching.”  It was more important, in 1940 and 1998, to deny the existence of a practice that shamed a nation than to face it.  

 

Thanks to several groundbreaking contemporary historians and organizations like the Equal Justice Initiative, we now have a fuller account of lynching in the United States – including its transformation over time and the extent to which it served and fueled nationalist, populist, racist, and nativist purposes.  What we perhaps equally need is an understanding of the historical role of the discourse of lynching and how it continues to shape America’s enduring “dialogue on race.”  

 

The discourse of lynching was produced by Southern newspaper editors, politicians, and mob leaders in the last two decades of the nineteenth century. It claimed that white vigilantes were as inevitably driven by principles of chivalry to lynch as African Americans were compulsively driven by their sexual lusts to rape. It was a discourse that represented a momentous shift from the rationale and justification for enslavement: those who had been happy-go-lucky Sambos in the plantation romance the South fed itself had now become ravening beasts in the new genre of what Jacquelyn Hall memorably called “a kind of folk pornography in the Bible Belt.”  It was a discourse African American intellectuals and organizations fought to dispel for the next half-century, including Frederick Douglass, Ida B. Wells, the Association of Southern Women for the Prevention of Lynching, Tuskegee, and the National Association for the Advancement of Colored People.

 

Lynching discourse could be used and abused, then and since.  Dixiecrat Senators consistently used it to deny the passage of federal anti-lynching bills.  Some opportunistic African American public figures claimed to be victims of it metaphorically, as both Supreme Court nominee Clarence Thomas and embattled Detroit mayor Kwame Kilpatrick did.  

 

What is perhaps most noxious and subtle about that discourse is what it sometimes explicitly, and most often implicitly, claimed about African American women. Under enslavement, black women were routinely and regularly raped, even as slave codes insisted that this was legally and logically impossible. In the postwar period, some slavery apologists claimed black women brought it on themselves. For instance, in her 1906 Lost Cause paean, Dixie After the War, Myrta Lockett Avary claimed that the “heaviest part of the white racial burden” was “the African woman, of strong sexual instincts and devoid of sexual conscience, at the white man’s door, in the white man’s dwelling.” In the postbellum lynching discourse, black women were cast as the source of black men’s sexual depravity: the witchery black enslaved women had used to drive white slave masters to rape them, black freedwomen were now applying to their black husbands, which insatiably drove them to rape white women. The “average plantation negro does not consider rape to be a very heinous crime,” argued Philip A. Bruce, because he “is so accustomed to the wantonness of the women of his own race.” This was a discourse that made victims into criminals, even as the ignored and denied crimes against the bodies and souls of women of African descent produced and reproduced the labor force that made America’s economy possible. The antebellum discourses that made them “unrapeable,” and the postbellum one that made them the ultimate source of danger to white women, continue to operate in insidious and hateful ways.

 

Many noted that, of the over forty accusations of sexual harassment, assault, and rape against Harvey Weinstein, the only charge he felt compelled to contest directly and specifically was that of the only African American woman accuser. Any number of factors could explain this anomaly, one supposes, but those of us who believe that there is an enduring historical force in those discourses, practices, and values that a society adopts and transforms over time suspect that one such factor is the historical legacy of that discourse. It has at different times represented black women as “unrapeable” and sexually available – at the white man’s door, in their dwelling, as Avary put it, and, apparently, on their casting couches.

 

It is that same discourse, in the end, that inspired John William King and his two accomplices to commit the lynching they did. One of those accomplices, Shawn Berry, told Dan Rather in an interview prior to his trial that King had proclaimed in the course of their crime, “That’s what they used to do when a black man got caught messing around with a white woman, in the old days.” There was no white woman involved in James Byrd’s life, nor did King or his accomplices believe there to be. They were inspired by a discourse that forced them to misperceive the reality they inhabited. That is what a hegemonic discourse, an ideology, an enduring historical narrative people tell themselves, does. It makes us deny what we see. Maybe we should strive to see what some of us choose to deny, and what others of us – who were victimized by the practice, and continue to be victimized by the discourse – tell us is really there.

 

Maybe then we could see what the discrepancy between the number of white people on death row for the murder of black people and the number of black people on death row for the murder of whites can tell us (in the dismal way that only capital punishment statistics can) about whose lives do matter. 

A Tale of Two Suffragists: Hazel Hunkins and Maud Wood Park

 

 

Two suffragists arrived in Washington, D.C. in late 1916, one from Billings, Montana, and the other from Boston. Born twenty years apart, they spent the next three years in the nation’s capital working for the same goal by radically different means. If by chance their paths had crossed, they probably would not have spoken to each other, so deeply did they identify with the strategies of their rival organizations. But they were equally passionate about winning the vote.

 

Hazel Hunkins was the younger of the pair.  Born in 1890 in Colorado but raised in Montana, she was a proud graduate of Vassar who was frustrated when she couldn’t pursue a career in chemistry. Temporarily living at home, she met a field organizer for the upstart National Woman’s Party (NWP), which Alice Paul had just founded to push for a federal suffrage amendment.  Hunkins became an instant convert to the feminist cause.  After crisscrossing the West as a paid organizer for the NWP, she moved to Washington to oversee its organizers in the field.   When Alice Paul sent out “Silent Sentinels” to picket the White House in January 1917, Hazel Hunkins was one of the most stalwart volunteers.  She was twenty-six years old.  

 

Maud Wood Park would never have done anything as radical as picketing the White House but her commitment to women’s suffrage was just as firm. An 1898 graduate of Radcliffe College, she was recruited to join the suffrage movement in college by Alice Stone Blackwell, the only child of Lucy Stone, whose 1855 refusal to take her husband’s name in marriage spawned the term “Lucy Stoners.” Temperamentally committed to working within the system, Park served on the executive board of the Massachusetts Woman Suffrage Association, and she helped found the Boston Equal Suffrage Association for Good Government and the College Equal Suffrage League. In 1916 Carrie Chapman Catt recruited her to come to Washington to become the chief lobbyist of the National American Woman Suffrage Association (NAWSA), the oldest and largest mainstream suffrage organization in the country.

 

The Congressional Committee that Maud Wood Park soon headed earned the nickname “Front Door Lobby” because, in the words of one journalist, they “never used backstairs methods.” She methodically kept tabs on the 96 senators and 435 members of the House of Representatives who held the fate of the Nineteenth Amendment in their hands. This lobbying lacked the glamour and excitement of marching in a suffrage parade or participating in an open-air meeting, but it was absolutely crucial to the ultimate success of the movement.  In January 1918, the House passed the so-called Susan B. Anthony Amendment but the Senate would not follow suit until June 4, 1919.  Neither victory would have happened without the deliberate and scrupulously non-partisan efforts of Maud Wood Park.

 

 

Hazel Hunkins chose a different path even if she was after the same goal: she turned to militant action to force Woodrow Wilson and other elected officials to support the federal amendment. Hunkins was arrested on at least three occasions, mainly on trumped up charges of disorderly conduct or obstructing traffic.  When she was imprisoned after protesting at Lafayette Square, across from the White House, Hunkins and her fellow suffragists immediately embarked on a hunger strike to highlight the terrible conditions at the local jail. Weakened not just by hunger but by contaminated water, the suffragists were released after five days.  Hunkins went home in an ambulance.  She was arrested one more time in January 1919 for burning Woodrow Wilson’s speeches in “watchfires for freedom” across from the White House. That was her last militant act. 

 

There was no love lost between the rival wings of the suffrage movement, but it is too simplistic to reduce the clash between the NWP and NAWSA to a generational dispute between brash youngsters committed to militancy and “old fogeys” dedicated to working within the system.  In the final decades of suffrage activism, younger women flocked to NAWSA, swelling its ranks with new recruits.  And even though the NWP styled itself as “the young are at the gates,”  one of the first pickets to be arrested was Lavinia Dock, who was almost sixty years old. Dock spent a total of forty-three days in jail for the cause.  

 

Despite their deep-seated differences over tactics and strategy, there were some surprising commonalities between the two groups.  The most striking was how both wings of the suffrage movement provided a welcoming space for a range of living and working arrangements that definitely fell outside the bounds of heteronormativity. At the height of her suffrage militancy, Hunkins began an affair with a married man whose wife refused to give him a divorce.  Undaunted, the couple moved to England in 1920, where they had four children before finally marrying in 1930.  Maud Wood Park married an architect while she was a student at Radcliffe, but she kept that marriage secret so as not to interfere with her studies. When she was widowed, she kept her second marriage secret as well, reasoning that her career would be taken more seriously if she wasn’t suspected of neglecting her husband.  The deeper we dig, the more examples we find of suffragists young and old leading far more unconventional lives than their somewhat dour public reputations might suggest.

 

Both Hazel Hunkins and Maud Wood Park enjoyed significant careers after the Nineteenth Amendment was passed.  Park served as the first president of the National League of Women Voters and later was instrumental in the establishment of the Woman’s Rights Collection at Radcliffe in the 1940s.  Hazel Hunkins-Hallinan (as she was now known) joined the Six Point Group, the leading British feminist organization, and served as its chair in the 1950s and 1960s.  After speaking at Alice Paul’s memorial service in 1977, she took part in a march in support of the Equal Rights Amendment organized by the National Organization for Women.  Once a feminist, always a feminist. 

 

Thousands of women took different paths and pursued multiple strategies to win the goal of securing the right to vote. Their individual acts of courage and persistence, their quiet determination and flashes of militancy put human faces on the collective drama of social change.  As we count down to the centennial of the Nineteenth Amendment, these personal stories remind us that the road to women’s full participation in public life has been a long and contested one, sometimes even pitting women against each other as they fought for the goal of equality.  The women’s suffrage movement was stronger because of this diversity of approaches.  The split may even have hastened its ultimate success.  

 

Why the Middle East Studies Association Should Not Defend Omar Barghouti

 

Last month, the Middle East Studies Association’s (MESA) Committee on Academic Freedom (CAF) published a letter to Secretary of State Mike Pompeo denouncing the U.S. refusal to admit Omar Barghouti, the founder of the Boycott, Divestment and Sanctions (BDS) movement. The professional organization representing academics who specialize in the Middle East regularly finds fault with Israeli policies and politicians and whitewashes Palestinian terrorism.

Amazingly, Barghouti lives in Israel, not in Gaza, Nablus, or Ramallah, as one might expect from a man who describes Israel as an apartheid state. He was refused entrance to the U.S. on April 10 when immigration officials prevented him from boarding a flight at Ben-Gurion airport. Barghouti was scheduled to speak at several American universities, be interviewed by journalist Peter Beinart and Temple University professor Marc Lamont Hill, and attend his daughter’s wedding. 

In its CAF letter, MESA describes Barghouti’s denial of entry as “an act of political censorship” and a “politically motivated attack on the right of Americans to hear and engage with the full range of viewpoints on the Israeli-Palestinian conflict.” It quotes Amnesty International’s mischaracterization of Barghouti as a “human rights defender” and contends that the U.S. and Israel are engaged in a deliberate effort to undermine “the principles of academic freedom.” 

MESA portrays Barghouti as a dissident who has been unfairly treated simply “because the U.S. government does not like his political views.” But Barghouti is much more than an outspoken activist whose opinions conflict with U.S. policy. He presides over an intricate web of anti-Israel groups. They offer the illusion of legitimate political organizations and charities, but they provide cover for terrorist organizations.

Barghouti founded the Palestinian Campaign for the Academic and Cultural Boycott of Israel (PACBI) in 2004. It spawned a franchise model of campus organizations collectively known as the Boycott, Divestment and Sanctions (BDS) campaign. In 2007, Barghouti founded the BDS National Committee (BNC). Tablet magazine calls it “the main West Bank and Gaza-based cohort advocating for sanctions against Israel.” But there’s more to it than advocating sanctions. The BNC lists a number of supporting “Unions, Associations, Campaigns,” one of which is called the Council of National and Islamic Forces in Palestine/Palestinian National and Islamic Forces (PNIF). The PNIF lists among its member organizations five Palestinian terrorist organizations: Hamas; the Popular Front for the Liberation of Palestine (PFLP); the Popular Front-General Command (PFLP-GC); the Palestine Liberation Front; and Palestinian Islamic Jihad (PIJ). This is the face that Omar Barghouti hides and MESA ignores.

In its spirited defense of Saint Omar, MESA fails to mention these salient facts. This is hardly a surprise. As I’ve argued before, MESA has a long history of omitting inconvenient truths in its defense of Palestinian academe. 

So, what would Omar Barghouti say on his MESA-approved speaking tour? If his recent appearances in academic settings are any indication, it would be a combination of conspiracy theories, anti-Semitic canards, and variations on “the right of return” rhetoric.

Barghouti is an accomplished trafficker in conspiracy theories. On February 24, 2015, he told a Portland State University audience that “The U.S. and Israel are benefitting a lot from this ISIS phenomenon,” and that the U.S. and Israel “created Taliban, they created al-Qaeda . . . so why not ISIS?”

Although he denies it, Barghouti’s anti-Semitism is indisputable. On January 11, 2014, he regaled an audience at Wayne State University by telling them that “Israel has a hold on Congress, has enormous influence on Congress...Congress is bought and paid for by the Israel lobby.”   

The destruction of Israel is high on Barghouti’s wish list. The influx of 5 to 7 million Palestinian refugees would mean the end of Israel. As one of the authors of the “One State Declaration” (2007), he has long pined for a Palestine in which “Jews will be a minority.” He justifies this fantasy with jargon-filled hyperbole about Israeli “ethnic cleansing” and the Jewish colonization of Palestine.

In Spain, a judge has admitted a criminal complaint against Barghouti lodged by the Madrid-based group ACOM which has called attention to Barghouti’s claims that “BDS demands would result in the destruction of Israel.”

MESA’s claim that the administration should not “ban people from entering the United States on ideological grounds” is a sham. If a similar media tour were scheduled for a “white nationalist” claiming that non-whites control Congress, urging the removal of non-whites from the U.S., and linking non-whites with terrorist groups, MESA would be singing a different tune. Yet MESA supports Barghouti who claims that Jews control Congress, advocates for the removal of Jews from Israel, and links the U.S. and Israel with the creation of al-Qaeda, the Taliban, and ISIS. 

Omar Barghouti should not be let into the U.S., now or ever. A thorough investigation of his network of organizations should earn him a place on the U.S. Department of Treasury’s Specially Designated Nationals and Blocked Persons List. Shame on MESA for supporting his subterfuge under the guise of academic freedom.

Denying Omar Barghouti Admittance to the U.S. Is a Free Speech Issue

 

A. J. Caschetta claims that the Middle East Studies Association “whitewashes terrorism,” citing a letter that its Committee on Academic Freedom recently sent to Secretary of State Pompeo protesting the Trump administration’s decision to deny Omar Barghouti, a Palestinian leader of the BDS movement, entry to the United States. This is just one of the baseless and tendentious assertions in his post, which repeatedly conflates criticism of Israel and of Zionism with anti-Semitism. It should come as no surprise that Mr. Caschetta is associated with the Campus Watch website which, as many HNN readers no doubt know, has since 2001 repeatedly defamed scholars of the Middle East as enemies of Israel, anti-Semites and/or terrorist sympathizers because of their views on Israeli policies toward the Palestinians or on US policy in the region.

What is really at stake here? In fact, MESA has no official position regarding BDS. However, in keeping with the principles of academic freedom, MESA is committed to defending the right of faculty and students at this country’s institutions of higher education to speak, teach and advocate about the Israeli-Palestinian conflict as they see fit. This is a right that is currently under attack, not only by outfits like Campus Watch but also by state governments, members of Congress and administration officials who seem determined to suppress, or even criminalize, one particular political position – support for BDS – in academia and beyond. 

MESA’s Committee on Academic Freedom also sees the denial of entry to Omar Barghouti as a free speech issue: we do not believe that it is acceptable for the U.S. government to ban people from entering the United States on ideological grounds, thereby preventing Americans from hearing views the government dislikes. That’s what we believe this case is about. Mr. Caschetta justifies the Trump administration’s action with respect to Mr. Barghouti by comparing him to an openly racist white nationalist who might legitimately be denied entry. This analogy is absurd: whether or not one agrees with the goals or strategy of the BDS movement, the methods it advocates are nonviolent and its core demand is that international human rights principles be applied to the Israeli-Palestinian conflict. 

HNN readers should judge for themselves by viewing the lengthy and substantive conversation that journalist Peter Beinart (who is not a supporter of BDS) conducted with Omar Barghouti – by video link rather than in person as originally planned. That was the real point of MESA’s letter: Americans should be able to hear Mr. Barghouti’s views and decide for themselves what they think.

Capitalism? Socialism? How about Just a Fair and Moral Economy?

 

 

In early 2019, Donald Trump warned against socialism: it “promises unity, but it delivers hatred and it delivers division. Socialism promises a better future, but it always returns to the darkest chapters of the past. That never fails. It always happens. Socialism is a sad and discredited ideology rooted in the total ignorance of history and human nature.” 

 

His campaign claimed that “Bernie Sanders has already won the debate in the Democrat primary, because every candidate is embracing his brand of socialism.” But in February and March 2019, numerous candidates such as Beto O’Rourke, Kamala Harris, Elizabeth Warren, and John Hickenlooper scrambled to answer whether they favored capitalism or socialism. If I were any of them, I would simply say, “I’m for a fair and moral economy, call it what you will. Politics is too full of labeling.”  My rationale for such a statement is that both capitalism and socialism have too much historical baggage. 

 

Socialism and socialist have long been scare words conservatives have hurled at opponents. Social Security? Socialist! Medicare? Socialist! In the 1960s when Medicare was being debated, the American Medical Association (AMA) had a leading actor speak out against it on a record entitled “Ronald Reagan Speaks Out Against Socialized Medicine.”

 

Bernie Sanders might admit to being a “democratic socialist,” but his critics like to ignore the difference between this type of socialism, common in Europe, and the kind that existed in the Union of Soviet Socialist Republics (USSR) or in modern Venezuela. In his State of the Union speech, President Trump claimed that Venezuelan “socialist policies have turned that nation from being the wealthiest in South America into a state of abject poverty and despair,” and that “in the United States, we are alarmed by the new calls to adopt socialism in our country. America was founded on liberty and independence—not government coercion, domination, and control. We are born free and we will stay free. Tonight, we renew our resolve that America will never be a socialist country.”

 

But capitalism also has become increasingly distasteful to many Americans, especially younger ones, who associate it with too many negative connotations (see here and here).  Its past includes the exploitation of labor (including children), as Karl Marx and Charles Dickens portrayed; the U. S. Gilded Age and era of Robber Barons such as J.P. Morgan, Andrew Carnegie, John D. Rockefeller, and railway tycoon Jay Gould;  and the refusal of governments to interfere in the “free market” to alleviate suffering in times such as the Irish famine of the late 1840s or the Great Depression—in 1932, President Hoover thought that “Federal aid would be a disservice to the unemployed.” 

 

There have also been the periodic depictions of greedy capitalists (e.g., the 1987 film Wall Street, in which Michael Douglas’s character, Gordon Gekko, proclaims, “Greed, in all of its forms—greed for life, for money, for love, knowledge—has marked the upward surge of mankind”); the rapacity of corporations such as Wells Fargo, which created fake accounts, and Purdue Pharma, which put money-making before health and sparked the opioid crisis; and the present incarnation of ugly capitalism, the author of the Art of the Deal and our current president, Donald Trump.

 

One of the main problems with unrestrained capitalism—as sociologist Daniel Bell indicated in his The Cultural Contradictions of Capitalism: 20th Anniversary Edition—is that it has “no moral or transcendental ethic.” As conservative economist Milton Friedman wrote in 1970, “The social responsibility of business is to increase its profits.”

 

There was one period in U. S. history that attempted to provide the moral ethic that capitalism ignored—the Progressive Era (1890-1914). One historian characterized the progressive movement of the time as an attempt “to limit the socially destructive effects of morally unhindered capitalism, to extract from those [capitalist] markets the tasks they had demonstrably bungled, to counterbalance the markets’ atomizing social effects with a counter calculus of the public weal [well-being].” This movement did not attempt to overthrow or replace capitalism but to constrain and supplement it in order to ensure that it served the public good.

 

Although Progressivism succeeded in some ways, its forward momentum was stalled by World War I and 12 years of Republican presidents from early 1921 to early 1933. The presidency of Franklin Roosevelt (1933-1945) and his New Deal economic and social policies renewed the progressive effort. Despite ebbs and flows in the continued advancement of progressive economics after World War II, the U. S. State Department in 2001 still declared that though “the United States is often described as a ‘capitalist’ economy,” it “is perhaps better described as a ‘mixed’ economy, with government playing an important role along with private enterprise.”

 

To further such a mixed economy, Nobel Prize-winning economist Joseph Stiglitz, in his new book People, Power and Profits: Progressive Capitalism for an Age of Discontent, recommends “progressive capitalism.” In an interview, he explains that he believes in a market economy, but also in government regulation. And his term suggests that, like the progressives of the Progressive Era, he believes that our economy should recognize a “moral or transcendental ethic”—seeking the common good. (Such a goal has also been recommended by Pope Francis, who has criticized modern-day capitalism as “unjust at its root.”)

 

Stiglitz’s term “progressive capitalism,” however, still has one main drawback: it fails to escape the same trap that besets “democratic socialism.” No matter what adjective you put before a baggage-laden word (democratic before socialism or progressive before capitalism), the main word is still too emotionally tinged.

 

“A fair and moral economy,” however, is easy to defend while still having historical roots. “You’re against it? What do you want, an unfair and immoral economy?” In President Harry Truman’s 1949 State of the Union address he declared, “Every segment of our population and every individual has a right to expect from our Government a fair deal.” As part of such a deal he proposed universal health insurance. (“We must spare no effort to raise the general level of health in this country. In a nation as rich as ours, it is a shocking fact that tens of millions lack adequate medical care. We are short of doctors, hospitals, nurses. We must remedy these shortages. Moreover, we need—and we must have without further delay—a system of prepaid medical insurance which will enable every American to afford good medical care.”) A conservative coalition of Republicans and Southern Democrats, however, blocked such health insurance and many other aspects of Truman’s proposed Fair Deal. 

 

A “moral economy” also has worthwhile precedents. One of capitalism’s earliest heroes, Adam Smith, taught “Moral Philosophy” in Glasgow, and his The Theory of Moral Sentiments (1759) provided the groundwork for his later more famous The Wealth of Nations (1776). However right or wrong his ideas might be, one cannot claim that Smith was indifferent to a “moral economy.”

 

More recently, in 2016, Bernie Sanders gave a talk in Rome entitled “The Urgency of a Moral Economy: Reflections on the 25th Anniversary of Centesimus Annus.” In response, I devoted a substantial essay to the talk. Thus, only a brief recap of its main points is needed here.

 

First, Sanders noted that the Catholic Church’s “social teachings, stretching back to the first modern encyclical about the industrial economy, Rerum Novarum in 1891, to Centesimus Annus, to Pope Francis’s inspiring [environmental] encyclical Laudato Si’. . . have grappled with the challenges of the market economy. There are few places in modern thought that rival the depth and insight of the Church’s moral teachings on the market economy.” Sanders also claimed that Pope Francis in his 2013 “apostolic exhortation” Evangelii Gaudium “stated plainly and powerfully that the role of wealth and resources in a moral economy must be that of servant, not master.”    

 

I ended my essay by writing that one of Sanders’s most important contributions might have been leading us to reexamine the question “How moral is our economy?” Now, three years later, some of the Democratic candidates for the 2020 presidential nomination, such as Sanders and Elizabeth Warren, are suggesting that our present Trumpian economy, which favors the rich, furthers inequality, and despoils our environment, is unfair and immoral.   

 

In his new book, People, Power and Profits, and in other places, Joseph Stiglitz indicates that our present “economy is not only failing American citizens. It's failing the planet, and that means it's failing future generations.” He not only damns the present Trumpian economic approach but also suggests various more ethical solutions. All the present Democratic candidates need to do the same. They don’t have to answer whether they favor capitalism or socialism.

Navassa Island: The U.S.’s 160-year Forgotten Tragedy

The Navassa Lighthouse

 

On December 8, 1859, to forestall a Haitian attempt to take possession of Navassa, a Caribbean island south of Cuba, U.S. Secretary of State Lewis Cass made a momentous decision. He officially recognized an American ship captain’s claim filed under the Guano Islands Act of 1856. 

 

The law allowed American citizens to claim and possess islands “not within the lawful jurisdiction of any other government” for the purpose of mining guano (accumulated excrement of seabirds, valuable as agricultural fertilizer). In such an instance, “said island, rock, or key may, at the discretion of the President of the United States, be considered as appertaining to the United States.”

 

In his 1956 book Advance Agents of American Destiny, diplomatic historian Roy F. Nichols noted, “In this humble fashion, the American nation took its first step into the path of imperialism; Navassa, a guano island, was the first noncontiguous territory to be announced formally as attached to the republic.” No other noncontiguous territory has been under U.S. administration for a longer time.

 

Cass’s decision ignored the fact that for more than two centuries Haitian fishermen had landed at the island to harvest shellfish. It also ran counter to every Haitian constitution since 1801, which had declared Haiti’s sovereignty over all its coastal islands including Navassa. 

 

However, since the United States did not recognize the government of Haiti in 1859, these facts on the ground were considered of little consequence. Of greater importance was the perceived existential threat that Haiti’s history of successful slave insurrection and emancipation, while a French colony, posed to American slave owners and their representatives in Washington.

 

Haiti and U.S. Imperial Ambitions

 

While slaveholders worried that their slaves might revolt against them as the Saint-Domingue slaves had done in 1791, many American politicians and journalists advocated the conquest and annexation of all the Caribbean islands, especially Hispaniola, as the logical way to extend American imperial power.

 

For example, in 1850 James Gordon Bennett, editor of the New York Herald, the largest and most popular daily newspaper in America, had advocated a plan “to annex Hayti, before Cuba.” He wrote that a war in pursuit of that aim “would be a source of fun and amusement, ending in something good for the reduction of the island to the laws of order and civilization. . . . St. Domingo will be a State in a year, if our cabinet will but authorize white volunteers to make slaves of every negro they can catch when they reach Hayti.”

 

The Haitian government countered such threats by maneuvering diplomatically among the European powers who risked seeing their hold over their own Caribbean colonies weakened and lost if they failed to thwart American plots against Haiti. But when a Haitian naval delegation attempted to take control of Navassa, President James Buchanan ordered the U.S. Navy to send a warship to Haiti to restore the American guano operation. Haiti’s commercial agent protested, but the State Department dismissed his letters.

 

An advisor to Haitian President Faustin Soulouque wrote to him candidly, “Even though the law is on our side in this affair, justice and the legitimacy of our cause will triumph only when certain barriers in the United States are broken down. Even after those fall, we should not believe their promises until they no longer attach economic importance to Navassa.” 

 

From 1857 to 1898, American companies based in Baltimore and New York mined and sold Navassa’s guano, employing black laborers supervised by white managers.

 

The 1889 Navassa Revolt and its Consequences

 

On September 14, 1889, African American workers at Navassa rose in revolt against their cruel white supervisors. By the time the battle ended, four whites lay dead. A fifth would die several days later. 

 

Removed to Baltimore, 43 insurgents were charged with crimes ranging from rioting to murder. Two African-American organizations — the Brotherhood of Liberty and the Order of Galilean Fishermen — hired a legal team of three black and three white lawyers to defend them.

 

Tried in U.S. Circuit Court, three defendants were convicted of murder and sentenced to hang. Others were convicted of lesser crimes — 14 of manslaughter and 23 of rioting — and sentenced to prison terms. Three were acquitted. The executions were stayed pending an appeal to the U.S. Supreme Court styled Jones v. U.S.

 

In that proceeding Jones’s lawyers challenged the constitutionality of the Guano Act, the authority of the United States government over Navassa, and the jurisdiction of the American court. Among the issues was Haiti’s claim to the island. The high court rejected those arguments and affirmed the conviction.

 

In language that has freighted international relations ever since, the court declared on November 24, 1890:

 

. . . if the executive, in his correspondence with the government of Hayti, has denied the jurisdiction which it claimed over Navassa, the fact must be taken and acted on by this court as thus asserted and maintained; it is not material to inquire, nor is it the province of the court to determine, whether the executive be right or wrong; it is enough to know that in the exercise of his constitutional functions he has decided the question.

 

Supporters of the defendants mounted a petition campaign, urging President Benjamin Harrison to grant the insurgents executive clemency. Harrison responded favorably. Citing the inhumane conditions imposed on Navassa workers, he wrote, “They were American citizens, under contracts to perform labor, upon specified terms, within American territory, removed from any opportunity to appeal to any court, or public officer, for redress of any injury, or the enforcement of any civil right.” He commuted the death sentences to life imprisonment.  

 

Guano mining continued at Navassa for another eight years, “longer and more extensively than any other island, rock, or key that ever appertained to the United States,” according to Jimmy M. Skaggs, author of the 1994 book The Great Guano Rush.

 

Navassa Island in the Twentieth Century

 

By the turn of the twentieth century, Americans had abandoned Navassa to castaways, Haitian fishermen, and nature. However, a new purpose revived Navassa’s importance. Anticipating substantially increased maritime traffic after the opening of the Panama Canal in 1914, some naval authorities feared that in stormy weather Navassa would become a dangerous hazard to navigation. In 1913 Congress authorized construction of a lighthouse on the island.

On January 17, 1916, shortly before construction began, President Woodrow Wilson codified the island’s status as a site for a lighthouse and reaffirmed its status as a possession “under the exclusive jurisdiction of the United States and out of the jurisdiction of any other government.”

 

After World War I the Navy established a radio station at Navassa. In 1929 the lighthouse was automated. During World War II, the Coast Guard stationed a reconnaissance unit and a rescue launch there to defend against German submarines.

 

Navassa and its “Appurtenance” Apparition after World War II

 

The end of the war restored Navassa to its Wilsonian status as a lighthouse reserve, periodically serviced by the Coast Guard and visited by Haitian trespassers who paid no heed to the American Guano Act. But the heritage of the Guano Act and the Jones v. U.S. Supreme Court precedent continued to cast a long shadow beyond that single small island.

 

Less than a month after Japan’s surrender President Harry S. Truman proclaimed that “the Government of the United States regards the natural resources of the subsoil sea bed of the continental shelf beneath the high seas but contiguous to the coasts of the United States as appertaining to the United States, subject to its jurisdiction and control.” 

 

As if to wring as much international mischief as possible from Truman’s proclamation, the April 1947 issue of Nation’s Business magazine published an article titled “A Legal Key to Davy Jones’ Locker” with the teaser subhead “A forgotten murder provides a background for our announced right to seek oil in the Gulf of Mexico.” Navassa as a metaphor for the unrestricted exercise of extraterritorial power had superseded the significance of the island itself.

 

Haitian Hopes Raised and Dashed

 

Nine years later, Rep. William L. Dawson (D-IL) introduced “A bill to disclaim any rights of the United States to the island of Navassa,” which was referred to the Committee on Foreign Affairs in the House of Representatives.

Although the bill stood no chance of being reported out, intellectuals in Haiti seized the opportunity to reprise their country’s claim to Navassa. The cultural journal Optique devoted 28 pages of the August 1956 issue to the subject. An unsigned introductory article reviewed the history of the dispute, summarized Haiti’s legal position, and cited American attitudes both pro and con. 

 

African Americans and advocates of a just and democratic foreign policy tended to sympathize with Haiti’s claim; the Eisenhower administration and career State Department diplomats ignored them. A monthly Coast Guard patrol continued to maintain the lighthouse. From time to time, beginning in 1956 and continuing to the present, U.S. amateur radio hobbyists have obtained permission to set up temporary broadcasting stations at Navassa.

 

Clandestine Attack on Cuba from Navassa Island

“Cuban Outbreak of Swine Fever Linked to CIA” headlined a January 9, 1977, article in Newsday, a Long Island, New York, daily paper. It began,

 

With at least the tacit backing of U.S. Central Intelligence Agency officials, operatives linked to anti-Castro terrorists introduced African swine fever virus into Cuba in 1971. Six weeks later an outbreak of the disease forced the slaughter of 500,000 pigs to prevent a nationwide animal epidemic.

 

A U.S. intelligence source told Newsday he was given the virus in a sealed, unmarked container at a U.S. Army base and CIA training ground in the Panama Canal Zone, with instructions to turn it over to the anti-Castro group. 

 

The 1971 outbreak, the first and only time the disease has hit the Western Hemisphere, was labeled the “most alarming event” of 1971 by the United Nations Food and Agricultural Organization. . . .

 

Another man involved in the operation, a Cuban exile who asked not to be identified, said he was on the trawler where the virus was put aboard at a rendezvous point off Bocas del Toro, Panama. He said the trawler carried the virus to Navassa Island, a tiny, deserted, U.S.-owned island between Jamaica and Haiti. From there, after the trawler made a brief stopover, the container was taken to Cuba and given to other operatives on the southern coast near the U.S. Navy base at Guantanamo Bay in late March, according to the source on the trawler.

 

Six days later the CIA officially denied the story, which had been widely reprinted, but the Newsday reporters had cited so many corroborating sources, with such specific details, that the denial was not widely believed.

 

A previously unreported document lends circumstantial support to the Newsday story — a 1986 typescript draft of an article by U.S. Coast Guard lighthouse historian Neil Hurley titled “Navassa Island Light, ‘Where Chickens Only Miraculously Survive the Attacks of Lizards’.”

 

When Hurley’s article appeared in the Winter 1988 issue of The Keeper’s Log, under the title “Navassa Lighthouse,” these two sentences from his earlier draft were omitted: “In 1971, a U.S. Navy Research team visited the Island to look for animal diseases that could be transmitted to man. They found one bird carrying malaria.”

 

It might be a coincidence, but it seems remarkable that the Navy was investigating the possible presence of biological toxins at about the time that agents were reported to have brought dangerous microbes to Navassa for a biological attack on Cuba. 

 

What made the Newsday report credible was the fact that the only place in the Western Hemisphere where the virus was known to have been kept before the Cuban outbreak was the secret Plum Island laboratory off the eastern tip of Long Island. (Newsday reporters had been cultivating sources there since the paper’s sole visit in October 1971.)

 

The Newsday article made no mention of Plum Island, perhaps to protect its reporters’ sources, but other writers quickly made the connection. In his 2004 book about Plum Island, Lab 257, Michael Christopher Carroll wrote that although “no one will say on the record that the virus for the Cuban mission was prepared on Plum Island,” that was almost certainly its source. “Efforts to explain away the outbreak as a natural occurrence do not hold up to close examination.”

 

What Lies Ahead for Navassa Island?

 

Haiti has never relinquished its claim to Navassa, and its citizens have continued to flout U.S. authority. Following the example set by their North American peers, in the spring of 1981 members of the Radio Club d’Haiti were issued the call sign HH0N and flown to the island by helicopter.

 

Upon arrival they raised the Haitian flag and sang their national anthem. When an American military officer asked to see their authorization to land, they answered, “We need no permit to travel in our country.” The officer relented and welcomed the Haitians to camp. After a seven-day stay, they returned the way they had come.

 

Today the Fish and Wildlife Service of the Interior Department administers the island as a national wildlife refuge. News reports of a 1998 scientific expedition called Navassa “a unique preserve of Caribbean biodiversity” but paid scant attention to Haiti’s claim, or to the heartless history that awaits atonement. We can take the first step along that path by teaching it.

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171898 https://historynewsnetwork.org/article/171898 0
Grover Cleveland Bergdoll and the Long Reach of World War I’s Jingoism

 

 

A few months before the start of World War II, a man identified as Bennet Nash boarded the German passenger liner Bremen in Hamburg. Somewhere in the middle of the Atlantic, on the advice of his attorney, he tore up his passport and threw the pieces overboard. The document was a fake, the name borrowed from a star in the Big Dipper.

 

When the Bremen arrived in New York on the afternoon of May 25, 1939, a Coast Guard cutter met the vessel before it docked. A group of immigration officials came aboard and found “Nash” loitering in the smoking lounge. When asked for his identification, the man replied, “I have no passport. I am Mr. Bergdoll.”

 

Grover Cleveland Bergdoll, the most notorious American draft dodger of World War I, had finally come home.

 

Bergdoll’s case shows the surprising longevity of the jingoism that roiled the United States during the First World War. Following the U.S. entry into the war in 1917, the conflict had been framed as a noble endeavor to safeguard democracy and liberty. Those who disagreed with this rationale were roundly shouted down or denounced as traitors, and the Sedition Act of 1918 cracked down on dissenters. 

 

But two decades later, public opinion in the United States had veered strongly toward isolation. The rise of dictatorships overseas had undermined the rationale for the First World War; it seemed like the war had not made the world safe for democracy, but only paved the way for brutal regimes. Congress passed Neutrality Acts aimed at keeping the nation from becoming entangled in the same situations that had led the United States to declare war on the Central Powers in 1917.

 

But while these attitudes reflected a growing disdain for the motives behind World War I, few expressed sympathy for a man who had willfully avoided service in this conflict. Bergdoll’s sensational case remained in the headlines for roughly two decades, and veterans’ groups never ceased their demand that he be brought home to face justice.

 

Bergdoll’s fame, and later infamy, was initially confined to his hometown of Philadelphia. He was born into a wealthy brewing family, dabbled in auto racing, and became one of just 119 people to train at the Wright Brothers’ flying school in Ohio. He purchased a Model B aircraft and regularly used it to perform aerial spectacles for adoring crowds. In 1912, he became the first person to fly from Philadelphia to Atlantic City.

 

While he earned praise for his aviation achievements, Bergdoll was frequently denounced for his behavior on the ground. He was once accused of assaulting a police officer and spent three months in jail in 1913 for causing a serious car accident. He briefly attended the University of Pennsylvania, but was thrown out after publishing an offensive newspaper. His own brother tried unsuccessfully to have him declared insane in 1915.

 

This publicity, both good and bad, made Bergdoll a more recognizable figure in Philadelphia when he was charged with evading the World War I draft. Although he registered for conscription as required, he failed to show up for an appointment with his local draft board. In August 1918, he was automatically inducted into the military and then immediately declared a deserter. 

 

 

The case gained national attention after two sensational events in 1920. The first came on January 7, when Bergdoll was captured while visiting his home. Officers searching the stately mansion first had to pacify Bergdoll’s mother, who was armed with a pistol and a blackjack. They discovered Bergdoll hiding in a window seat and transported him to the Army’s disciplinary barracks on Governors Island in New York. At a subsequent court-martial, he was convicted of desertion and sentenced to five years’ imprisonment.

 

Four months later, Bergdoll told his attorneys that he had buried a considerable amount of gold in Maryland while in hiding. He worried that someone else might stumble upon the treasure while he was incarcerated and wanted to recover it. His attorneys struck a deal with military officials where he would be temporarily released, under guard, for a trip to the site where the gold was purportedly hidden.

 

The expedition was so poorly managed that it would spark a congressional investigation. To keep him from looking too conspicuous, Bergdoll was not handcuffed and wore a uniform virtually identical to that of the two sergeants guarding him. The party also stopped at Bergdoll’s home in Philadelphia instead of proceeding directly to Maryland.

 

On May 21, Bergdoll gave his guards the slip and fled in one of his vehicles. The strange circumstances of the escape again launched Bergdoll into the spotlight. The public’s ire would only grow more intense when it was found that the fugitive had managed to make his way to Germany, taking up residence in a hotel owned by an uncle.

 

Since the Senate had not ratified the Versailles Treaty, the United States was still formally at war with Germany (a separate peace would be concluded in 1921). To many, especially veterans who recently fought German soldiers, Bergdoll’s offense was no longer simply a matter of cowardice; he was now denounced as a traitor to his country.

 

This rage was at its peak in the first few years after Bergdoll’s escape. On two occasions, in 1921 and 1923, American soldiers stationed in Europe tried unsuccessfully to kidnap the fugitive. During the inquiry into Bergdoll’s escape, one congressman became so incensed during questioning of one of Bergdoll’s brothers that he nearly drew a pistol on the witness. 

 

The American Legion and other veterans’ groups routinely demanded that the government do more to bring Grover back to the United States. Occasionally, sentiment against Bergdoll reached such a fever pitch that a group or official would demand that he change his name, or that he only be referred to as “G.C.,” to avoid disparaging the President for whom he was named. Bergdoll’s mother pointed out during her congressional testimony that President Grover Cleveland had paid a substitute to serve in his place during the Civil War, but it did little to dampen the outcry against the President’s namesake. The efforts had a certain resemblance to the more ridiculous campaigns to omit any references to Germany during World War I, such as renaming German measles “liberty measles.”

 

The outrage had faded considerably by the spring of 1939, when Bergdoll announced that he intended to return to the United States and surrender to military authorities. Some people even opined that his only crime was having the foresight to realize that World War I would not be a war to end wars. “Any coward can fall into a draft line and march off to a foreign slaughter,” two women wrote to the Milwaukee Journal in April. “Only a real hero can resist and fight against mob hysteria as did Bergdoll. Anyone can follow the wild mob but few have the grit and real patriotism to oppose the bloodthirsty mob.” 

 

But the harsher, jingoistic attitudes that swept the nation during the war had not died out entirely. In the House of Representatives, Forest A. Harness introduced legislation aimed at barring Bergdoll from entering the country. The Indiana Republican, who had served in the infantry and suffered combat wounds in World War I, proposed that any deserter who had fled to a foreign country to escape punishment should not be readmitted into the United States.

 

Bergdoll’s attorney, Harry Weinberger, thought the bill was doomed to fail since it was a bill of attainder and an ex post facto law, both of which are forbidden under the Constitution. But the disdain for Bergdoll trumped this concern. To Weinberger’s alarm, the bill sailed through committee and passed unanimously in a vote before the full membership of the House.

 

The vote set up a dramatic confrontation between Harness and Weinberger in a Senate committee hearing as the Bremen neared New York. If Harness prevailed, his bill would be given priority and put to a vote in the Senate; if it passed, a likely possibility, Bergdoll would be turned away at Ellis Island and sent back to Germany. 

 

The argument put forth by Harness was little more than invective against Bergdoll, suggesting that he was an unsavory specimen who shouldn’t be allowed to live in the United States. Claiming the fugitive had effectively committed treason, he suggested that Bergdoll only wanted to come home because Hitler’s regime was interfering with his lavish lifestyle. Harness even suggested that Bergdoll might have personally irritated the Fuhrer and that he was trying to escape punishment at the hands of Hitler’s goons. Relishing the fantasy, he declared, “As loathsome and revolting as are the Gestapo methods of Hitler, this might be an occasion where we could almost view them with tolerance.”

 

Weinberger framed the matter differently, saying the patriotic zeal of people like Harness was threatening the very democracy they purported to protect. He warned that the bill would set a dangerous precedent, especially in light of the fascist dictatorships in Europe, of allowing Congress to strip the rights of citizenship from anyone who happened to rankle the government. “The question of the passage of the Harness bill is greater than Bergdoll or any individual,” said Weinberger. “It goes to the fundamentals of American democracy and liberty. It is the first possible step to establish dictatorship by ex post facto laws and bills of attainder.”

 

Cooler heads prevailed, and the Senate committee opted not to fast-track the legislation. A second court-martial proceeding ordered Grover to serve the remainder of his original sentence plus three years for his escape, bringing the long saga of his case to a somewhat anticlimactic conclusion.

 

Bergdoll was hardly the only person who had avoided the World War I draft. He was just one of 337,649 men who had been declared guilty of the offense; about 161,000 of this group would never face punishment. Most draft evasion cases were resolved quietly, with little outcry from the public.

 

But none of these other offenders had attracted such widespread attention for their crimes. Bergdoll’s wealth opened him up to accusations that he had only managed to avoid justice because of his deep pockets. He gave varying reasons for his decision not to serve, forfeiting any sympathy he might have earned had he offered a consistent explanation for refusing to join the military. Further, he was frequently portrayed as a waggish, boastful, and otherwise unpleasant person.

 

All of these factors combined to make Bergdoll’s case a lightning rod for lingering jingoism after World War I. Although it presented an extreme case, the Bergdoll saga demonstrates how the sentiments that had brought about “liberty measles” in 1917 were still alive and well more than two decades later.

 

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171896 https://historynewsnetwork.org/article/171896 0
An Interview with Historian Michael Adas

 

Michael Adas is an American historian and currently the Abraham E. Voorhees Professor of History at Rutgers University. He specializes in the history of technology, the history of anticolonialism, and global history. 

 

Why did you choose history as your career?


Most of my reading as a boy was focused on history. So, despite being put off by the way it was taught in secondary school, I continued to read both fiction and non-fiction focused on historical events, particularly relating to the impact of warfare on historical development. Having been previously engaged with school plays and debate tournaments, when I entered college I intended to pursue a career in acting. Very mixed reviews of a number of minor roles I played in my freshman year, when pitted against several superb history classes, soon led to the conviction that I should focus on the latter. Over time it became clear that my interests and skills were a good fit for a career as a college teacher and author.



What was your favorite historic site trip? Why?


While on a globe-spanning, government-sponsored trip in the summer between my sophomore and junior years, I spent several days in the splendid Japanese city of Kyoto. Though I later decided to focus my graduate studies on South Asia, where I spent two months of the trip, the visit to Kyoto and other early Japanese cities initiated a lifelong study of garden design and cultivation. I also developed a fondness for Chinese and Japanese architecture and culture more generally that has informed my teaching and especially my ongoing contributions to eight editions of the World Civilization textbook I have coauthored over nearly three decades with Peter Stearns and Stuart Schwartz. 

If you could have dinner with any three historians (dead or alive), who would you choose and why?

 

Assuming they are historians I have not worked with or met, hence on the basis of their writings, I would choose Barbara Tuchman, Carlo Cipolla, and Christopher Clark.

 

What books are you reading now?

Lynne Olson, Last Hope Island

Dahr Jamail, The End of Ice

Peter Wohlleben, The Hidden Life of Trees

Russell Shorto, Amsterdam

Kazuo Ishiguro, Never Let Me Go

 

(For as long as I can remember, I’ve read several books simultaneously.)

 

What is your favorite history book?

 

Michael Shaara’s The Killer Angels

 

What is your favorite library and bookstore when looking for history books?


 

Library: The Main Reading Room (now defunct, alas) formerly in the British Museum. Bookstores: A tie between Foyles, in book heaven along Charing Cross Road in London, and Blackwell’s in Oxford.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I am not that sort of collector. But over the years students have given me historical artifacts from World War I and the Vietnam War. I have several first editions of soldiers’ accounts, novels, poetry, and journalists’ memoirs on the global disaster that became known, in retrospect, as the First World War.




 

Which history museums are your favorites? Why?


 

My wife Jane and I share a fascination with museums throughout Europe that focus on World War II. For other historical eras, excluding art museums with antiquities, I prefer the remains of, for example, Roman villas in Morocco, trench memorials and buildings historically reconstructed in Belgium and northern France, and Frank Lloyd Wright’s remarkable structures in Tokyo and across North America.

 

Which historical time period is your favorite?

 

I am reluctant to limit this, but if compelled to do so I would opt for the late-nineteenth and twentieth centuries. Predictably, this is the period on which most of my writing (less so my teaching) has focused.

 

What would be your advice for history majors looking to make history as a career?

Read widely, but selectively, in the history of culture areas and time periods that you think you’d like to teach and write about – and be open to changing your choices in this regard. Even before you enter a graduate program, give serious thought to the courses you’d like to teach and how you would approach them. When reading the works of prominent authors pay careful attention to the ways they organize their narrative, include analysis, and seek to make the events they cover come alive for the reader. Keep a running account of your impressions.

 

Who was your favorite history teacher?


 

A tie between Ernest Breisach, a historian of the Italian Renaissance, whose undergraduate courses impressed me with the challenges and pleasures of writing and teaching history, and John Smail, my advisor and mentor in graduate school at the University of Wisconsin, where I focused on Southeast Asian and Global and Comparative history.

 

Why is it essential to save history and libraries?

 

The two are inextricably connected. They provide essential sources for gaining intelligent (but not necessarily correct) understandings of present events and broader developments, thereby making possible well-considered options for decisions that at the highest levels will shape the future of humanity and our planet. 

 

You’re a member of the American Historical Association, World History Association, and the Society for the History of Technology. What unique opportunities have these organizations provided?

 

These and other organizations have provided numerous opportunities for me to present papers on both books and articles recently published (and in some cases revisit those that have been out for years). Equally important, both the AHA and SHOT in particular have provided venues that have enabled me to connect with and learn from authors who are working on issues related to my own (or my students’) works in progress. These occasions have often proved particularly valuable in shaping my ongoing work. Depending on the sites chosen for a particular year, the conventions of these and other scholarly organizations have also made it possible to renew contacts with former graduate students who have gone on to teach in distant places, editors present and past, and fellow historians (as well as anthropologists, political scientists, and freelance authors) at other institutions.

 

Many historians worry about how to preserve digital history. In your experience, how can historians overcome this challenge and preserve the history of technology?

 

Digital archiving strikes me as a superb way to preserve a wide range of historical research and writing, including a growing corpus of informal oral contributions that would otherwise be lost. In terms of both dissemination and development, digital technologies have become essential to preserving a wide range of historical events and historians’ responses to them.

 

In 2018, you published Everyman in Vietnam: A Soldier's Journey into the Quagmire. Why did you choose to write about the Vietnam War? 

During the decade when I was pursuing my undergraduate and graduate education, the build-up to and ultimately the full-scale U.S. military interventions in Vietnam escalated steadily. As my engagement in global history and international politics deepened, that ill-fated “crusade” proved to be the single most persistent and important historical process shaping my political and moral assumptions about the reasons for pursuing the teaching and writing of history. Nonetheless, I was frustrated at the time by the extremist and counterproductive nature of most of the local opportunities in Madison, Wisconsin, for participating in protests against the war, as well as by my neophyte status, which precluded publishing about the conflict.

 

In the decades that followed I managed to address the tragic history of that failed American “crusade” in my college teaching and some of my published works. But I was unable to find a way to bring together all of the forces and levels of conflict that came together in the war. That possibility finally emerged from an unexpected quarter: a seminar presentation by one of my undergraduate students, Joseph Gilch, about his uncle’s combat service and death in the conflict. A remarkable trove of Jimmy Gilch’s letters to his parents and friends provided the means of connecting critical aspects of and lessons from the war that we as a society still need to absorb. These included the ignorance of the people and history of Vietnam that prevailed among the leaders, both political and military, who made the decisions for the massive interventions that took the lives of Jimmy and so many other young Americans and utterly devastated the people and land of Vietnam. The letters provided the basis for an inclusive, coauthored, and interpretive narrative of the conflict and its effects on both societies that I had sought to write for several decades. 

 

From your experience writing this book, how can America better help our veterans?

 

Since the book is focused on the war and a soldier whose ordeal we viewed as a prism for participants and engagements on all sides, there is only limited coverage of the aftermath of the conflict or the veterans who survived. But Joe and I have exchanged views on the war and subsequent conflicts with veterans on radio broadcasts, and engaged in often-intense discussions with vets during and after book talks. Joe also had early contacts with Vietnam veterans in his uncle’s unit, and these figured in important ways in the combat portions of the book. Above all, we wrote the book to contribute to the literature on Vietnam and America’s subsequent “little wars.” We intended the book to add to a growing corpus of works in multiple fields that provide the basis for present and future, politically engaged Americans to resist leaders who seek to send our armed forces to fight foreign wars that do not involve our vital interests and brutally distort the historical trajectories of the peoples and nations that become the targets of these interventions.

 

Are you working on any new books?

 

In addition to essays for edited collections and the eighth edition of the co-authored Global History textbook for Pearson, I am currently working on a book on “Misbegotten Wars and the Decline of Anglo-American Global Hegemony in the Long Twentieth Century” for Harvard University Press. I am also researching contextual historical introductions for a volume of excerpts from key writings on the diverse causes and effects of climate change and the ways developed or proposed thus far to ameliorate the ongoing and increasing threats they pose for human societies and the life of the planet as a whole.

 

How can people who love history help save and preserve global history and history of technology? How can social media help? 

 

I don’t think that either global history or the history of technology is in danger of being lost or marginalized at this juncture, and I cannot imagine their demise in the foreseeable future. The venues for publishing and organizing conferences in both genres have increased significantly in recent decades. With issues such as international migration, race and gender relations, and climate change receiving ever more attention worldwide, I think the prospects for both fields of inquiry and social activism are almost certain to grow significantly in the decades ahead. And given the responsiveness of my students at all levels and of the NJ-NY teachers I have been working with in recent years, I believe that the importance of advanced technologies and the essential linkages provided by international organizations will significantly increase the impact of work in both fields. In dealing with global climate change and other vital trans-continental challenges, it will become more and more imperative to fund and expand collaborative historical projects and enhance the ways in which scholarly and activist findings in these fields are made widely available and acted upon across cultures. 

 

 

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171897 https://historynewsnetwork.org/article/171897 0
The Template for the Holocaust: Germany's African Genocide

German Schutztruppe in combat with the Herero in a painting by Richard Knötel.

 

“Within the German borders every Herero, with or without a gun, with or without cattle, will be shot.” - General Lothar von Trotha, Commander of German Forces in South West Africa, 1904

Hundreds of emaciated prisoners look out helplessly. Many have died of disease or malnourishment, but their German masters drive them on each day to hard forced labor. As soon as they cannot work anymore, they are shot. Women and children are not exempt from this brutal system. A mother with a young baby falls in the stifling heat and both are beaten mercilessly by a guard. The baby cries, but the mother knows to stay silent.

In this camp, fewer than half the prisoners will survive. Even the commandant considers the rations “in no sense adequate.” With only a handful of uncooked rice to sustain them, these starving people fight for any scrap of food they can find. 

This scene comes not from 1940s Eastern Europe but from South West Africa some thirty-five years earlier. The treatment of the Herero and Nama grimly foreshadows the treatment of the Jews of Europe. In the 20th Century’s first genocide, Imperial Germany’s campaign of annihilation provided precedents for the racial ideologies and exterminatory tactics of the Holocaust.  

A Colonial Problem

Germany, which had only unified in 1871, was a latecomer to the colonial game. Otto von Bismarck, Chancellor of the German Empire, had little interest in costly colonial adventures, once declaring “my map of Africa lies in Europe.” Yet, with the rise of Kaiser Wilhelm II, Bismarck’s sober “Realpolitik” was replaced with the expansionist “Weltpolitik” (world politics), expressing Germany’s desire to become a global power. In 1897, Foreign Minister (and later Chancellor) Bernhard von Bulow eloquently captured the essence of “Weltpolitik” by demanding Germany’s “place in the sun.” To secure that place, Germany needed a colonial empire.

To Germany’s chagrin, the other European powers (especially Britain) already occupied the most valuable territories. To “catch up,” Germany sought to rapidly develop the few colonies she possessed. In South West Africa (modern Namibia), the Reich promoted “Germanization” through the arrival of settlers, the conscription of native people, and the construction of railroads. Germany’s imperialist aspirations posed a mortal danger to South West Africa’s original inhabitants, who were systematically deprived of their possessions, their lands, and their civil rights. Relentless German encroachment led to growing tensions, which burst into open rebellion in 1904. 

Samuel Maherero, leader of the Herero people, wrote to the Nama chief Hendrik Witbooi advocating solidarity against the colonial oppressors. The Herero and Nama began attacking settlements and dealt the Germans a series of embarrassing reverses. The colonial governor requested reinforcements. In early June, Lieutenant General Lothar von Trotha arrived with 14,000 troops to take command. 

Trotha had no intention of making peace with the Herero or Nama. As he explained, “my intimate knowledge of many central African nations has everywhere convinced me…that the Negro does not respect treaties but only brute force.” For Trotha, the aim was not pacification, not subjugation, but annihilation. Trotha would advocate “unmitigated terrorism” and pledge to “destroy the rebellious tribes by shedding rivers of blood and money.” On the eve of WWII, Hitler would issue a similar pronouncement to the Wehrmacht: “Close your hearts to pity. Act brutally.”

Racial War

On August 11th, Trotha’s troops met a large Herero force commanded by Samuel Maherero at the Waterberg plateau. Although the Herero outnumbered them, the Germans broke the Herero lines with heavy artillery and superior firepower. Then, the Germans drove their defeated adversaries into the Omaheke Desert along with their women and children. German pursuit turned into outright slaughter. A witness wrote “the Germans took no prisoners,” not even sparing “mothers holding babies at their breasts.” The soldiers murdered indiscriminately, killing the wounded, the unarmed, and the infirm. 

Many of the initial survivors of the massacre would die of starvation and dehydration in the coming weeks. Trotha commanded his troops to patrol all the watering holes and shoot any Herero trying to escape the desert. The Germans may have also poisoned local wells. In October 1904, Trotha issued his infamous “extermination order,” decreeing that “within the German borders every Herero, with or without a gun, with or without cattle, will be shot.” No exception would be made for women and children. 

Trotha’s barbarism can only be understood in light of Social Darwinism and the scientific racism that had gained currency in this period. Trotha and other nationalists considered the conflict a “racial war.” In their eyes, it was only natural for the superior German race to destroy its inferior foes. Trotha claimed that Germany could not prosecute a war “against non-humans” humanely. Considering the Herero a pestilence, he wrote: “I think it is better that the Herero nation perish rather than infect our troops.” The Herero were a threat not only to German settlers but to German blood.

Trotha’s pursuit of racial purity and “living space” for the Teutonic race was well understood by German nationalists. Germany had banned mixed-race marriages in South West Africa, considering them a form of “Rassenschande” or “racial defilement.” The legal disenfranchisement of non-Germans and the fixation on blood would find a chilling echo in the 1935 Nuremberg Laws.

These racial laws were supported through the disreputable pseudoscience of eugenics. Disturbingly, many German academics collaborated in developing the dangerous mythology of racial exceptionalism. Not only would scientific racism be used to justify Germany’s exterminatory policies, but, as will be seen, it would play a ghoulish role in the genocide itself.

Final Destruction

After breaking the resistance, the Germans rounded up the surviving Herero and Nama into concentration camps. The prisoners were then used as slave labor to build railroads and dig mines. At the notorious Shark Island camp, the bodies of inmates who had been worked to death were discarded into the shark-infested waters. Shark Island also housed a gruesome medical complex, where German doctors tortured prisoners by injecting them with dangerous chemicals and diseases. These doctors also performed lethal experimental operations without anesthetic. Conditions were so horrific that numerous inmates committed suicide.

At Swakopmund, women were forced to boil the heads of the deceased and strip the flesh so that the skulls could be sent to Germany for research. German universities received hundreds of skulls and other remains from the victims of the genocide. These grisly artifacts were then used to demonstrate the allegedly subhuman characteristics of the Africans.

Despite international condemnation, the genocide continued until 1908. By then, the nearly 85,000 Herero had been reduced to 15,000 “starving refugees.” Almost half the 20,000 Nama had been killed. The Witbooi sub-tribe was reduced from 1,600 in 1905 to just 38 in 1912. The survivors had lost everything, and many would suffer in destitution for years.

Forgotten Victims

Over one hundred years later, the genocide has been largely forgotten in the Western consciousness. Shamefully, the race of the victims plays a role in our collective ignorance. Moral outrage can be selective, conditional, and racial. Contrast the international reaction to the events in South West Africa with the reaction to German atrocities in Belgium in 1914. Ironically, in the case of the Herero and Nama genocide, racism belittles the memory of its victims.

 

For decades, Germany possessed hundreds of remains of Herero and Nama people. Finally, in 2011, the German Medical History Museum began restoring some of these remains to Namibia. In 2004, the German government issued an apology for its grievous violation of human rights. In 2015, the German government acknowledged what a UN report had concluded thirty years earlier, that the destruction of the Herero and Nama was genocide. 

 

However, justice has proved elusive. Victim remains are still in museums around the world, including at the Museum of Natural History in New York. While the German government has apologized, it has ruled out the type of reparations for victims’ families that were provided to Holocaust victims. As small minorities within Namibia, the Herero and Nama are still grappling with the economic effects of the genocide. Their reduced numbers limit their political power, and without their lands, many struggle with cyclical poverty. In 2017, some Herero and Nama sued Germany, demanding financial restitution. Those lawsuits remain pending.

 

In the long history of racial violence, the Herero and Nama genocide represented a unique horror. For the first time in history, a modern, industrial nation sought the complete destruction of another people. Germany was guided by a vision of racial superiority, bolstered by pseudoscience and jingoism. Those same forces would form the bedrock of the brutal Nazi regime. 

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171868 https://historynewsnetwork.org/article/171868 0
Sidney Blumenthal on Abraham Lincoln and Why History Always Matters

To celebrate the History News Network's arrival at the George Washington University, HNN hosted "Why History Always Matters." As part of the event, HNN Editor Kyla Sommers interviewed Mr. Blumenthal on the importance of history and his 5-part biography of Abraham Lincoln.

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171867 https://historynewsnetwork.org/article/171867 0
Roundup Top 10!  

 

In an era of rising anti-Semitism, should Jewish Americans tack left or right?

by Andrew Paul

What a 70-year-old riot says about solidarity.

 

Why the Labor Movement Has Failed—And How to Fix It

by Sarita Gupta, Stephen Lerner, and Joseph A. McCartin

The arc of the economic universe has bent badly toward injustice.

 

 

The Madness of Nuclear Deterrence

by Mikhail Gorbachev

The dangers have only become more acute in the decades since I tried to convince Thatcher.

 

 

Trump's regime is leading America in an insurrection

by Carol Anderson

Trump’s regime has ignited the base by conjuring up a vision of whiteness imperiled by ‘illegals’, ‘black identity extremists’ and Muslim terrorists.

 

 

How Franklin Graham betrayed his father’s legacy

by Nancy Beck Young

Instead of treating issues of sexuality with compassion, Graham has weaponized them.

 

 

The Other Notre-Dame Was Not Rebuilt

by Amy Wilentz

Perhaps France should help Haiti, its former colony, rebuild the cathedral lost in the 2010 earthquake.

 

 

The Poway shooter used an age old terrorist tactic. The media fell for it.

by Ibrahim Al-Marashi

The history behind terrorists’ favorite tactic and how we can fight it.

 

 

The centuries-long fight for reparations

by Ana Lucia Araujo

There is a long and old tradition of black men and women demanding restitution for the time they were enslaved.

 

 

Spring Stirrings and Misgivings

by Rebecca Gordon

Of Autocrats and Uprisings in the Middle East and North Africa.

 

 

US declining interest in history presents risk to democracy

by Edward Luce

In an ever more algorithmic world, Americans increasingly believe humanities are irrelevant.

 

 

A Moral Stain on the Profession

by Daniel Bessner and Michael Brenes

As the humanities collapse, it’s time to name and shame the culprits.


 

Attack on the AHA Couldn’t Be More Wrong

by Joy Connolly

"As interim President of a large public graduate school, I believe passionately in providing education that empowers students to make the most of their lives, whether or not they pursue careers in the academy."

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171891 https://historynewsnetwork.org/article/171891 0
Most Americans Reject Trump’s “America First” Policy

 

As president, Donald Trump has leaned heavily upon what he has called an “America First” policy.  This nationalist approach involves walking away from cooperative agreements with other nations and relying, instead, upon a dominant role for the United States, undergirded by military might, in world affairs.

Nevertheless, as numerous recent opinion polls reveal, most Americans don’t support this policy.

The reaction of the American public to Trump’s withdrawal of the United States from key international agreements has been hostile. According to a Reuters/Ipsos opinion poll conducted in early May 2018, shortly before Trump announced a pullout from the Iran nuclear agreement, 54 percent of respondents backed the agreement. Only 29 percent favored a pullout. In July 2018, when the Chicago Council on Global Affairs surveyed Americans about their reaction to Trump’s withdrawal from the Iran nuclear agreement and the Paris climate agreement, it found that 66 percent favored remaining within the Iran accord, while 68 percent favored remaining within the Paris accord, an increase of 6 percentage points in support for each of these agreements over the preceding year.

Most Americans also rejected Trump’s 2019 withdrawal of the United States from the Intermediate-Range Nuclear Forces (INF) Treaty with Russia.  A survey that February by the Chicago Council on Global Affairs reported that 54 percent of Americans opposed withdrawal from this nuclear arms control treaty and only 41 percent favored it.  Furthermore, when pollsters presented arguments for and against withdrawal from the treaty to Americans before asking for their opinion, 66 percent opposed withdrawal.

In addition, despite Trump’s sharp criticism of U.S. allies, most Americans expressed their support for a cooperative relationship with them.  The Chicago Council’s July 2018 survey found that 66 percent of Americans agreed that the United States should make decisions with its allies, even if it meant that the U.S. government would have to go along with a policy other than its own.  Only 32 percent disagreed.  Similarly, a March 2019 Pew Research poll found that 54 percent of American respondents wanted the U.S. government to take into account the interests of its allies, even if that meant compromising with them, while only 40 percent said the U.S. government should follow its national interests when its allies strongly disagreed.

Moreover, despite the Trump administration’s attacks upon the United Nations and other international human rights entities―including pulling out of the UN Human Rights Council, withdrawing from UNESCO, defunding UN relief efforts for Palestinians, and threatening to prosecute the judges of the International Criminal Court―public support for international institutions remained strong.  In July 2018, 64 percent of Americans surveyed told the Chicago Council’s pollsters that the United States should be more willing to make decisions within the framework of the UN, even if that meant going along with a policy other than its own.  This was the highest level of agreement on this question since 2004, when it was first asked.  In February 2019, 66 percent of U.S. respondents to a Gallup survey declared that the UN played “a necessary role in the world today.”

But what about expanding U.S. military power?  Given the Trump administration’s success at fostering a massive military buildup, isn’t there widespread enthusiasm about that?

On this point, too, the administration’s priorities are strikingly out of line with the views of most Americans.  A National Opinion Research Center (NORC) survey of U.S. public opinion, conducted from April through November 2018, found that only 27 percent of respondents thought that the U.S. government spent “too little” on the military, while 66 percent thought that it spent either “too much” or “about the right amount.”  By contrast, 77 percent said the government spent “too little” on education, 71 percent said it spent “too little” on assistance to the poor, and 70 percent said it spent “too little” on improving and protecting the nation’s health. 

In February 2019, shortly after Trump indicated he would seek another hefty spending increase in the U.S. military budget, bringing it to an unprecedented $750 billion, only 25 percent of American respondents to a Gallup poll stated that the U.S. government was spending too little on the military.  Another 73 percent said that the government was spending too much on it or about the right amount.

Moreover, when it comes to using U.S. military might, Americans seem considerably less hawkish than the Trump administration.  According to a July 2018 survey by the Eurasia Group Foundation, U.S. respondents―asked what should be done if “Iran gets back on track with its nuclear weapons program”―favored diplomatic responses over military responses by 80 percent to 12.5 percent.  That same month, as the Chicago Council noted, almost three times as many Americans believed that admiration for the United States (73 percent) was more important than fear of their country (26 percent) for achieving U.S. foreign policy goals. 

Unlike the president, who has boasted of U.S. weapons sales to other countries, particularly to Saudi Arabia, Americans are also rather uncomfortable about the U.S. role as the world’s pre-eminent arms dealer.  In November 2018, 58 percent of Americans surveyed told YouGov that they wanted the U.S. government to curtail or halt its arms sales to the Saudi Arabian government, while only 13 percent wanted to maintain or increase such sales.

Finally, an overwhelming majority of Americans continues to express its support for nuclear arms control and disarmament. In the aftermath of Trump’s withdrawal of the United States from the INF treaty and announcement of plans to build new nuclear weapons, 87 percent of respondents to a February 2019 poll by the Chicago Council said they wanted the United States and Russia to come to an agreement to limit nuclear arms.

The real question is not whether most Americans disagree with Trump’s “America First” national security policy but, rather, what they are willing to do about it.

 

[This is a revised version of an article published by Foreign Policy in Focus on April 25, 2019.]

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171862 https://historynewsnetwork.org/article/171862 0
Showing the Data: The Political Uses of the Past Browser

The Political Uses of the Past Project collects statements by federal elected and appointed officials, and has long had a goal of making the collection accessible. The table below is a first step.

Each row represents a statement that makes use of the past. The table can be filtered, searched, and sorted. Clicking on a row will show the entire statement in a box underneath the table. The table was originally going to stand alone, but I wanted to provide some sort of visual overview, and that led me to create the tag plot. This feature provides a window into the collection, and any subset of the collection that users create through searching and sorting. (more below…)

A wider version of the table can be viewed here.

The plot shows tags for all the statements in the filtered set. Larger type and a higher position reflect frequency (please note that the y axis is set to a log-10 scale to make the lower half of the plot easier to read). Color and left-right position show whether the tag appears more often with one party or another.

The x axis is based on a simple index. A value of -1 means the tag only appears in statements by Democrats (in the filtered set), and a value of 1 means the tag only appears with Republican statements. A value of zero means it’s an even split. Please note that I included both independents in Congress with the Democrats because they caucus with them (and this shortcut saved me many headaches).
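
To make the mechanics concrete, here is a minimal sketch, in Python, of how coordinates like these could be computed from tag counts. This is not the project’s actual code: the sample counts, the function name, and the exact form of the index, (R - D) / (R + D), are my own illustrative assumptions; the text above specifies only the index’s endpoints and the log-10 y axis.

    from collections import Counter
    import math

    # Hypothetical tag counts per party for a filtered set of statements.
    dem_counts = Counter({"Lincoln": 12, "slavery": 9, "founders": 7})
    rep_counts = Counter({"states": 10, "Soviet Union": 4, "founders": 8})

    def tag_plot_coordinates(dem, rep):
        # Returns {tag: (x, y)}. x is a party-lean index in [-1, 1]:
        # -1 = Democrats only, +1 = Republicans only, 0 = an even split.
        # y is the tag's total frequency on a log-10 scale, as on the plot's y axis.
        coords = {}
        for tag in set(dem) | set(rep):
            d, r = dem[tag], rep[tag]   # Counter returns 0 for missing tags
            x = (r - d) / (r + d)
            y = math.log10(d + r)
            coords[tag] = (x, y)
        return coords

    print(tag_plot_coordinates(dem_counts, rep_counts))

Run on these made-up counts, “founders” lands near zero on the x axis, matching the roughly even split described in the example below, while “Lincoln” and “states” sit at the Democratic and Republican extremes.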

To take an example, the following plot showed up after filtering the statement tags on "voting" on April 28, 2019. Most of the statements come from the debate on HR 1, the Democrats' We the People Act.

When Democrats made historical references while discussing voting, they referenced Lincoln, racism, slavery, and Martin Luther King Jr. Several Republican statements in this dataset referenced an alleged historic primacy of states in the election process. Others referenced the Soviet Union. Rep. Andy Barr (R-KY), for example, insisted that the We the People Act would "Stalinize" American elections. Both parties made reference to the founders in about equal measure.

The Political Uses of the Past Project is collecting these statements to discover patterns and develop insights into how views of the past shape policy. With this searchable table, anyone can do the same. But I had some other uses in mind as well.

  • Historians interested in correcting the record can search the data for statements in their area of expertise. This project is undertaking some fact-checking of these statements (examples here and here), but will never keep up with the volume.
  • Anyone writing on current policy or politics can use the statements to find quotes and ideas on how the past is shaping contemporary debates.
  • Teachers of history, civics, or political science can mine this list for inspiration or source material, or they can point their students to this browser for ideas or assignments.
  • Anyone who is tired of hearing how the study of history doesn’t matter can send those detractors here!

Of course, we'd love to hear about any applications of this table or its data; please contact the project here if you've found it useful (or if you notice any bugs). There is more information about the data and search tools on the browser's dedicated page, here. All suggestions and feedback welcome!

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/blog/154206 https://historynewsnetwork.org/blog/154206 0
6 Presidents Who Never Lost an Election

Americans are used to seeing people who become president as winners. After all, they have defeated an opponent, sometimes decisively, sometimes narrowly, to reach the highest office in the land. 

However, few American presidents reach that point without experiencing electoral defeat at some point in their careers. In fact, of America’s 44 presidents, only six have won every election they have contested. 

Thirteen presidents lost the presidency or a presidential nomination on the way to ultimate victory: Thomas Jefferson, James Monroe, Andrew Jackson, William Henry Harrison, John Tyler, James Buchanan, Abraham Lincoln, Andrew Johnson, Grover Cleveland, Lyndon B. Johnson, Richard Nixon, Ronald Reagan, and George H. W. Bush. Eighteen presidents have lost the presidency after first winning the position (John Adams, John Quincy Adams, Martin Van Buren, John Tyler, Millard Fillmore, Franklin Pierce, Andrew Johnson, Ulysses S. Grant, Rutherford Hayes, Chester Alan Arthur, Grover Cleveland, Benjamin Harrison, Theodore Roosevelt, William Howard Taft, Herbert Hoover, Gerald Ford, Jimmy Carter, George H. W. Bush). Tyler, Pierce, Andrew Johnson, Hayes, and Arthur were denied re-nomination by their party and failed in any alternative bids they pursued, while Van Buren, Fillmore, and Theodore Roosevelt lost third-party bids. Grant was denied an opportunity to come back four years after his presidency at his own Republican Party's convention in 1880.

Two presidents (Lincoln and George H. W. Bush) lost Senate races on the way to the presidency, while Lyndon B. Johnson lost his first Senate race before winning his second by a margin of 87 votes statewide in Texas. Others have lost races for the House of Representatives, state governorships, state legislatures, or even more local offices, such as a school board. In the case of the two Roosevelts, Theodore Roosevelt lost the New York City mayoralty election in 1886 and Franklin D. Roosevelt lost the vice presidency in the 1920 presidential election. John F. Kennedy lost the open battle for the vice-presidential nomination at the 1956 Democratic National Convention. And Calvin Coolidge lost a race for the Northampton, Massachusetts School Board in 1904.

Of the six successful presidents, three shared a common career path. George Washington, Zachary Taylor, and Dwight D. Eisenhower all had notable military careers and never sought office other than the presidency. It is also worth noting that Washington and Eisenhower easily won re-election. Taylor, on the other hand, died shortly after taking office. 

Illness and death also cut short the careers of two other winning presidents. James A. Garfield, a “dark horse” nominee in 1880, never lost the House seat he first won in 1862. The Ohio State Legislature elected him to the Senate in 1880 but he never served since he also became the Republican presidential candidate that same year.  Sadly, he was assassinated shortly after taking office.

Woodrow Wilson’s electoral record was even thinner. Following a single two-year term as governor of New Jersey, he won the 1912 presidential election in a three-way contest with Republican candidate William Howard Taft and former president Theodore Roosevelt who was running on a third-party ticket. Although he was re-elected in 1916, Wilson’s second term was cut short by a paralytic stroke that left him totally unable to govern in the last seventeen months of his second term, with his wife conducting cabinet meetings and keeping Vice President Thomas Marshall in the dark on Wilson’s true health condition until the end of the term in 1921, the longest period of incapacity of any President.

Donald Trump is the outlier in the group. He is the only presidential candidate, other than Taylor, without previous electoral experience to run in a single election and win. Should he run and win again in 2020, he will have won every election he contested.   

So the road to the presidency has seen most of its occupants experience the agony of defeat, but also the later joy of victory, and often a later repudiation that sobers their self-image. Defeat at some point is the common experience.

]]>
Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171863 https://historynewsnetwork.org/article/171863 0
The Upstart Press Mogul Who Changed How We Understand the World

 

Americans are so busy arguing over who is more biased, Fox News or MSNBC, or whether President Trump will win or lose his fiery war with the nation’s press, that we forget the story of the bold, brazen, audacious, in-your-face media mogul who changed our world and made us look at newspapers and television in an entirely different way – Australian upstart Rupert Murdoch.

Murdoch, the son of an Australian newspaper editor, burst onto the media scene in his early twenties, just after his father died. In 1952, he formed his own publishing company and began to gobble up newspapers in Australia, the United Kingdom and the United States. Over the years, he assumed ownership of nearly 800 different media outlets, including television’s Fox News, the New York Post, the book publisher HarperCollins and the Wall Street Journal in America. His style of journalism was wildly different from that of most news outlets, although many soon followed in his footsteps. Murdoch’s view of the media world was loud, big and bodacious. Every story needed a BIG headline, lots of subheads, large photos, the racier the better, and stories of politicians, celebrities, sports figures and anybody and everybody involved in a scandal. Sex scandals, of course, were the best. He excerpted numerous books, too, the sexier the better, such as The Sensuous Woman.

His rather incredible story, which crossed three continents and fills our homes today, is being told in a solid new play, Ink, by James Graham, which just opened in New York at the Samuel J. Friedman Theater on W. 47th Street. It is, like its subject, loud, big and brawling. It is a deep, detailed and mesmerizing look at how the media operated from 1969 to the current day and a marvelous history lesson on the ever-changing role of the media in public life in the UK and in America. It is also a searing look at Murdoch, either loved or hated by Americans of all stripes.

I wanted to see Ink because for 23 years I was a reporter at the New York Daily News, the rival of Murdoch’s New York Post. We did battle with him, big headline vs. big headline and large photo vs. large photo, our movie stars against his movie stars, every day, and I remember those fights well. I wanted a chance to get this behind-the-scenes look at the man with whom we jousted from sunup to sundown.

The play is a nice look at what makes media moguls like Murdoch tick. You must remember that nobody in the 1960s was as bold, or as wildly different, as Murdoch (there had been similarly sensational newspapers, such as the New York Graphic, but those were way back in the 1920s). He had his product giveaway contests, celebrity interviews, scary crime stories, one after the other, and photos of girl after girl, wearing as little as possible.

His partner in the play, and in real life, was editor Larry Lamb, who was as outrageous as Murdoch. Lamb was determined to increase the circulation of the Sun, day by day, until it was the number one seller in all of the UK. And he did. The play is about Murdoch, but Lamb is its centerpiece. 

Murdoch’s New York Post, like all of his papers, featured lots of bombastic headlines. My favorite was that above the story of a woman murdered and decapitated, her body left in a strip club. The headline was HEADLESS BODY IN TOPLESS BAR.

The plot of the story is simple. Young Murdoch arrives in England with lots of money and buys the Sun, a struggling tabloid. He remakes it and turns it into a sensational, populist newspaper. He hires taskmaster editor Larry Lamb and they drive people as hard as they can to make the Sun unique. He is hooted and jeered, but never loses his way and becomes very successful.

There are some great lines in the play. At one point, Lamb says the paper is getting ugly and Murdoch, glee in his voice, says that “ugly is an art form.” Someone says to Murdoch that the Sun and Mirror are like David and Goliath. “We all know how that turned out,” Murdoch answers with a sneer.

Ink is a delightful, if cheerleaderish, look at a hard-driving journalist who wants to conquer the media world, and does. It is great fun in many spots, full of energy and pounding songs, and a sobering look at the press, and its effect on the world, in others. It is driven by two mercurial and extremely gifted actors, Bertie Carvel as Murdoch and Jonny Lee Miller as Larry Lamb. Other fine performances, smoothly directed by Rupert Goold, are turned in by Colin McPhillamy as Sir Alick McKay and Rana Roy as Stephanie Rahn. The ensemble is full of talented performers.

There is a tremendous amount of media history in the play. You learn of the seismic shift in news coverage, and its style, brought about by Murdoch. You see how the media operates behind the scenes, including an eye-opening look at exactly how the plates of each page are put together and, in the end, hammered tight into a collar. Nothing is spared to let you see how the newspaper media operated in the 1960s and ‘70s. 

On the negative side, the play is very long, nearly three hours. It drives its sensationalist and populist lesson home, but does it too often. It has a lot to do with England and little to do with the United States. It never explains how powerful Murdoch became. He owned television stations all over the world, and even above the world with Sky News Network, the satellite channel. He owned hundreds of newspapers, magazines and book publishing companies. He was close to numerous British prime ministers, a buddy of Margaret Thatcher, Tony Blair and David Cameron. He was married several times and had sons as media mad as he is. Murdoch was a bit like America’s William Randolph Hearst, a Citizen Kane of the media world, as Hearst was portrayed in that famous film. The play never gets into all of the animosity towards Murdoch over the years and all of his legal battles and disputes with government regulatory agencies, either.

Murdoch would love this play, though – INK TOPPLES ALL OTHER MEDIA!!! – with a really big photo.

PRODUCTION:  Scenic design and costumes: Bunny Christie, Lighting: Neil Austin, Music: Adam Cork, Projection Design: Jon Briscoll, Choreography: Lynn Page.  The play is directed by Rupert Goold. It runs through June 23.

Beware Anew the Military-Industrial Complex: Revisiting Eisenhower's Warning

 

“The only thing we learn from history is that we learn nothing from history.”

“What experience and history teaches us is that people and governments have never learned anything from history, or acted on principles deduced from it.”

  – G.W.F. Hegel

 

 

The military-industrial complex, a term brought to life by President Dwight D. Eisenhower in his 1961 farewell address to the nation, is widely acknowledged, quoted, and even embraced today. Yet, ironically, the ubiquity of this embrace hasn’t diminished the outsized influence of that complex.

 

“In the councils of government,” Eisenhower warned, “we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex [‘military-industrial-congressional complex’ in the original text]. The potential for the disastrous rise of misplaced power exists and will persist.” 

 

He had prefaced these words in the same address by observing: “The total influence [of the then-new conjunction of an immense military establishment and a large arms industry] – economic, political, even spiritual – is felt in every city, every Statehouse, every office of the Federal government. We recognize the imperative need for this development. Yet, we must not fail to comprehend its grave implications.” 

 

If we were blessed with an ideal state of civil-military relations in this country, its distinguishing characteristics would include – in addition to a strategically effective military and strategically competent civilian overseers – what we might call a properly subordinated military-industrial complex. This complex would, accordingly, be subordinated to and supportive of national interests, aims, and responses. Quite the opposite has been the case, though, throughout the almost six decades since Eisenhower alerted the American people to the centrality of military industries to civilian life. Today, it is a mammoth, strategically distorting, even strategically dysfunctional confluence of political, ideological and, yes, economic interests that warrants renewed attention, if not alarm. This is especially important as the military-industrial complex has been given new, unfettered life and license by the 2018 National Defense Strategy (NDS) document, now in place, which has assumed the de facto position of representing America’s current strategic posture.

 

Roughly 13% of the U.S. federal budget now goes to private-sector contractors, 63% of those contracts being for defense. Some 51% of defense contracts are for products, 41% for services, the rest for R&D. Defense contracts represent 52% of overall defense spending. In FY 2018 alone, the Defense Department awarded over $358 billion in contracts. The top 10 defense contractors – Lockheed Martin, Raytheon, Northrop Grumman, Boeing, General Dynamics, United Technologies, L3 Technologies, Huntington Ingalls, Leidos, Booz Allen Hamilton – collectively receive some $167 billion a year in defense revenues. That amount exceeds the GDPs of more than 130 of the world’s countries. Lockheed’s defense revenues alone – $48 billion – exceed the annual military expenditures of all but six countries, while numbers 2 and 3 – Raytheon and Northrop Grumman – each receive over $20 billion in defense revenues, thereby exceeding the military expenditures of all but 13 countries. 

 

To be fair, economically speaking, overall defense expenditures represent 3.5% of U.S. GDP; the defense sector provides over 4 million jobs; and roughly 10% of the $2.2 trillion in U.S. factory output goes to arms production. These are not inconsequential considerations for the politically minded and politically motivated among us. At the same time, the United States claims the dubious honor of ranking first in the world in international arms transfers, commanding 36% of the global market. This also isn’t inconsequential for the strategically minded among us who recognize the potentially destabilizing, arms-race-inducing effects of such transactions.

 

Because Eisenhower’s warning conveyed concern about the well-being of democracy, it is especially important to note that the defense sector of industry assiduously exercises its First Amendment rights through lobbying and campaign contributions: $128 million spent on the former, $30 million on the latter in the 2017-18 federal campaign cycle alone. Since 1990, the defense sector has accounted for nearly $200 million in campaign contributions – dwarfed, in comparison, by other sectors, but nonetheless more than mere “beanbag,” as they say.

 

Eisenhower’s April 1953 “Chance for Peace” speech to the American Society of Newspaper Editors, delivered almost immediately after he took office, is even more telling than his 1961 farewell address. Also broadcast nationwide, it is worth quoting at some length for the examples it affords:

 

Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed. 

 

This world in arms is not spending money alone.

 

It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children.

 

The cost of one modern heavy bomber is this: a modern brick school in more than 30 cities.

 

It is two electric power plants, each serving a town of 60,000 population. 

 

It is two fine, fully equipped hospitals. It is some 50 miles of concrete highway.

 

We pay for a single fighter plane with a half million bushels of wheat.

 

We pay for a single destroyer with new homes that could have housed more than 8,000 people.

 

This, I repeat, is the best way of life to be found on the road the world has been taking.  

 

This is not a way of life at all, in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron.

 

Of course, that was then, when a destroyer cost something on the order of $6 million apiece, a bomber $2.4 million, a fighter jet $211,000. This is now, when an Arleigh Burke-class destroyer runs $1.2 billion (and a Zumwalt-class destroyer $4.5 billion), a B-21 stealth bomber $564 million, an F-35 Joint Strike Fighter $141 million. Thanks to the National Priorities Project, we can see what some selected tradeoffs akin to those Eisenhower offered might look like:

 

For the $19.95 billion we are paying for nuclear weapons and associated costs, we could create, for example, 359,099 infrastructure jobs for a year, or pay 246,838 elementary school teachers for a year.

 

For the $11.45 billion we are paying for the F-35 Joint Strike Fighter, we could provide, for example, 1.11 million military veterans VA medical care for a year, or 344,676 scholarships for university students for four years.

 

For the $1.51 billion we are paying for Predator and Reaper drones, we could provide, for example, 638,124 children or 424,963 adults low-income healthcare for a year.

 

These aren’t just incongruent apples-and-oranges tradeoffs; they are strategic tradeoffs. Those, most notably, who either produced or zealously support the NDS consider such tradeoffs anything but strategic, precisely because the types of domestic spending alternatives offered aren’t militarily relevant. 

 

The NDS is a retro, militaristic call for a self-reaffirming, self-serving, self-fulfilling New Cold War that implicitly bows to and embraces a dominant and domineering military-industrial complex. The NDS claims that (a) the world we face today is determined by Great Power competition, defined in predominantly military terms, in which “revisionist powers” China and Russia seek to unseat us; (b) our supremacy in every domain of warfare – air, land, sea, space, and cyberspace – is now contested and begs restoration; (c) peace (through strength) is achievable primarily, if not solely, by being prepared for war; and (d) the primary line of effort for carrying out this “strategy” is heightened lethality. 

 

The NDS would have us believe that the preferred vehicle for restoring America’s deserved primacy in all the aforementioned domains of warfare is the technological superiority provided by what is now labeled the “National Security Innovation Base.”  Such labeling seemingly implies that national security and defense are essentially synonymous, and that a future-oriented Innovation Base is somehow different (at least rhetorically) from a backward-looking Defense Industrial Base. Irony again intrudes here, by the way, since mobilization is implicitly given new life, but mobilization in the most parochial, World War II, giant-on-off-switch terms. 

 

For the various parties that make up the military-industrial complex, the New Cold War ideology put forth in the NDS is a boon of inestimable consequence, an incestuously preserved warfighting profiteer’s dream. It is also a tacit guarantee that the defense industry, not government of, by, and for the people, will continue to call shots (shots most of us don’t even acknowledge it calls) on strategic priorities and commitments, military doctrine, the perversely irreversible American Way of War, technology, force structure and disposition, and manpower requirements. And, lest we forget, there is the massive international traffic in conventional arms that ensures the perpetuation and expansion of the arms industry (to “keep the industrial base warm,” of course), even as it feeds provocation, escalation, and destabilization abroad.

 

Though it may seem hyperbolic, even alarmist, one is tempted to harken back to the post-World War I period, when soul-searchingly pejorative “merchants of death” rhetoric was in vogue. With irony again our guide for the moment, one of the most outspoken critics of war profiteering was Marine Major General Smedley Butler, a two-time Congressional Medal of Honor recipient who had spent his 34-year career in uniform dutifully fighting various colonial wars at the turn of the 20th century. His highly publicized 1935 speech/short book War is a Racket spoke bluntly in terms worth remembering today: 

 

War is a racket. It always has been. It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives. A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small “inside” group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes.

 

Let us now, if we are to restore or establish strategic sanity in this post-post-Cold War era, revisit these words and those of President Eisenhower. We – the taxpayers, the citizenry, the source of citizen-soldiers, the repository of popular sovereignty – owe ourselves, our progeny, and our future nothing less.

The Notre-Dame Fire and Digital Preservation

 

The Notre-Dame fire was a tragedy. Millions mourned the fire’s impact by sharing their memories, posting images of time spent there. Others expressed regret at never having visited, or at having walked past her every day, taking the Cathedral’s presence for granted. This fire reminds us that even the most iconic representations of our history and culture are fragile. Our treasures can be gone in an instant.

 

Decades ago, the only way to experience a treasure lost to time might be through the types of photos and writings being shared this week. Today, we are far more fortunate. Through the power of technology, we are able to preserve our heritage – books and manuscripts, artworks, performances, architectural structures and sites – in remarkable ways. Some might argue that there is no substitute for visiting an historic site like Notre-Dame in person, but if you have not tried a virtual reality visit to a historical site, you should. We now have the capability to capture these places digitally and to experience and study them in deep, meaningful ways.

 

Years ago I had the opportunity to work with a professor of photogrammetry at the University of Cape Town, Dr. Heinz Ruther. Ruther spent thirty years traveling throughout Africa and around the world to capture UNESCO sites in painstaking detail. My role was to help bring those materials online for others to use as part of a larger initiative led by the non-profit Aluka, to make content from and about Africa accessible for research. The result of Ruther’s work and our collaboration is the ability for researchers to now virtually visit and navigate through many World Heritage sites like Kilwa, Tanzania and Lalibela, Ethiopia. Historians and others studying these sites have now accessed these digital site replicas millions of times. Some may be using them to supplement research trips to the actual site, but most will never travel there. 

 

Technology is enabling access in ways unimaginable in the past, but it is also an essential tool in preservation. One does not have to dig very deep in the imagination to envisage situations where sites like these could be destroyed and these replicas become the only way to experience them. In fact, we worked with a consortium of private libraries, the Andrew W. Mellon Foundation, and Northwestern University to digitize Arabic manuscripts from Timbuktu, Mali. Years later, turmoil in the region brought the manuscripts, their whereabouts, and the comprehensiveness of the collection into question. While the outcome could have been far worse (many manuscripts survived), we were fortunate to have captured these manuscripts digitally to protect against their vanishing forever. This is not always, or even often, the case. 

 

Amidst the sadness surrounding the Notre-Dame fire, I learned that Notre-Dame was captured by Vassar College art historian Andrew Tallon using 3-D imagery. The late Professor Tallon’s images could be used to help in the Cathedral’s restoration. Good news for sure, but I hope that rather than relief we feel a sense of urgency. We have the technology in hand to preserve what is most precious to our past. Our challenge is to make this a priority. 

Racism’s Longue Durée: Why the Citizens’ Councils Matter Now

 

How does racism survive?  This is one of the questions taken up by Stephanie Rolph in her new book, Resisting Equality: The Citizens’ Council, 1954-1989.  Born in Mississippi, the Councils played a central role in massive resistance to Brown v. Board of Education, eschewing the terrorist tactics of the Ku Klux Klan for more subtle strategies aimed at thwarting black activism, including economic pressure and pseudo-scientific racist propaganda.  For many, the Councils embodied a type of “uptown” or “white-collar” Klan, a point underscored by Harper Lee when she made Atticus Finch a member in her novel Go Set a Watchman.  Of course, the Councils failed to stop the civil rights movement, leading historian Neil R. McMillen to describe them as “a poignant, perhaps even pitiable, symbol” of those few Americans “unwilling to pay more than lip service to the nation’s equalitarian ideals.”  

That was 1971.  

Today, racism seems to be on the rise – again.  And the Councils might explain why.  Building on McMillen’s landmark study, Rolph brings the group’s story forward, into the 1980s, and shows how the Councils went from a grassroots organization focused on racial intimidation to a much more specialized type of racist think tank, an organization that warehoused and distributed racialist views long after such views had been discredited by the federal government, the mainstream media, and the academy.  In this new guise, argues Rolph, the Councils began to occupy a strategic space in radical far right circles not just in the South, but across the United States and the world, including embattled white enclaves in Rhodesia (now Zimbabwe) and South Africa.    

What ensues is a fascinating meditation on the survival of racial thinking, thanks in part to the Councils’ refusal to adapt their views.  While many white southerners muted their racial sentiments and assumed a more anodyne “sunbelt conservatism,” as Matthew Lassiter, Joseph Crespino, and Kevin Kruse have shown, the Councils remained defiantly racist, like a stubborn rock formation resistant to erosion, providing simplistic, race-based explanations for complex social problems like crime, joblessness, and urban disorder.  

Of course, historians like Tom Sugrue, Richard Rothstein, and James Forman, Jr. have all shown that urban disorder stemmed from a host of complex, intersecting forces, including southern migration, deindustrialization, white flight, red-lining, suburbanization, and aggressive law enforcement, but the Citizens’ Councils kept it simple.  As Rolph demonstrates, the Councils explained black poverty and joblessness as a simple factor of racial difference, “a predictable outcome of the actions of a race of people biologically incapable of self-regulation.” (p. 173)

For the uneducated and uninformed – i.e., those who have not taken a seminar in urban history – this was, and remains, an appealing idea, a unified theory of American society grounded in the perceptible, physical structures of that society, its crumbling cities, its pristine suburbs, its urban blight, its suburban bloom.  To borrow from Fernand Braudel, it is a way of thinking that survives over the long term, or “longue durée,” not because racists are inherently bad, but because it provides an easy explanation for complicated realities.   

And herein lies a stubborn irony.  Before the civil rights victories of the 1960s, African Americans could point to concrete examples of racial discrimination that were hard, even for racial conservatives like Harry Truman, to ignore.  Lynching in the South provided an example, as did segregated schools, segregated buses, segregated lunch counters, and a host of other explicitly racist institutions and policies.  Today, however, racial oppression is more subtle, less visible, and – ultimately – harder to discern, a topic for advanced seminars, a subject of advanced study.  

But what of those who don’t study?  For them, racism is a theory that fits in a tweet, a short explanation for the longue durée. 

 

To read more from Anders Walker, check out his most recent book.

Why FDR Wouldn't Condemn Hitler

This editorial cartoon by Jerry Doyle, published in the Philadelphia Record on April 22, 1939, contributed to the erroneous perception among some Americans that the people of Danzig were opposed to Hitler. In fact, election results in Danzig demonstrated overwhelming support for the Nazis.

 

“Danzig is a German city and wishes to belong to Germany!”

With that declaration eighty years ago this week, Adolf Hitler once again threw down the gauntlet to the international community. No other country had interfered when Nazi Germany illegally remilitarized the Rhineland in 1936, annexed Austria in 1938, and gobbled up Czechoslovakia in 1938-39. So now Hitler set his sights on his next target: the city-state of Danzig.

Situated strategically on the coast of northwestern Poland but inhabited overwhelmingly by ethnic Germans, Danzig had gone back and forth between German and Polish rule over the centuries. The Versailles Treaty after World War One established it as a “Free City” under the control of the League of Nations.

As Nazism rose in Germany in the late 1920s and early 1930s, so too did it gain in popularity in Danzig. The city’s Nazi party went from winning one seat in the Danzig parliament in the elections of 1927 to twelve (out of 72) in 1930, then 38 in 1933, giving it a majority.

But Hitler did not act immediately. In the mid and late 1930s, the Nazis were still in the process of re-arming and testing the West’s responses to their actions. The failure of the international community to challenge Hitler over the Rhineland or Austria sent a clear message. That was followed by the sacrifice of Czechoslovakia, in the 1938 Munich Agreement. Then came Hitler’s announcement to the Reichstag on April 28, 1939, demanding the surrender of Danzig along with a land corridor leading to it. 

Reporters were keen to learn how President Franklin D. Roosevelt would respond to this latest, blatant challenge by the Nazi leader to the authority of the League of Nations. FDR, however, was not too keen to comment.

On April 29, the New York Times reported:  “Anticipating the nature of Herr Hitler’s address and the barrage of questions on his reaction to it that would have been inevitable under the circumstances, the President late yesterday had canceled his usual Friday press conference.”

The Times added that during President Roosevelt’s meeting with the prince and princess of Norway that day, a conversation was overheard in which the president was asked what he thought of Hitler’s Danzig threat. FDR reportedly responded, 

“How can any one have a reaction to a speech that lasts more than two hours?” And then: “Six o’clock in the morning is rather early, don’t you think?”

The next day, April 30, the president spoke at the opening of the World’s Fair in New York City. In his first public remarks since the Hitler speech, FDR spoke vaguely of the need for “peace and good-will among all the nations of the world,” but made no mention of the Nazi leader or the fate of Danzig.

Finally, on May 2, the president held a regularly scheduled news conference, at which point there was no way to avoid questions about his reaction to Hitler’s threat. Here’s how the exchange went:

Q: Have you seen the full text of the Hitler speech yet?

FDR: What?

Q: Hitler’s speech?

FDR: Only the one that came out in the papers. Probably the State Department is still translating it.

Q: It takes a while, I imagine.

FDR: Do you suppose that the text was handed to them, translated into English in Berlin?

Q: Yes, sir; one of the stories said it was handed to them in an English translation.

FDR: Was it?

Q: Official translation. The English translation was flown to London, I saw in one story.

FDR: Well, the State Department was doing its regular translating for what they had taken down on the verbal stuff. I don’t know how much he followed the text. As you know, sometimes I do not stick to the text.

 

President Roosevelt is best remembered for leading America towards military preparedness  and, later, in the war against Nazi Germany—yet he was remarkably reluctant to even verbally criticize Hitler in the 1930s.

Throughout the pre-war period, FDR strove to maintain cordial diplomatic and economic relations with Nazi Germany. He sent Secretary of Commerce Daniel Roper to speak at a German-American rally in New York City in 1933, where the featured speaker was the Nazi ambassador to Washington, and a large swastika flag was displayed on stage. The president allowed U.S. diplomats to attend the mass Nazi Party rally in Nuremberg in 1937, and his administration helped the Nazis evade the American Jewish community’s boycott of German goods in the 1930s by permitting the Nazis to deceptively label their goods with the city or province of origin, instead of “Made in Germany.”

Despite the intensifying anti-Jewish persecution in Germany in the 1930s, Roosevelt not only refused to criticize the Hitler government, but he personally removed critical references to Hitler from at least three planned speeches by Interior Secretary Harold Ickes in 1935 and 1938. Even Roosevelt’s criticism of the infamous Kristallnacht pogrom—a public statement which has often been cited as proof of the president’s willingness to denounce the Nazis—did not contain a single explicit mention of Hitler, Nazism, or the Jews.

Roosevelt said nothing about Hitler’s action in the Rhineland (1936); applauded the Munich agreement, which handed western Czechoslovakia to the Nazis (1938); and, eighty years ago this week, ducked reporters’ questions rather than utter a single critical word regarding Hitler’s threat to Danzig.

FDR was, of course, saddled with the burden of a largely isolationist public and Congress. He was understandably reluctant to be seen as doing anything that might seem to edge America close to war with Germany. Yet a president’s job is to lead, not to follow. A few words from the White House directly taking issue with Hitler’s aggressive actions and persecution of the Jews could have helped alert the public to the Nazi danger. 

Explaining President Roosevelt’s refusal to comment on Hitler’s remilitarization of the Rhineland in 1936, the diplomatic correspondent of the Washington Evening Standard reported that the president “is determined not to take sides under any circumstances.” But there are circumstances when, even if it is unpopular, a president needs to publicly “take sides”—to take the side of good against the side of evil.

A stronger response from President Roosevelt over Danzig or the earlier crises also would have indicated to Hitler that there might be consequences for his actions—something that was particularly important in the early and mid 1930s, when the Nazi leader was still testing the waters. 

“It is not trade but empire that is Hitler’s goal,” a New York Times editorial acknowledged following the Danzig speech. “How far he will go and how fast he will go toward acquiring it will depend solely upon how much opposition is offered him.” 

FDR’s non-response to Danzig sent Hitler exactly the wrong message.

Breaking the Grip of Militarism: The Story of Vieques

 

Vieques is a small Puerto Rican island with some 9,000 inhabitants.  Fringed by palm trees and lovely beaches, with the world’s brightest bioluminescent bay and wild horses roaming everywhere, it attracts substantial numbers of tourists.  But, for about six decades, Vieques served as a bombing range, military training site, and storage depot for the U.S. Navy, until its outraged residents, driven to distraction, rescued their homeland from the grip of militarism.

Like the main island of Puerto Rico, Vieques—located eight miles to the east―was ruled for centuries as a colony by Spain, until the Spanish-American War of 1898 turned Puerto Rico into an informal colony (a “nonsovereign territory”) of the United States.  In 1917, Puerto Ricans (including the Viequenses) became U.S. citizens, although they lacked the right to vote for their governor until 1947 and today continue to lack the right to representation in the U.S. Congress or to vote for the U.S. president.

During World War II, the U.S. government, anxious about the security of the Caribbean region and the Panama Canal, expropriated large portions of land in eastern Puerto Rico and on Vieques to build a mammoth Roosevelt Roads Naval Station.  This included about two-thirds of the land on Vieques.  As a result, thousands of Viequenses were evicted from their homes and deposited in razed sugar cane fields that the navy declared “resettlement tracts.”

The U.S. Navy takeover of Vieques accelerated in 1947, when it designated Roosevelt Roads as a naval training installation and storage depot and began utilizing the island for firing practice and amphibious landings by tens of thousands of sailors and marines.  Expanding its expropriation to three-quarters of Vieques, the navy used the western section for its ammunition storage and the eastern section for its bombing and war games, while sandwiching the native population into the small strip of land separating them.

Over the ensuing decades, the navy bombed Vieques from the air, land, and sea.  During the 1980s and 1990s, it unleashed an average of 1,464 tons of bombs every year on the island and conducted military training exercises averaging 180 days per year. In 1998 alone, the navy dropped 23,000 bombs on Vieques.  It also used the island for tests of biological weapons.

Naturally, for the Viequenses, this military domination created a nightmarish existence.  Driven from their homes and with their traditional economy in tatters, they experienced the horrors of nearby bombardment.  “When the wind came from the east, it brought smoke and piles of dust from their bombing ranges,” one resident recalled.  “They’d bomb every day, from 5 am until 6 pm.  It felt like a war zone.  You’d hear . . . eight or nine bombs, and your house would shudder. Everything on your walls, your picture frames, your decorations, mirrors, would fall on the floor and break,” and “your cement house would start cracking.”  In addition, with the release of toxic chemicals into the soil, water, and air, the population began to suffer from dramatically higher rates of cancer and other illnesses.

Eventually, the U.S. Navy determined the fate of the entire island, including the nautical routes, flight paths, aquifers, and zoning laws in the remaining civilian territory, where the residents lived under constant threat of eviction. In 1961, the navy actually drafted a secret plan to remove the entire civilian population from Vieques, with even the dead slated to be dug up from their graves.  But Puerto Rican Governor Luis Munoz Marin intervened, and U.S. President John F. Kennedy blocked the Navy from implementing the plan.

Long-simmering tensions between the Viequenses and the navy boiled over from 1978 to 1983. In the midst of heightened U.S. naval bombing and stepped up military maneuvers, a vigorous local resistance movement emerged, led by the island’s fishermen.  Activists engaged in picketing, demonstrations, and civil disobedience―most dramatically, by placing themselves directly in the line of missile fire, thereby disrupting military exercises.  As the treatment of the islanders became an international scandal, the U.S. Congress held hearings on the matter in 1980 and recommended that the navy leave Vieques.

But this first wave of popular protest, involving thousands of Viequenses and their supporters throughout Puerto Rico and the United States, failed to dislodge the navy from the island.  In the midst of the Cold War, the U.S. military clung tenaciously to its operations on Vieques.  Also, the prominence in the resistance campaign of Puerto Rican nationalists, with accompanying sectarianism, limited the movement’s appeal.

In the 1990s, however, a more broadly-based resistance movement took shape.  Begun in 1993 by the Committee for the Rescue and Development of Vieques, it accelerated in opposition to navy plans for the installation of an intrusive radar system and took off after April 19, 1999, when a U.S. navy pilot accidentally dropped two 500-pound bombs on an allegedly safe area, killing a Viequense civilian.  “That shook the consciousness of the people of Vieques and Puerto Ricans at large like no other event,” recalled Robert Rabin, a key leader of the uprising. “Almost immediately we had unity across ideological, political, religious, and geographic boundaries.”

Rallying behind the demand of Peace for Vieques, this massive social upheaval drew heavily upon the Catholic and Protestant churches, as well as upon the labor movement, celebrities, women, university students, the elderly, and veteran activists.  Hundreds of thousands of Puerto Ricans throughout Puerto Rico and the diaspora participated, with some 1,500 arrested for occupying the bombing range or for other acts of nonviolent civil disobedience.  When religious leaders called for a March for Peace in Vieques, some 150,000 protesters flooded the streets of San Juan in what was reportedly the largest demonstration in Puerto Rico’s history.

Facing this firestorm of protest, the U.S. government finally capitulated.  In 2003, the U.S. Navy not only halted the bombing, but shut down its Roosevelt Roads naval base and withdrew entirely from Vieques.

Despite this enormous victory for a people’s movement, Vieques continues to face severe challenges today.  These include unexploded ordnance and massive pollution from heavy metals and toxic chemicals that were released through the dropping of an estimated trillion tons of munitions, including depleted uranium, on the tiny island.  As a result, Vieques is now a major Superfund Site, with cancer and other disease rates substantially higher than in the rest of Puerto Rico. Also, with its traditional economy destroyed, the island suffers from widespread poverty.  

Nevertheless, the islanders, no longer hindered by military overlords, are grappling with these issues through imaginative reconstruction and development projects, including ecotourism.  Rabin, who served three jail terms (including one lasting six months) for his protest activities, now directs the Count Mirasol Fort―a facility that once served as a prison for unruly slaves and striking sugar cane workers, but now provides rooms for the Vieques Museum, community meetings and celebrations, historical archives, and Radio Vieques.

Of course, the successful struggle by the Viequenses to liberate their island from the burdens of militarism also provides a source of hope for people around the world.  This includes the people in the rest of the United States, who continue to pay a heavy economic and human price for their government’s extensive war preparations and endless wars.

Bangladesh Prime Minister Hasina's War on Yunus and America

Two popular myths have been swirling around the world as to why Prime Minister Sheikh Hasina of Bangladesh declared war on Nobel prize winner Muhammad Yunus.

First, Yunus conspired with the powerful military to exile the nation's top two politicians, while prepping himself to enter politics; second, Hasina felt jealous because Yunus won the Nobel prize that she believed she deserved for her role in ending a decades-old tribal insurgency.

In fact, neither one was the direct cause of the Hasina-Yunus duel. The Awami League, the political party led by Hasina, put Yunus in the dock long before he made public his political ambitions or won the Nobel.

Abdul Jalil, general secretary of the Awami League, publicly refused to accept Yunus as non-party interim government chief to supervise parliament polls a month before he received the prize and five months before he revealed his intention to get involved in politics.

Jalil's comments came a day after Yunus told a civil-society forum in Dhaka, the Bangladeshi capital, that he would be pleased to be the chief adviser of the caretaker administration. Jalil not only dismissed the possibility, but also opposed the Grameen Bank founder's ideas to reform Bangladesh's governance.

 

Hasina Acted on India's Advice

The Awami League's bellicose attitude toward Yunus partly resulted from information Hasina received from India. New Delhi and Washington were on the same page regarding Bangladesh on almost every issue, except for one: India opposed giving the theocratic Jamaat e-Islami party political space in Bangladesh. But the United States feared that Jamaat might turn highly radical if it was pushed underground. Delhi was also worried that Washington wanted Yunus to replace Hasina, a staunch Indian ally.

Rumors were rife in Dhaka in December 2006 that America wanted neither Hasina nor former Prime Minister Khaleda Zia to win the election set for 2007. Mohan Kumar, joint secretary for Bangladesh at India's foreign office, told a U.S. diplomat in New Delhi that sources continued to report that the United States was positioning the 2006 Nobel prize winner to run in the election. He said Bangladeshi elite speculated that America "fixed" the award for the U.S.-trained economist.

According to Kumar, people in Dhaka suspected that the United States arranged for Yunus to win the prize to enhance his political credentials. Although Kumar did not subscribe to the allegations, he still wanted Washington to know that the rumors were alive in Dhaka. 

The United States and India had a common understanding on Bangladesh policy, but New Delhi was still concerned about Washington's "lack of conviction" regarding Jamaat's links to terror, Kumar said. He added that India "does not understand" the American view that entry into the political mainstream would moderate Jamaat. Further, he observed, America was biased toward the anti-India Bangladesh Nationalist Party, headed by Zia, and Jamaat. He noted that "the Bangladeshis are very aware of it." 

Hasina perceived Yunus to be a Zia supporter, even though Yunus denied having any links with any political party. Her suspicion deepened in 2006 when she learned that Yunus had been nominated by the BNP to be an adviser of the interim government.

 

Yunus' U.S.-Link Irked Hasina

Hasina found Yunus' close link with America problematic, too. She told a cabinet meeting that Yunus was engaged in a conspiracy to undermine her government with help from his American friend, former Secretary of State Hillary Clinton.

Yunus faced Hasina's direct wrath after she returned to power for the second time in early 2009. He had long desired a change in the law that gave the government control over the appointment of the bank chairman. The immediate-past interim regime had amended the ordinance to transfer the power to the bank's board of directors. But the constitution required that parliament approve the amendment.

He was clearly worried, because of her negative attitude toward him, that Hasina might oppose the parliamentary approval. So he sought assistance from her colleagues. Despite support from several cabinet members, the prime minister refused to ratify the change. Yunus, a naturalized American, then approached the United States for help.

On 10 May 2009, Yunus sought the U.S. ambassador's input on the best way to ask Hasina to reconsider her refusal. The envoy pledged to arrange a meeting with the prime minister for the beleaguered banker and to put in a good word to her for him. Yunus talked with Ambassador James F. Moriarty after a meeting with Clinton in Washington a month earlier, at which he had discussed his problem with the prime minister. Hasina ignored Moriarty's request and ousted Yunus in 2011 when she won a legal battle to kick out the micro-loan guru from the Grameen Bank.

 

Yunus' Threat Angered Hasina

Hasina was further miffed by Yunus when he said that 8.3 million Grameen Bank members – who represented forty million Bengalis, or twenty-five percent of the nation's total population, according to an estimate by Yunus – were not only citizens but also voters. This was a veiled threat that these voters could punch a mega hole in the Awami League's ballot box.

Her rage at Yunus stemmed also from the professor's public criticism of politicians as corrupt. Soon after winning the Nobel, Yunus said, “political leaders should give up revengeful politics and spiteful activities to offer a better political environment to the nation.” Yunus believed that the political system was hindering Bangladesh's progress. He offered an alternative agenda and announced his plan to start a political party, drawing a sharp public rebuke from Hasina.

The question, however, remained: if Yunus wanted both Hasina and Zia out of politics, why didn't Zia go against the Nobel laureate? Zia instead urged Washington and London to protect Yunus from Hasina's fury. Zia's sympathy for Yunus cemented Hasina's impression that the micro-finance guru was in bed with Zia, politically speaking, and doomed the banker. Yunus' hobnobbing with the military and flirting with the idea of entering the messy world of Bengali politics was the last straw that broke the camel's back.

Yet Another Assault on the Meaning of an Education

 

On Valentine’s Day the University of Tennessee at Martin offered students a little Valentine’s gift: a shortened path to a college degree and a law degree.  Known as the 3+3 Program, it allows students majoring in Political Science or English to opt for three years of undergraduate work and, with an appropriate LSAT score, proceed directly to UT Knoxville’s Law School.  On completing three years of law school, the student will then earn both his or her BA or BS and law degree. Most attractive of all, the program covers tuition for the first year of law school, which can be very expensive, with the various scholarships and loans that would have covered the fourth year of undergraduate education.

I know that on the surface some of our students will find this program very appealing.  I know that as an undergraduate I would certainly have grabbed at such an opportunity.  Nevertheless, I want to suggest that this program is detrimental to our students, to our school, and to the very meaning of the word education.

My department, the Department of History and Philosophy, has begun to discuss whether we should offer this same option to our History and Philosophy majors.  I recently sent other faculty members in my department an email opposing our joining this program and asking that we, as a department, register our opposition to the program and appeal to English and Political Science to reconsider their participation in the program.  I said in my email that I was opposed to the program for three principal reasons.

“First and foremost, this program, which substitutes the first year of law school for the final year of a student’s undergraduate education, deprives our students of the strongest possible grounding we can give them in the liberal arts and humanities.  Our students will miss not only two semesters of what our department offers them – four upper division history or philosophy classes – but likely will miss two or three upper division classes in other branches of the liberal arts as well: English, Sociology, Political Science, Psychology, to name a few subject areas.  Substituting for six or seven upper division classes, then, classes which ground our students in an understanding of their society, and of how our society shapes all of us, students will take first year law classes:

“Semester 1: Civil Procedure I, Contracts I, Criminal Law, Legal Process I, Torts I
“Semester 2: Civil Procedure II, Contracts II, Legal Process II, Property, Torts II

“… I hope that we can agree that these law school classes do not in any serious way allow students to better shape the values that will guide their lives, the very purpose of an education and the clear function of humanities classes.”

Let me emphasize this point here: the purpose of education is to help students understand themselves; help them understand their relation to the society and the universe in which they live; and help them choose the values and the principles by which they will live their lives.  We live today in tremendously dangerous times, times of the most rapid, frightening changes, times that demand that we understand what is going on around us – lest we be caught unaware and intellectually unarmed in the face of ongoing and potentially catastrophic wars and economic depressions.  Only a liberal arts education – a grounding in history and literature and psychology, to name some key areas of that education – allows us to understand something about the society in which we live, and something about ourselves, something that will allow us to act intelligently in the face of these contemporary events.  Absent this education, we are simply tools in the hands of the powerful, servants to be stampeded in this direction or that.

My email continued:  “Second, while I certainly believe that a university education should challenge all students, and ground them in a sense of their own humanity, those students who take up the law need even more grounding in the liberal arts than students pursuing other areas of study – if for no other reason than that lawyers, far more than other occupations, deal with power, and power demands an education in ethics and in the humanities.”

 “Finally, this 3+3 proposal is part of a larger trend in higher education, a trend that devalues the liberal arts and pushes students through career tracks as quickly as possible.  We do ourselves no favors by yielding to this trend. On the contrary, we set the precedent of practically declaring that our disciplines, and the humanities in general, are merely stepping stones to careers, rather than being essential components of responsible citizenship and the leading of meaningful lives.”

I know that a growing body of students on this campus hunger for a real education.  But real education, that education which allows us to discover ourselves, who we are, and where our potentialities and passions lie, that education can only be achieved if we demand it.  A small group of concerned students and faculty are building a “Campaign for the Humanities.”  Please contact me if you’d be interested in joining us at dbarber@utm.edu.

Aram Goudsouzian: Study History, Then Make It

 

Aram Goudsouzian is the chair of the History Department at the University of Memphis and the editor of the “Sport and Society” series published by the University of Illinois Press. His books include Down to the Crossroads: Civil Rights, Black Power, and the Meredith March Against Fear (Farrar, Straus, and Giroux, 2014), King of the Court: Bill Russell and the Basketball Revolution (University of California Press, 2010), Sidney Poitier: Man, Actor, Icon (University of North Carolina Press, 2004), and The Hurricane of 1938 (Commonwealth Editions, 2004). Aram’s latest book is The Men and the Moment: The Election of 1968 and the Rise of Partisan Politics in America (University of North Carolina Press, 2019). 

 

What books are you reading now?

 

I’m just now emerging from underneath a pile of books nominated for the annual award given by my university’s Benjamin L. Hooks Institute for Social Change. The award goes to a non-fiction book, published in the previous year, “that best furthers understanding of the American Civil Rights Movement and its legacy.” It is my turn as committee chair, which means that I sift the nominees down to five finalists – a job that is both enjoyable and excruciating. On one hand, I get to lean back and read lots of great new books. On the other hand, I feel wracked with guilt when eliminating some quality books by great historians.

 

Our five finalists, however, are terrific: Keisha Blain, Set the World on Fire: Black Nationalist Women and the Global Struggle for Freedom (Pennsylvania); Mary Schmidt Campbell, An American Odyssey: The Life and Work of Romare Bearden (Oxford); Elliott J. Gorn, Let the People See: The Story of Emmett Till (Oxford); Wil Haygood, Tigerland: 1968-1969: A City Divided, a Nation Torn Apart, and a Magical Season of Healing (Knopf); and David Margolick, The Promise and the Dream: The Untold Story of Martin Luther King, Jr., and Robert F. Kennedy (Rosetta).

 

What is your favorite history book?

 

No book had a more profound impact on me than Parting the Waters, the first volume in Taylor Branch’s “America in the King Years” trilogy. Spanning from the Montgomery Bus Boycott of the mid-1950s through the Birmingham Campaign of the early 1960s, it not only traces the rise of Martin Luther King, but also sheds light on various corners of American politics, from the Oval Office to the Mississippi Delta. It is a model for how to write narrative history. Crafted on an epic scale, it is nevertheless grounded in humanity.

 

Why did you choose history as your career?

 

When I graduated from college, I worked as a customer service representative for a mutual fund company. That job sounds about as soul-deadening as it is. But I had no idea what to do with my life. Once removed from academic life, however, I realized how much I loved the world of ideas and books. So I applied for graduate school. 

 

But history as a career? That always seemed like a longshot. I got rejected from so many graduate programs, both for my MA and Ph.D., and spent many futile years on the academic job market. I chose history for a career, but it wasn’t clear if history would choose me back! 

 

What qualities do you need to be a historian?

 

You can answer this question a hundred different ways. There are intellectual traits such as curiosity and logic, practical skills such as organizing and writing, and humane principles such as compassion and empathy. For me, though, nothing is more important than taking satisfaction in the process. I find it important to define small goals and then achieve them – whether that is learning a body of literature, plowing through another reel of microfilm, or churning out a few good pages. There’s no secret to writing books. It just takes time. Lots of time. One way or another, you have to embrace each hour.

 

Who was your favorite history teacher?

 

Mr. Walsh, who taught my U.S. history class in high school, was a huge man who somewhat resembled a walrus. The rumor was that he moonlighted as a bartender, which seemed really cool to a dorky teenager. And he had a gift for teaching. He would get so emotional when discussing historical events, and he continually forced us to think in more complex terms. For our assignments, he would give us photocopied excerpts from books by major scholars and have us write on big historiographical questions such as “New Deal: Evolution or Revolution?” Much later, I appreciated how he made us think like historians.  

 

What is your most memorable or rewarding teaching experience?

 

In the fall of 2017, I taught a course called “Memphis and the Movement,” which sought to provide a longer, more contextualized account of our city’s black freedom movement, especially since we were about to observe the 50th anniversary of the Memphis sanitation strike and Martin Luther King assassination. With students from both History and Journalism, it was a diverse bunch in terms of race, gender, age, and background. What a class! It was full of creative tension, with both intellectual analyses of our reading and passionate discussions about its meaning. Because it was our own city’s history, it was often raw and personal. We had many guest speakers, including historians and journalists with particular areas of expertise in Memphis history. For their final project, the students analyzed oral histories from the Memphis Search for Meaning Collection, an extraordinary resource in our own Special Collections.

 

Many of those students then took a Spring 2018 course in Journalism called “Reporting Social Justice,” interviewing the city’s activists from the past and present. Those interviews, in turn, provided the foundation for a documentary film directed by Journalism professors Roxane Coche and Joe Hayden, entitled Once More at the River: From MLK to BLM.  

 

What are your hopes for history as a discipline?

 

Survival. The larger forces in higher education – an emphasis on professional degrees, a financial dependence on the business community, state funding models that reward shortcuts to student graduation – are threatening to erode the central place of the humanities in the college experience. I have spent the past six years as department chair with model scholar-teachers for colleagues and a supportive dean. We have extraordinary advisers and a terrific in-house tutoring center. We do energetic community outreach and directly seek out history majors. For all these successes, though, we are battling to keep the humanities at the core of the college experience.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

If there are two kinds of book owners, and one kind is the type that keeps the book in mint condition, then I am the other kind. I check off passages, write questions in them, and leave myself reminders. So I’m no book collector. I guess I’m more of a book abuser.

 

What have you found most rewarding and most frustrating about your career? 

 

As much as I like teaching and love mentoring graduate students, the most personally rewarding aspect of my career has been writing books. The challenge is so appealing: telling a story, crafting a world around it, and rendering it in human terms. And now that I am firmly ensconced in middle age, I tend to associate each book with a distinct phase in my life; they take on layers of personal meaning in that way.

 

The most frustrating aspect has been serving as department chair. My term ends this year, however. Soon I can stop solving problems and start causing them!  

 

How has the study of history changed in the course of your career?

 

The historiography has certainly evolved in the fields that I write about, such as the civil rights movement, post-war U.S. politics, and sports history. Nothing has changed more, however, than the teaching of history. We have so many pedagogical tools now, both theoretical and practical, that help get our students doing history, rather than just learning it.

 

What is your favorite history-related saying? Have you come up with your own?

 

I did, once, sort-of come up with a history slogan. About five years ago I yanked my car to the side of the road, because I’d just seen a billboard advertising the University of Memphis that said, “DON’T STUDY HISTORY. MAKE IT!” I went rather bananas. Soon I was writing rage-filled emails to every administrator in the university. Our very calm and reasonable university president called me to suggest an alternative. 

 

A few days later, the billboard read, “STUDY HISTORY. THEN MAKE IT!” Cheesy, I know, but better than the alternative.

 

What are you doing next?

 

I just finished a short book about the presidential election of 1968, called The Men and the Moment, that will be out in April. For my next project, I hope to write a big narrative history of American sports. I think it can be a way to tell a story that courses along the broad contours of the nation’s history, while grounded in the personal triumphs and struggles of a diverse set of characters. 

To His Followers, Trump is a Folk Hero

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

There is nothing new in trying to figure out Trump. His appeal and his personality have been the subject of countless analyses and speculations since long before he ran for President. Yet the mysteries continue. Why do people like him? Why does he act so badly?

 

Charles Blow of the New York Times produced a thoughtful explanation of Trump’s popular appeal a couple of weeks ago in an opinion column entitled “Trumpism Extols Its Folk Hero”. Blow believes that Trump has become a “folk hero”, that rare person who “fights the establishment, often in devious, destructive and even deadly ways,” while “those outside that establishment cheer as the folk hero brings the beast to its knees.” Because the folk hero engages in the risky David vs. Goliath struggle against the awesomely powerful “establishment”, his personal sins are forgiven: “his lying, corruption, sexism and grift not only do no damage, they add to his legend.”

 

Thus the persistent belief among Trump’s critics that exposing his manifest dishonesty will finally awaken his base to reality is mistaken. His ability to get away with every possible form of cheating is part of his appeal, because he is cheating the establishment, the elite, the “deep state”, the “them” that is not “us”.

 

For his fans, the Mueller report is only the latest example of this extraordinary success. Despite years of investigation, Trump skates. It’s not important whether he was exonerated or not. What matters is that he can claim he was exonerated and go on being President, no matter what the report says, no matter what he actually did.

 

Wikipedia provides a list of folk heroes, every one a familiar name, including Johnny Appleseed, Daniel Boone, Geronimo and Sitting Bull, Nathan Hale and Paul Revere, all people who really were heroic. The key early elements of the Robin Hood folklore, developed hundreds of years ago, are that he fought against the government, personified in the Sheriff of Nottingham, and that he was a commoner, who gave his ill-gotten gains to the poor.

 

That is one way to become a folk hero, but not the only one. Neither politics nor morality determine whether someone can become a folk hero. Wikipedia also tells us that the “sole salient characteristic” of the folk hero is “the imprinting of his or her name, personality and deeds in the popular consciousness of a people.” It would be hard to find anyone who has done a better job of doing just that for decades than Trump.

 

Villainy unalloyed by any goodness has also propelled many people, almost all men, into the ranks of folk heroes, like Jesse James, Butch Cassidy, and Bonnie and Clyde. These criminals captured the popular imagination, not despite being bad, but because of it. They were great in their villainy, outlaws in both the legal and social sense, stealing other people’s money for their own benefit, but that does not detract from their appeal. 

 

Enough people love bad boys that they can achieve legendary status, or even more rarified, a TV series. The popularity of series with villains as heroes demonstrates the broad appeal of bad people. “Breaking Bad” attracted giant audiences and was honored by Guinness World Records as the most critically acclaimed show of all time. 

 

Since he first came into the public eye, Trump has reveled in being the bad boy. He grabbed women at beauty contests and bragged about it. He delights in his own running racist commentary on people who are not white. He lies even when he knows he’ll get caught, and then keeps repeating the lie. He celebrates himself in his chosen role as the bad guy. Meanness was at the heart of his role in “The Apprentice”, where his greatest moments were saying “You’re fired!”

 

One writer recently asked, “Why does Trump fall in love with bad men?” Trump says nicer things about the world’s most notorious political thugs than would be normal for speaking about the leaders of our closest allies. After meeting North Korea’s Kim Jong Un, Trump told a rally, “Then we fell in love, okay. No, really. He wrote me beautiful letters. And they’re great letters. We fell in love.” Trump met President Rodrigo Duterte of the Philippines in November. The White House said they had a “warm rapport” and a “very friendly conversation” on the phone. Trump said “We’ve had a great relationship.” Duterte sang the Philippine love ballad “Ikaw” to Trump at a gala dinner.

 

The prize goes to Trump’s open admiration for Vladimir Putin. During his campaign, Trump said he had met Putin and “he was nice”. Then he said, “I never met Putin. I don’t know who Putin is. He said one nice thing about me. He said I’m a genius.” Putin never said that, but for Trump that made Putin “smart”. He claimed a “chemistry” with Putin. Here’s what Trump cares about: “He says great things about me, I’m going to say great things about him.”

 

Trump’s attraction to this international rogues’ gallery is personal and emotional. He wants the exclusive club of dictators, macho men, tough guys, to love him and to accept him as one of them. Donald Trump’s foreign policy is his attempt to become the leader of the bad boys of the world.

 

But at the heart of the connection between bad boy folk hero Trump and his adulating base is a fundamental misunderstanding. Trump is not fighting the establishment. Trump is not using his powers to help his angry supporters. Trump is screwing them.

 

He attacks their health by eliminating rules which reduce corporate air and water pollution. He hasn’t stopped his repeated attempts to cut the health coverage they receive through Medicare, Medicaid, and Obamacare. He is dismantling the bank and lending regulations overseen by the Consumer Financial Protection Bureau. Nothing good will come to average Americans from the foreign members of Trump’s club. These are all assaults on the standard of living, present and future, of non-elite America.

 

The 2017 tax cuts are the best example of how Trump betrays his base. Poor and middle-income Americans got small tax cuts, but they also inherit gigantic future deficits to pay for the enormous cuts in corporate and income taxes for the very wealthy.

 

Trump is good at what he does, but that is bad for everybody else, especially for those who cheer him on.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/blog/154205 https://historynewsnetwork.org/blog/154205 0
Roundup Top 10!  

Why We Need a New Civil War Documentary

by Keri Leigh Merritt

The success and brilliance of the new PBS series on Reconstruction is a reminder of the missed opportunity facing the nation.

 

The public, not Robert Mueller, will determine Donald Trump’s fate

by Kathryn Cramer Brownell

Will Trump be Richard Nixon or Bill Clinton?

 

 

CNN Shows Zero Interest In Questioning Conventional Wisdom About Watergate

by Geoff Shepard

The producer assured me that CNN was committed to presenting a balanced view in its recent series on Richard Nixon, but she never even called back, and I think I know why.

 

 

Trump’s Taxes Are Fair Game. Just Ask Warren Harding.

by Stephen Mihm

The Teapot Dome corruption scandal resulted in a 1924 law that gives the House Ways and Means Committee authority to demand returns.

 

 

It’s time to return black women to the center of the history of women’s suffrage

by Susan Ware

Erased by white suffragists, black women’s work was vital to the fight for women’s rights.

 

 

If China wants to lead the global order, it will need more than the Belt and Road Initiative

by Gregory Mitrovich

The program falls well short of the world-changing Marshall Plan.

 

 

It’s time to get rid of reform schools

by Amber Armstrong

We need to seize the opportunity to rethink our juvenile justice system.

 

 

Why we need history majors to understand our future

by Knute Berger

Featuring Margaret O’Mara, a history professor at the University of Washington.


 

Resistance can't be tweeted: Social and political change is built on reading

by Jim Sleeper

What's the value of liberal education? Without intellectual exploration, we'll never make a better world.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171832 https://historynewsnetwork.org/article/171832 0
How the New Deal’s Federal Arts Programs Created a New American History

A Class at the Harlem Community Art Center Funded by the Federal Arts Project

 

 

Tensions have been brewing at George Washington High School in San Francisco over a series of murals that tell a less than heroic story about America’s first president.  Completed in 1936 by a left-wing immigrant painter, Victor Arnautoff, the murals have prompted discomfort among students and parents.  Their objections focus not on the mural’s critique of Washington but on its inclusion of a dead Native American and African American slaves.  Although Arnautoff apparently intended to expose Washington’s racist practices – his ownership of slaves, his role in killing Native people – the mural also shows people of color in positions associated with servitude and violence. Given that, it’s not hard to imagine the uneasiness students of color might feel as they walk, every day, past these paintings.  A committee recently recommended painting over the offending frescoes. 

 

Members of the George Washington High School community should have the ultimate say in the types of images chosen to represent their school.  But there’s also a backstory to these murals – and other art works like it – that could easily be obscured in this discussion.  A recent New York Times article puts the San Francisco dispute in the context of the many controversies currently swirling over “historical representations in public art”, including protests about “Confederate statues and monuments” that have recently “been dismantled”. While it’s true that Confederate monuments were placed in public spaces – like city parks and courthouse squares – and so might be considered a type of “public art”, the George Washington High School murals are a different order of “public art” altogether.   Both were placed in public spaces but only one took shape as a result of public funding.  

 

The San Francisco murals sprang from a broad government-funded arts initiative, part of Franklin Roosevelt’s New Deal, which made possible the creation of thousands of art projects around the United States in the 1930s.  Part of the Works Progress Administration, these arts initiatives included numerous dramatic performances organized by the Federal Theatre Project; countless posters and murals created by the Federal Art Project; and the mammoth American Guide series as well as oral histories of black and white Americans done under the auspices of the Federal Writers Project.  Significantly, these projects offered employment to artists, writers, dramatists, and musicians hit hard by the economic circumstances of the Great Depression. 

 

 In contrast, the money behind Confederate monuments and statues came almost exclusively from white private organizations, societies like the United Daughters of the Confederacy and the Sons of Confederate Veterans, who called on wealthy donors and used their political connections to get monuments placed in public settings.  With Jim Crow measures effectively silencing black Americans in the political arena – and so preventing them from raising objections to the placement of these statues – tributes to the Confederacy appeared in prominent public spaces in towns and cities across the South in the first half of the twentieth century.  

 

The New Deal arts programs – including the program that sponsored Victor Arnautoff’s San Francisco murals – represented a singular response to the kind of “public art” initiatives that celebrated the Confederacy.  Writers, actors, and artists who lacked the economic clout of the UDC received government funding and were able to keep their art in the public eye.  As a result, a diverse array of artistic approaches and interpretations circulated, including work produced by left-wing muralists like Arnautoff and African American writers like Richard Wright and Sterling Brown.  The WPA even supported a “Negro Theatre Project” that was established in twenty-three cities across the US.  Because of the New Deal’s commitment to fund artists without extensive resources, it was possible to create, even for a short period of time, a more racially, ethnically, and politically diverse conversation. Indeed, the New Deal’s commitment to the arts made it possible, for the first time since the Civil War, for a richer and more democratic conversation about the American past to unfold in public settings. This should signal to us how vastly different the WPA’s version of “public art” was from the “public art” sponsored by Confederate apologists. 

 

 

 

 

None of this means that New Deal art followed a modern-day standard for “political correctness” or that these works were without historical distortions. These programs did, however, allow for a more inclusive narrative.  Consider, for example, that within a three-month period in 1936, two plays telling vastly different stories about the Civil War era appeared under the auspices of the Federal Theatre Project.  One, Jefferson Davis, made its New York premiere in February 1936.  With a UDC-approved script, this federally funded play upheld the Confederacy’s right to secede not over slavery but so the South “may decide the question for ourselves as the constitution promises we may”. The other play, Battle Hymn, opened in New York in May, and told of John Brown’s antislavery campaign in Kansas and Harper’s Ferry. Written by two left-wing dramatists, Michael Blankfort and Mike Gold, Battle Hymn sympathetically portrayed Brown as a reluctant insurrectionist, ultimately compelled to use violence because of his abhorrence of slavery.  In his review, the theater critic for the New York Post explained: “I’d rather miss any show in New York than this one.”  

 

In 1939 the newly created House Un-American Activities Committee cut the cord on the Federal Theatre Project.  HUAC members objected, most of all, to the left-wing leanings of WPA artists and writers.  Ironically, they judged the work of dramatists like Blankfort and Gold “un-American” while a play honoring the Confederacy’s four-year-long military effort to break up the United States never even popped up on HUAC’s radar. The Federal Art Project continued through 1943, giving artists a few extra years to create murals and sculptures for schools, post offices, and government buildings, including a mural by William Scott, installed in the Recorder of Deeds building in Washington, DC, that shows Frederick Douglass urging Lincoln and his cabinet members to enlist black men in the Union army.  Even after the Art Project folded, many of these more permanent forms of art – including Scott’s mural as well as Victor Arnautoff’s frescoes – remained in place.  While Arnautoff’s murals have little to say about the Civil War – they focus, of course, on the school’s namesake – they nonetheless challenged a well-established pro-Confederate history that had a strong hold in the public arena. 

 

Like other New Deal initiatives, those murals push back against a history that papered over racist atrocities, whether it was the first president’s betrayal of Native Americans, or the brutal injustices practiced by slaveholders – from George Washington to Jefferson Davis – in the pursuit of economic and political gain.  Without government funding, it would have been almost impossible for this alternative narrative to gain much of a foothold in the public imagination. 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171780 https://historynewsnetwork.org/article/171780 0
The Value of Rating Presidents

A screenshot from the latest C-SPAN Presidential Historians Survey. 

 

The performance of modern presidents is continually addressed in myriad public opinion polls as they serve. Presidential leadership surveys are intended to take longer views of our chief executives, while also providing comparisons with their predecessors.  These surveys can be seen as the progeny of historian Arthur M. Schlesinger, Sr., who made the initial foray into the field in 1948 for the then widely-popular Life magazine. As historian Douglas Brinkley notes in his introduction to C-SPAN’s 10th book, The Presidents, Schlesinger's survey, conducted soon after Franklin Roosevelt's impactful presidency and at the dawn of the broadcasting age, demonstrated "a tidal wave of interest in this new way of looking at all presidents simultaneously," judging by the "bags of colorful letters" Schlesinger received about his results.

 

C-SPAN followed in Dr. Schlesinger’s footsteps beginning in 2000 with our initial Historians Survey on Presidential Leadership, which ranked every president from best to worst. It was envisioned as a more formal academic follow-on to a year-long C-SPAN biographical television series, American Presidents: Life Portraits.

 

Just as Dr. Schlesinger's project had in 1948, C-SPAN's 2000 survey attracted much attention in the media and among historians, and so it continues. C-SPAN has now conducted three such surveys—in 2000, 2009, and 2017—with ninety-one historians participating in our most recent ranking. We ask presidential historians to rank individual performance across 10 qualities of leadership. Conducted just as a sitting president leaves office, each survey aims to create a first assessment for history. In 2000, Bill Clinton, only the second president to be impeached, nonetheless debuted with an overall rank of 21 of the then 42 men. In 2009 George W. Bush entered the field on the heels of a global financial crisis and an ongoing war; historians ranked him 36th overall. In 2017, Barack Obama's history-making presidency debuted in our survey in an impressive 12th place. 

 

Our leadership qualities were developed with the guidance of three presidential historians: Douglas Brinkley of Rice University; Edna Greene Medford of Howard University; and Richard Norton Smith, biographer of Washington, Hoover, and Ford. Purdue University political scientist Robert X. Browning, executive director of the C-SPAN Archives, tabulates the survey results.

 

The 10 attributes are:

  • Public persuasion
  • Crisis leadership
  • Economic management
  • Moral authority
  • International relations
  • Administrative skills
  • Relations with Congress
  • Vision/setting an agenda
  • Pursued equal justice for all
  • Performance within the context of the times

 

C-SPAN's three surveys take their place among a few other contemporary rankings of presidents, such as the six-time Siena College historians poll. No matter who's conducting the survey, the top three places seem cemented among the pantheon of Lincoln, Washington, and FDR. For the other presidents each survey is but a snapshot in time, judgments rendered, to borrow a phrase from Donald Rumsfeld, based on "known knowns," but without the benefit of "unknown unknowns."  Much has changed in the nearly two decades since C-SPAN ran its initial survey—three more presidencies; newly opened archival records; new history books written. Our base of historians has also changed with retirements, deaths, and new hires. Importantly, our society continues to transform, impacted by demographics, technology, and evolving sensibilities. These many factors make the changing assessments of the presidents fascinating to watch.

 

It's particularly interesting to see how presidents' rankings change after their debuts. Bill Clinton advanced six places between 2000 and 2009, to an overall rank of 15, where he remained in 2017, with his highest marks in "economic management," and "equal justice." In 2017, with the benefit of eight years' hindsight, George W. Bush edged out of the bottom 10 to an overall rank of 33, with his highest ranking (19) in "equal justice." Perhaps unsurprisingly his lowest numbers are in "economic management" (36) and "international relations" (41). When our next survey is done following the Trump presidency, Bush 43's numbers will be worth keeping an eye on. 

 

 

 

 

There are other notable changes:  Andrew Jackson, for instance, held the 13th spot in our 2000 and 2009 surveys. By 2017, he had fallen five places, the most of any president. Jackson placed just 38th in "equal justice," driven largely by increasing disapproval of his policies toward Native Americans. Another interesting change is the ascension of Dwight Eisenhower into the top five. Ike has moved up four spaces since 2000—the most of any president in the top 10—as the "hidden hand" theory of Eisenhower's leadership gains wider acceptance.

 

Acknowledging that presidential lists are simple reductions of complicated realities, we're still pleased that they ignite popular interest in history. Presidential rankings play nicely into our national numbers obsession—they are quick, fun, and highly shareable on social media.  They foster lively debate about contemporary presidents and even help resurface some of our most obscure presidents; think Millard Fillmore (#37), or James Buchanan (dead last at #43).

 

Leadership surveys also serve a more substantive purpose. Our historians' rankings create useful data points for deeper analysis of a presidency, and those "snapshots in time" become lasting benchmarks for tracking how assessments of presidencies evolve.  For C-SPAN, these leadership rankings also add important context to our ongoing public affairs coverage. Our network's archives store nearly every significant public moment by American presidents since Ronald Reagan—literally thousands of hours of video. The historians' rankings, particularly the individual leadership categories, complement that video, providing valuable metrics for today's and future generations to assess presidential effectiveness.

 

In this highly partisan age of President Trump, people invariably ask us how he fares on the 10 leadership attributes. We won't know for sure until the end of his presidency when, once again, C-SPAN will ask historians to formally evaluate him in the context of his predecessors. You, however, can use the ten metrics to form your own judgment of his performance at this midway point of his first term.  And as campaign 2020 gets underway, the metrics also provide a solid starting point for voters to assess the leadership skills of the long lineup of individuals vying to replace him.

 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171781 https://historynewsnetwork.org/article/171781 0
The History of the Meaning of Life

 

Life's but a walking shadow, a poor player  That struts and frets his hour upon the stage  And then is heard no more: it is a tale  Told by an idiot, full of sound and fury,  Signifying nothing.

 

Thus, William Shakespeare in Macbeth. As the French novelist Albert Camus said, life is “absurd,” without meaning.

This was not the opinion of folk in the Middle Ages.  A very nice young Christian and I have recently edited a history of atheism.  We had a devil of a job – to use a phrase – finding people to write on that period.  In the West, Christianity filled the gaps and gave life full meaning.  The claims about our Creator God and his son Jesus Christ, combined with the rituals and extended beliefs, especially about the Virgin Mary, meant that everyone’s life, to use a cliché, from prince to pauper, made good sense. The promises of happiness in the eternal hereafter were cherished and appreciated by everyone, and the expectations put on a godly person made for a functioning society.

Then, thanks to the three Rs, it all started to crumble. First the Renaissance, introducing people to the non-believers of the past. Even the great Plato and Aristotle had little place for a Creator God.  Then the Reformation tore into established beliefs such as the importance of the Virgin Mary. Worse, the religious schism suggested there was no one settled answer.  Finally, the (Scientific) Revolution showed that this planet of ours, Earth, is not the center of the universe but a mere speck in the whole infinite system. This system works according to unbreakable laws – no miracles – and God became, in the words of a distinguished historian, a “retired engineer.”

There was still the problem of organisms, whose intricate design surely had to mean something.  Blind law just leads to rust and decay.  And yet, organisms defy this.  If a clever technician set out to make an instrument for spotting things at great distances, the eye of the hawk is built exactly as one would predict. There had to be a reason.  As the telescope had a telescope designer, so the eye surely pointed to The Great Optician in the Sky. Along came Charles Darwin with his theory of evolution.  He showed, through his mechanism of natural selection – the survival of the fittest – how blind law could indeed explain the eye. Thanks to population pressures, there is an ongoing struggle for existence, or more importantly a struggle for reproduction.   Simply, those organisms with better proto-eyes did better in this struggle, and over time there was general improvement.  The hawks with keener sight had more babies!  They were “naturally selected.”

Darwin did not disprove God. When he wrote his great Origin of Species, 1859, he still believed in a deistic god, a god of unbroken law.  But he made it possible not to believe in God and to be, in the words of Richard Dawkins, a “fulfilled atheist.” More importantly, Darwin suggested that the deity is like the common perception of the God of Job, indifferent to our fate.  Thomas Hardy, novelist and poet, put it well.  He could have stood a God who hated him and caused untold misery. It was an indifferent God who crushed him.

Crass Casualty obstructs the sun and rain, And dicing Time for gladness casts a moan. . . . These purblind Doomsters had as readily strown Blisses about my pilgrimage as pain.

Christianity continued, of course, in the post-Darwinian era.  There were and are many believers who embrace Darwinism, although, as is too well known, in America especially there is a vibrant evangelical branch that denies evolution and embraces a literal reading of Genesis.  But what of those of us like Hardy?  He and I were raised as Christians and – I think I can speak for him too – all our lives we have been in our ways deeply religious.  An indifferent or absent God does not mean we stop worrying about questions on meaning and purpose.  In fact, we probably worry more, not so much because we are scared but as humans these are important issues for us.  Was Shakespeare right?  Is life no more than a tale told by an idiot, signifying nothing, full of sound and fury?  

“The Lord gave, and the Lord hath taken away; blessed be the name of the Lord” (Job 1:21).  Darwin’s theory of evolution has done its fair share of taking away.  Can it also do some giving?  Many evolutionists think it can. In the 150 years plus since the Origin appeared, we find that people who worry about these sorts of things – and there are many who do – embrace one of two approaches to finding an evolution-influenced meaning to life. 

First, there are those who might fairly be called the “objectivists.”  Just as Christians think there is an external reality that rules our lives, so these Darwinians – although they have no promises of an eternity – believe that their theory imposes order and meaning on our lives.  This lies in the progressive nature of the evolutionary process – from blob to human, or (as they used to say in the past) from monad to man.  Evolution is not a slow meandering process going nowhere.  Despite setbacks it is on an upwards trajectory, ending in our species.  Order and meaning come out of this.  It is our duty to aid the evolutionary process and help it forward, or at least not to drop backwards.  

Today’s most enthusiastic evolutionary progressionist is the Harvard ant specialist and sociobiologist, Edward O. Wilson.  He has devoted the last thirty years of his life to the preservation of the lands where he did so much of his original research: the Amazonian rain forests. Wilson’s Darwinian commitments uncover his reasons. He is not interested in the rain forests as such, but rather as an aid and resource for humankind. They fill the heart with beauty and awe, an essential human emotion.  More practically, they still conceal many natural drugs that may prove of great value to humans.  Hence, we should preserve them.

The trouble with this approach is that Darwinian evolutionary theory is not progressive, at least not in the needed absolute sense. The key mechanism of natural selection says that anything may help the possessors in the struggle for existence.  This can lead to improvement or progress.  Those hawks with keener eyesight outbred those with lesser eyesight.  But it’s all relative.  A mole burying underground does not need keen eyesight.  Indeed, functioning eyes might be a problem, both because they are not used and are taking limited physiological resources and because they are prone to infection and consequent ill-health.  It is the same with the supposed superiority of humans.  Relatively, we may do well.  We were cleverer than the Neanderthals and look at who is around today. Absolute progress is another matter. Large brains are high maintenance and alternative strategies may be preferable.  In the immortal words of the paleontologist Jack Sepkoski: “I see intelligence as just one of a variety of adaptations among tetrapods for survival.  Running fast in a herd while being as dumb as shit, I think, is a very good adaptation for survival.” Cow power rules supreme!   

Is there a subjectivist alternative?  One that starts with evolution?  I believe there is, a position I (somewhat grandiosely) call “Darwinian existentialism.”  Sartre said that the key to existentialism is that God’s existence is beside the point. We humans are thrown into the world and must make sense of it ourselves.  Sartre also went on to say that there is no such thing as human nature. On this, as an evolutionist, I disagree strongly. Human nature, Darwinian human nature, certainly exists.  Above all, Homo sapiens is a social creature. We have evolved to need our fellow humans and to be among them.  The great metaphysical poet, John Donne, hit the nail on the head.

No man is an island, Entire of itself, Every man is a piece of the continent, A part of the main. If a clod be washed away by the sea, Europe is the less. As well as if a promontory were. As well as if a manor of thy friend's Or of thine own were: Any man's death diminishes me, Because I am involved in mankind, And therefore never send to know for whom the bell tolls;  It tolls for thee.

That is the secret, the recipe, for a meaningful life in this Darwinian world.  First, family and the love and security that that brings.  Then society, whether it be going to school, shopping at the supermarket, or simply having a few drinks with friends, and sometimes strangers.  Third, the life of the mind. Shakespeare’s creative works are about people and their relationships – happy (Twelfth Night), sad (Romeo and Juliet), complex (Hamlet), doomed (Macbeth), triumphant (Henry V).  This is the life of meaning.  Take life for what it is.  Enjoy it to the full, realizing that the secret of true happiness is being fully human, taking from and giving to others.  And stop worrying about the future.  There may be one. There may not.  There is a now.

The lover of life knows his labour divine,

And therein is at peace.

The lust after life craves a touch and a sign

That the life shall increase.

 

The lust after life in the chills of its lust

Claims a passport of death.

The lover of life sees the flame in our dust

And a gift in our breath.

         (George Meredith 1870)

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171782 https://historynewsnetwork.org/article/171782 0
Slavery and the Electoral College: One Last Response to Sean Wilentz

Map of the Electoral College for the 2016 presidential election

 

 

In his History News Network response to Akhil Reed Amar’s argument that the Electoral College was conceived as part of the defense of slavery in the United States, Sean Wilentz welcomed continuing the debate on the relationship between slavery and the Constitution. In his essay, Wilentz reiterated his view, first presented in a New York Times op-ed, that the Electoral College, while deeply undemocratic, was not part of a constitutional defense of slavery and he argued that Amar’s position is deeply flawed. In fact, he dismisses Amar’s position as “illogical, false, invented, or factually incomplete.”

 

As evidence to support his position, Wilentz points out that at least two of the leading slave states preferred an executive chosen by Congress to the Electoral College. Of course, the South would have more power in such a system because the 3/5 Clause boosted the South’s representation in Congress. Wilentz asserts “proslavery concerns had nothing to do with the convention’s debates over what became the Electoral College.”

 

Actually, as explained by Pierce Butler in Madison’s Notes on the Constitutional Convention, the selection of the Executive was part of a four-pronged defense of slavery. Butler, a South Carolina rice planter, was one of slavery’s strongest defenders and one of the largest slaveholders in the United States. He introduced the Fugitive Slave Clause into the Constitution, supported the constitutional provision prohibiting regulation of the slave trade for twenty years, demanded that the entire slave population of a state be counted for Congressional apportionment, and championed an Electoral College with voters selected by state legislatures. Such a system permitted wealthy white men to represent white women, landless whites, and enslaved Blacks.

 

Wilentz is correct that some Southern representatives thought an Executive chosen by the national legislature would more effectively defend state prerogatives including slavery and that some Northerners feared empowering a broad electorate. But that does not change the fact that with the 3/5th Clause in place, both proposals, selection of the Executive by the national legislature and selection through a separate Electoral College, were designed to ensure the continued existence of slavery.

 

In the New York Times op-ed, Wilentz claimed that the Electoral College would not have effectively helped the slave states in Presidential Elections. But between 1801 and 1837, every President except two was a slaveholder from a slaveholding state. The South held the Presidency for 28 of the 36 years. Despite Wilentz’s dismissal of the numbers, without the 3/5 Clause and the Electoral College, John Adams, not Thomas Jefferson, would have been elected President of the United States in 1800. Even if the intent of the Electoral College was not to support slavery, though I believe the evidence shows it was, as Garry Wills argued in “Negro President”: Jefferson and the Slave Power (2003), it definitely entrenched slave power in the United States.

 

William Lloyd Garrison dramatically expounded a very negative view of the United States Constitution on July 4, 1854 when he publicly burned a copy at a rally sponsored by the Massachusetts Anti-Slavery Society. For Garrison the Constitution’s tolerance of enslavement condemned it as “a Covenant with Death, an Agreement with Hell” and he and his followers refused to participate in the electoral process because it gave legitimacy to the illegitimate.

 

In an 1832 editorial in The Liberator, Garrison elaborated on his position with greater detail. “There is much declamation about the sacredness of the compact which was formed between the free and slave states, on the adoption of the Constitution. A sacred compact, forsooth! We pronounce it the most bloody and heaven-daring arrangement ever made by men for the continuance and protection of a system of the most atrocious villainy ever exhibited on earth.” “Such a compact,” according to Garrison “was, in the nature of things and according to the law of God, null and void from the beginning.”

 

In March 1849, Frederick Douglass, who later modified his views, called the United States Constitution “a most cunningly-devised and wicked compact, demanding the most constant and earnest efforts of the friends of righteous freedom for its complete overthrow.”

 

The debate over the meaning of the Constitution and its origins as a pro-slavery document during the Abolitionist battle to end slavery in the United States gives us some clues to its deeper meaning as well as weapons in our own struggle to preserve what may be a fragile democracy in the United States today. As historians and public figures, we have an obligation to defend democratic institutions and expose vestigial anti-democratic elements like the Electoral College that threaten democracy, which includes a careful examination of their origin and history. 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171783 https://historynewsnetwork.org/article/171783 0
The Unseen Significance of Jefferson’s Natural-Aristoi Letter to Adams  

On October 28, 1813, Thomas Jefferson crafts a letter to John Adams in delayed reply to several letters, written by Adams on his views of aristocracy, that cry out for a response. Adams had written extensively on the subject in Discourses on Davila and his “Defence of the Constitutions of Government of the United States of America,” continually complained thereafter that he had been much misunderstood, and asked for Jefferson’s views. After several letters by Adams, Jefferson, months later, finally replies.

 

Jefferson’s reply (28 Oct. 1813) to Adams is too well-known to need introduction—almost every Jeffersonian biographer has something to say concerning it—yet too few recognize the philosophical significance of it. Jefferson’s letter is an eloquent and luculent summary of a political philosophy, as rich in substance as it is profound in its simplicity. As he believed that the truths of morality were few and straightforward, so too he believed that the principles of good governing were few and straightforward.

 

After some preliminary thoughts concerning interpretation of a passage on selective breeding by the Greek poet Theognis, Jefferson begins what amounts to a polite refutation of Adams’ views on aristocracy. His refutation underscores key differences between the two men’s views of thriving republican government. Both believed that republican government would thrive when the best men (Gr., aristoi) governed. Adams, however, insisted that the best men were those of wealth, good birth, and even beauty. “The five Pillars of Aristocracy, are Beauty[,] Wealth, Birth, Genius and Virtues. Any one of the three first, can at any time over bear any one of or both of the two last,” writes Adams in a prior letter. In support of that claim, Adams appeals to history. People have always preferred wealth and birth, and even looks, to intelligence and morality.

 

“I agree with you that there is a natural aristocracy among men,” Jefferson coltishly, and perhaps insidiously, concedes. He then proceeds to a distinction between “natural aristoi” and “artificial aristoi,” which amounts to disambiguation of the former through an attempt at a precise account of it. Pace Adams—for whom beauty, wealth, and birth individually or conjointly can trump talent and virtue—this natural aristoi for Jefferson comprises only the virtuous and talented. Jefferson adds, “There is also an artificial aristocracy, founded on wealth and birth, without either virtue or talents; for with these it would belong to the first class.” Jefferson’s phrasing here is cautious. Virtue and talent are sufficient to place one among the natural aristocracy. Lack of virtue and talent (more precisely, lack of either) is sufficient to exclude one. Jefferson’s distinction aims to refute Adams, and that refutation underscores the essential difference between Jeffersonian and Adamsian republicanism.

 

Note, too, Jefferson’s use of “natural,” which is neither discretionary nor incautious, and which is often overlooked by biographers. Nature (i.e., God) has foreordained, as it were, that the wisest and most moral ought to preside among men, if only as stewards—that is, primus inter pares. Consequently, the centuries-old practice of the aristoi as the wealthy and wellborn is a contravention, and corruption, of the dictates of nature.

 

Jefferson says more. He offers this rhetorical question. “May we not even say, that that form of government is the best, which provides the most effectually for a pure selection of these natural aristoi into the offices of government?” Nature has foreordained that genius and morality are the defining features of aristoi, so the best government is that which selects the aristoi. There is no place for the wealthy and wellborn in politics, unless they are also endowed with genius and moral sensitivity.

 

The rhetorical question leads naturally to other, relevant issues. All concern establishment of a sort of government, genuinely republican, in which a system is in place that selects for the true, natural aristoi. Who are to be the selectors? “The best remedy is exactly that provided by all our constitutions, to leave to the citizens the free election and separation of the aristoi from the pseudo-aristoi, of the wheat from the chaff.” The vox populi is not infallible, Jefferson acknowledges, but the people “in general … will elect the real good and wise.” Thus, the aristoi will for the most part assume political offices, though they will be watched and will serve for short terms.

 

How can we be sure that the people will “in general” select wisely?

 

On the one hand, the people have the same, if not better, moral sensitivity than those who are fully educated. Why? Moral “judgment” for Jefferson is immediate and sensory, and corrupted by the input of reason (e.g., TJ to Thomas Law, 13 June 1814). Those schooled in morality in the main corrupt their sensory moral faculty by infusion or intervention of thought. A class on morality for Jefferson is as sensible as a class on hearing. One need not be schooled in hearing. One merely hears. That is why Jefferson was insistent that his nephew Peter Carr should eschew formal education in morality. It would likely be of more harm than of good. He writes to Carr (10 Aug. 1787): “I think it lost time to attend lectures in this branch. He who made us would have been a pitiful bungler if he had made the rules of our moral conduct a matter of science.”

 

On the other hand, Jefferson maintains here, and elsewhere, that hale republican governing, entailing selection and overseeing of governors by the people as well as governmental decisions consistent with vox populi, requires wholesale and systemic educational reform: public or ward schools for general education of all citizens, male and female; higher education for future politicians, educators, and scientists; and education of an intermediate sort (grammar schools) to take men from the ward schools to, say, the University of Virginia. For republicanism to thrive, all people need a general education—comprising reading, writing, and basic math, and perhaps also some history. Jefferson proposed such structural reform in his 1779 bill, the Bill for the More General Diffusion of Knowledge, which failed to pass the Virginia legislature due to resistance from the state’s wealthy and wellborn citizens, who refused to be taxed to educate the masses. Virginia’s wealthy and wellborn already had access to quality education through private tutoring or private schooling, and that access, in Jefferson’s eyes, allowed for the perpetuation of their monopoly on governing.

 

And so, even though Jefferson championed thin government, he also and always championed such “infrastructure,” such internal improvements in affairs of wards, of counties, of states, and of the nation, that would most facilitate freedom of all citizens in their pursuit of happiness. Thus, he championed systemic educational reform. Thus, he championed laws eradicating entails and primogeniture. Thus, he championed religious freedom to eradicate “the [unnatural] aristocracy of the clergy.”

 

Jefferson also, and most significantly, championed science, understood then much more broadly than it is today. It was a patronage for which he would be criticized throughout his life. “Science had liberated the ideas of those who read and reflect and the American example had kindled feelings of right in the people.” He continues in a buoyant, rhetorical tone: “An insurrection has consequently begun, of science, talents and courage against rank and birth, which have fallen into contempt. … Science is progressive, and talents and enterprise on the alert.”

 

Jefferson adds before ending his exposition, “I have thus stated my opinion on a point on which we differ, not with a view to controversy, for we are both too old to change opinions which are the result of a long life of inquiry and reflection; but on the suggestion of a former letter of yours, that we ought not to die before we have explained ourselves to each other.” That addendum shows other key features of Jefferson’s political philosophy, reducible to Jefferson’s views on morality: Conciliation and friendliness are always preferable to confrontation.

 

It is often presumed today that Jefferson’s political philosophy—with a focus on thin government, agrarianism, self-sufficiency, full participation by all citizens insofar as talents and time allow, elected officials for short terms as stewards and not tyrants, free trade and amicable relations with all nations, and so on—is bewhiskered: that is, that its tenets cannot be instantiated because they are passé. Proof of that, most cavalierly assert, is the fact that we have gone politically much more in a Hamiltonian than in a Jeffersonian direction. Yet that argument is a fallacy of fatalism. That we have gone in a certain direction means neither that we could not have gone in another direction nor that we cannot still go in another direction. Jefferson’s political philosophy is not bewhiskered. It ought to be studied, reconsidered, and revitalized as an alternative to the thick, intrusive government practiced today.

 

 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171784 https://historynewsnetwork.org/article/171784 0
We Judge Presidents in part by Who Precedes and Follows Them

 

I’ve been interested in the presidency since I was 10 years old. On WWGH 107.1 FM, Marion, Ohio, (Home of Warren G Harding), I comment weekly on political events and the host, Adam Lepp, has granted me a title that does not really exist, although I think it should be a word—presidentologist. 

 

I am also fascinated by how one evaluates and ranks presidents. Previously, I’ve written about presidential ranking changes over time. Now, I want to discuss a different element of how historians and the public rank presidents: the “company they keep.” Presidents are often judged in comparison to who preceded and followed them. Some Presidents are fortunate to come before or after a President who was not a great success, so they seem like a better president in comparison. Others follow or precede a President who was perceived as successful, dimming their reputation.

 

For example, Thomas Jefferson was in office after John Adams and before James Madison. In the latest C-SPAN poll ranking of presidents in 2017, scholars consider Jefferson a major success, ranking him 7th, while Adams was ranked 19th and Madison 17th. John Adams also served after George Washington, who was rated 2nd, so he suffers even more in historical assessment. Of course, the fact that he was defeated for reelection does not help his comparative reputation.

 

Then, we have the case of Abraham Lincoln, who followed James Buchanan and was succeeded by Andrew Johnson. Historians consider both Buchanan and Johnson the absolute bottom of the Presidency—they rank them 43rd and 42nd, respectively. That just adds to the stature of Lincoln, who is judged in most scholarly assessments as our greatest President and is first in the C-SPAN poll of historians. Ironically, Lincoln had far less government experience than either Buchanan or Johnson. 

 

William Howard Taft was in office after Theodore Roosevelt and before Woodrow Wilson, and therefore suffers by comparison. While Teddy is ranked 4th by scholars and Wilson 11th, Taft is ranked 24th. Taft’s reputation is helped a bit because he later served as Chief Justice of the Supreme Court, but he also suffered the greatest reelection defeat in American history.

 

George H. W. Bush served between two charismatic Presidents, Ronald Reagan and Bill Clinton, and he is rated lower by comparison. While Reagan is ranked 9th and Clinton 15th in the latest C-SPAN poll, Bush was 20th. Of course, Bush’s juxtaposition with these presidents is only part of it: he suffered the second greatest reelection defeat in American history, surpassed only by Taft.

 

Finally, Barack Obama is already rated 12th by scholars just two years after leaving the Presidency.  But he is also extremely fortunate, and will continue to be so in the future, as he succeeded George W. Bush and preceded Donald Trump. At this point, the second Bush is rated 33rd and Trump is rated down in the basement with James Buchanan and Andrew Johnson. This will shape the image of Obama for the long run of history.

 

So life is unfair, and that certainly applies to Presidents, who would surely prefer to be the success between two failures rather than the lesser figure between two giants of the Presidency.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171785 https://historynewsnetwork.org/article/171785 0
The Middle East, Land Disputes, and Religious History

 

 

The entire Middle East region that bridges three continents has historically been defined by change: changing people, politics, religious ideas and ideologies. People with power have come and gone, but the land remains and still presents the international community with one of its most challenging conflicts.

 

For centuries Jerusalem has been an interfaith, intercultural, and international city where different faith communities have converged, comingled, and coexisted in relative peace. This diversity and inclusivity are characterized by such landmarks as the site of King Solomon’s First Temple, the Church of the Holy Sepulcher, and the golden Dome of the Rock in the centuries-old mosaic of Jewish, Christian, Armenian, and Muslim quarters that constitute the city of Jerusalem to this day.

 

Jerusalem may have had a Jewish identity during the Biblical period of the Kingdom of Israel 930-720 BCE, but the greater region of the Middle East has been occupied and settled by many people both before and after: The Egyptian Merneptah Stele, 1200 BCE; the Neo-Assyrian Empire 722 BCE; the Neo-Babylonian Empire 586 BCE; the Achaemenid Empire 538 BCE; the Macedonian Greeks 332 BCE; the Hasmonean Kingdom 165 BCE; and the Romans in 64 BCE. They all occupied the region for a time.  The area subsequently became increasingly Christian after the 3rd century and increasingly Muslim after it became part of the Muslim state in 638 CE. The Crusader states in the Levant controlled Jerusalem between 1099 and 1291, the Ayyubid Sultanate ruled from 1171 to 1260, the Ottomans between 1517 and 1917, and the British from 1917 until 1948, when the Jewish State of Israel was proclaimed.

 

In most of the previous cases, Jerusalem was annexed to existing imperial states. In 1948, however, at least a part of Jerusalem became part of the independent state of Israel. This was partially legitimized on the basis of a scriptural claim that god had promised the land to the progeny of Abraham. What is often overlooked here are the revisionist interpretations of these scriptures.

 

The varied scriptural reinterpretations of the myth of the Promised Land align with and contribute to the historical claims and controversies from the Hebrew Bible of 500 BCE to the more than 20 braided and polished versions that bridge the Septuagint Bible of 250 BCE with the 1611 King James Bible.

 

According to the Hebrew tradition, G-d promises the land of Canaan to the descendants of Abram as He brings them out from Ur of the Chaldeans (Genesis 15:7) and after the Exodus from Egypt (Deuteronomy 1:8). G-d promises this land to all eight of Abraham’s sons by his three wives: Hagar, Sarah, and Keturah.

 

Then, somehow, the scriptural narrative changes as though revealed by a different god.  God becomes a historical revisionist and revokes the covenant with Abraham, as if he didn’t have the foreknowledge of what was to come. This god excludes the other seven sons of Abraham and promises the land only to the descendants of Jacob (Jeremiah 3:3-34), and the land is conveniently re-named Israel. God also forgets that in addition to Abraham he had specifically made a covenant with Hagar, the mother of Ishmael, when the angel of God called on her: “Lift the boy up and take him by the hand, for I will make him into a great nation” (Genesis 21:18). Those promises (Genesis 21:18, 17:23, and 17:26) are all forgotten, and Hagar’s descendants are now doomed to slavery (Galatians 4:25).

 

This god not only orders the ethnic cleansing of the land from its existing inhabitants, “the Kenites, Kenizzites, Kadmonites, Hittites, Perizzites, Rephaite, Amorites, Canaanites, Girgashites and Jebusites” (Genesis 15:18-20), he also participates in it, “and I will drive out the Canaanite, the Amorite, and the Hittite and the Perizzite, the Hivite and the Jebusite” (Exodus 33:2). Furthermore, the performance of religious rites becomes a requirement to live on the land: “He that sacrificeth unto any god, save unto the Lord only, he shall be utterly destroyed” (Exodus 22:20).

 

There are obvious scriptural and moral inconsistencies here. The seemingly arbitrarily targeted tribes for expulsion from the land in Genesis and those in Exodus do not match. Neither do the boundaries of the Promised Land described in Genesis 15 and Numbers 34:1-12 match those in Ezekiel 47:13-20. But more importantly, this is a conflicted god who destroys his own creation through ethnic cleansing, institutionalizes religious tribalism, and has a weird sense of ‘divine justice’ as he takes land from one group of people and gives it to another, without compensation, transaction, or exchange at a time when 95 percent of the earth’s surface was uninhabited.

 

This idea of a land identified with people rather than with an empire came in handy centuries later when the "Westphalian" doctrine of states led to the formation of 19th-century European nation states. When secular European states were defined in ethnic terms, the religious identity of the European Jews became an anomaly, and the ‘descendants’ of Jacob, victims of centuries of racial discrimination, found themselves stateless. If before a Jewish person didn’t have a place in a European neighborhood, now the Jewish nation had no place on the European continent.

 

European Jews had two choices. They either had to follow the 18th-century “Jewish Enlightenment” movement, the Haskala, which urged Jews to assimilate into Western secular culture, or they had to follow the Zionist movement, which called for the establishment of a Jewish state in Palestine. Zionism was defined by its inherent paradoxes. On the one hand, due to the integrated racial and religious identity of the Jews, Zionism was a racial and religious movement diametrically opposed to European nationalism, which was racial but secular. The second paradox, which would become more visible and pronounced in due course, was Zionism’s call for the establishment of a racially and religiously exclusive state in a racially inclusive and religiously diverse Middle East. 

 

In spite of these contradictions, the call for the establishment of a Jewish state in Palestine came in handy for the British. During WWI Lord Balfour, the British Foreign Secretary, used both the British design on the disintegration of the Ottoman Empire from within and scriptural revisionism to “favour the establishment in Palestine of a national home for the Jewish people,” with the clear understanding “that nothing shall be done which may prejudice the civil and religious rights of existing non-Jewish communities in Palestine.”

 

The establishment of a Jewish state outside of Europe may have resolved the European racial conflict, but in reality, it has simply been transferred to the Middle East where every European Jewish settler turns a Palestinian into a refugee.  A hundred years later, the racial divide and religious hatred persists, in a role reversal where today’s Palestinians live the lives of yesterday’s Jews.

 

To make an already bad situation worse, the Trump administration recognizes Jerusalem as ‘the eternal capital’ of Israel. The claim that Jerusalem is solely a Jewish city in biblical/scriptural terms rests on two false premises: 1) that a certain people would remain faithful to the same belief system throughout history, and 2) that if a fraction of that community happens to change or revise its belief system, as people often do, it would be disqualified from inheriting the land.

 

Can a land be claimed on the basis of the scriptural myths of one religion with implied exclusion of all other religious communities? Can ethnic cleansing be ‘divinely’ sanctioned as stated in Genesis 15:18-20 and Exodus 33:2?  Such a revelation is incomprehensibly incompatible with democracy and human rights in the age of reason.  

 

Resolving these inconsistencies, biblical, historical, political and otherwise, demands nothing short of a paradigmatic shift in our attitude and approach to a problem that is at the heart of the Middle East crisis. In view of the current state of regional and global politics, such a shift may seem unattainable and certainly not to everyone’s liking, but in its absence, the future looks bleak for all parties concerned.

 

The current asymmetrical warfare is fought on the battleground of the womb and the tomb, pressured by both incentivized, accelerated Jewish migration and a reactive Palestinian population explosion, in a geography where water and land cannot sustain such growth much longer. Will responsible leadership arise in any of the Abrahamic faith communities with the vision to see people’s humanity before their racial or religious identities? Only time will tell.

Civil War History Brought to Life  

Animating the American Civil War from My Colorful Past on Vimeo.

 

Matt Loughrey, the founder of My Colorful Past, lives on the west coast of Ireland. His work exists somewhere between history and art, using technology as the storyteller. Above, you can watch his latest project. Below, Elisabeth Pearson interviews him about the value of his work. 

 

1. Were you always interested in combining art and history? How did your interest in reanimation begin?

Digital art has always been in the background for me. My first experience of that was in 1991, getting myself accustomed to Dan Silva's 'Deluxe Paint' software on the Commodore Amiga. It was a defining time for my own creativity. Almost three decades later, art software is still as groundbreaking, and archives of historical material are accessible through the internet, not least the ability to communicate freely with archivists the world over. It always made sense to combine both interests.

 

2. Has living in Ireland influenced your work? If so, how?

In the west of Ireland there's a very real sense of creative community. Creativity is widely accepted and fostered by those that appreciate it. I think that overall acceptance and the peaceful surrounds of home are positive for work.

 

3. How did you discover that these frames could be inadvertently reanimated?

I've spent many years stumbling upon what I thought could be duplicate frames that were uncatalogued in different archives. For the most part I dismissed them, until it became more and more obvious to me that it might be worthwhile looking at closely. I looked at a couple closely and it was like finding treasure in plain sight.

 

4. Your Instagram, My Colorful Past, has taken on quite a following. How do you manage that account? Are you pressured to come up with new content? What are some of your favorite images you’ve posted?

People with specific interests look for quality content; provided I keep that in mind, the account becomes somewhat self-managing. It's hard to say which image is a favorite, although I've always enjoyed American history because it has been so thoroughly documented visually, which makes the experience almost relatable. The Gold Rush, The Dustbowl, The Civil War, Ellis Island...

 

5. What do you hope your viewers get out of the documentary and/or the work that you do?

The aim is always to invite a completely new sense of relatability and the opportunity to learn a little more.

 

6. How has this project changed or developed since you first started back in July 2018?

The project itself has been realized, and in this instance that was the main priority. So long as it demonstrated what is possible, I was going to be happy with the outcome. The interesting part on a professional level is the expressions of interest from libraries and museums that see its potential as an aid to their visitor experiences.

 

7. Do you remember the first time you saw the reanimation of an image? Which image was it and did it provoke any feelings that may have inspired this mini documentary?

The first portrait I animated authentically was of George Custer when he was a Captain. In that moment he was 'alive' and I fast realized the potential as well as importance of seeing the project through. It was very surreal in the sense of discovery, that I do remember well.

 

8. What do you hope to accomplish moving forward with the reanimation of frames?

The integral part of this project is realizing that it is all about preservation and discovery combined. The end goal is to see these animations, and hundreds of others, displayed in the correct learning environment. They are ideal as an engaging visual support for classroom learning in the digital age. What better than to see in motion the very people or places you are reading about?

A History of Huntington Disease and Beyond

 

In 1974, young neurologist Dr. Thomas Bird founded the first clinic for adults with neurogenetic diseases in the United States. For more than 40 years, he directed this clinic at the University of Washington where he saw thousands of patients and conducted pioneering research on conditions such as cerebellar ataxia, movement disorders, hereditary neuropathy, muscular dystrophies, and familial dementias. Over his career, he has been honored with numerous national awards and lauded for his discoveries about the genetics of hereditary neurological disorders including Alzheimer and Huntington diseases. 

 

To his initial surprise, patients with the cruel and incurable Huntington disease became a prominent part of Dr. Bird’s practice in the early years of his clinic. Huntington’s is a progressive, inherited disease that perniciously and ruthlessly devastates the brain. It can cause incoordination, jerkiness, confusion, impaired judgment, emotional instability, depression, anxiety, social disinhibition, hallucinations, and other problems. And no two patients are alike in terms of their signs and symptoms of the disorder.

 

Dr. Bird addresses this perplexing disease and its many permutations in his groundbreaking new book for general readers and professionals alike, Can You Help Me?: Inside the Turbulent World of Huntington Disease (Oxford University Press). The title comes from a Huntington sufferer’s plea for help from his prison cell, and this desperate call reflects the desire of so many of Dr. Bird’s patients who, through the years, sought his unique understanding and care.

 

In his book, Dr. Bird vividly describes Huntington disease, traces its history and, at the heart of his book, shares dozens of accounts of his own patients in lively prose that evokes the engaging writing of renowned doctor-authors such as Oliver Sacks, Richard Selzer and Atul Gawande. He recounts the physical, cognitive and emotional challenges of his patients and the complex situations that patients and their families face every day. There are wrenching stories of neglect and abuse of vulnerable Huntington sufferers as well as stories of hope and courage and the unselfish—and vital—support of families and friends. These very human accounts come from Dr. Bird’s decades of meeting and treating Huntington patients of all ages, from early childhood to the nineties, and from all walks of life.

 

Physicians are still struggling to understand the clinical manifestations of this condition. No Huntington’s patient is “typical,” as Dr. Bird’s case studies demonstrate. One patient may exhibit jerky movements only, while another may be emotionally explosive with poor judgment but without an obvious movement disorder. Some may experience both severe physical and behavioral problems, especially as brain degeneration progresses. Some patients may alienate their caregivers, some may refuse care, and some may lack the financial and other resources to receive care and to survive in today’s complex world.

 

Can You Help Me? reflects Dr. Bird’s compassion and care for patients of this dreaded disease as he offers support and treatment grounded on his trailblazing research into the genetics of neurological diseases. In offering understanding and empathy to each patient, he emulates the admonition of the legendary physician Sir William Osler: “Care more for the individual patient than the special features of the disease.”

 

Dr. Bird is a University of Washington Professor (Emeritus) of Neurology and Medicine (Medical Genetics). In addition to directing the UW Neurogenetics Clinic for more than 40 years, he was also chief of neurology at the Seattle VA Medical Center for 12 years and is presently a retired Research Neurologist in Geriatrics at the VA. 

 

Although retired from clinical practice, Dr. Bird still actively researches genetic diseases of the brain and neuromuscular system; collaborates with molecular biologists and others on genetics projects; and mentors physicians in training and research fellows. He earned his M.D. from Cornell Medical College and is board certified by the American Board of Psychiatry and Neurology. He lives in Lake Forest Park, WA, just outside Seattle, with his wife Ros.

 

Dr. Bird sat down at his University of Washington Medical Center office and generously responded to questions about his career and the history and human stories of Huntington disease.

 

Robin Lindley: Thank you Dr. Bird for talking with me about your distinguished career and your new book on Huntington disease. I’d like to first ask you about your own story. When you were a child, did you dream of becoming a doctor?

 

Dr. Thomas Bird: I grew up in a small town in upstate western New York, in the Finger Lakes area. My maternal grandfather was a country doctor in that small town. I didn't know him really. He died when I was five or six years old, so I only have a few vague memories of him. But our house where I grew up was just down the street from where he lived, and that was the family home on my mother's side, and my mother's brother, my uncle lived in that house, so I was very familiar with the house.

 

My aunt and uncle kept my grandfather’s old office intact. I remember wandering through it as a kid and seeing the examination chair and a little side laboratory with a microscope and shelves loaded with pill bottles.  I was very impressed.

 

So, I had this knowledge of my grandfather, even though I didn't know him. My mother clearly adored him so, when I was growing up as a kid, it was very clear to me that if you wanted to be the best you could be, you would become a doctor. That was never said explicitly, but it was the aura that I grew up with.  Later I got really interested in chemistry and thought I wanted to be a chemist. So it wasn't like I directly wanted to be a doctor. But when I went to college, it was certainly in the back of my mind because I became a premed major. 

 

Robin Lindley: What inspired you to specialize in neurology?

 

Dr. Thomas Bird: There were a lot of lines that led to that. I'm sure that having a brother with mental retardation made a difference in how I viewed people and how I viewed medicine and the things I was interested in. Having a brother that I lived with 24/7 my whole childhood life who had something not right with his brain impacted me a lot. And, when I went to college at Dartmouth, I actually majored in psychology and who knows exactly all the influences for why I did that, but my brother was probably part of it.

 

I also was just fascinated by human behavior. Fortunately for my future, the psychology department specialized in biological psychology so the faculty were very interested in the neuroscience brain piece of psychology. We had Skinner boxes where we did mouse and rat and pigeon experiments on behavior, and I took one course where we dissected a sheep brain and then a human brain. I think that made a difference. My mentor was a clinical psychologist who was actually in the department of psychiatry at the medical school as well as in the department of psychology. He taught me a lot about human behavior and our interests in that topic matched nicely.

 

So, when I went to medical school, I thought my trajectory would be to either be a family doctor, like my grandfather, or a psychiatrist because I was really interested in human behavior. When I got there, I didn't particularly like psychiatry. It wasn't a neuroscience or brain-oriented department of psychiatry so I lost interest in it.

 

Then, mostly by chance, I ended up having my summer project with the new chairman of the department of neurology at Cornell, Fred Plum. I knew nothing about neurology, but I wanted to stay on campus and work with somebody doing research in medicine. So I got hooked up with Dr Plum. He turned out to be a very dynamic, aggressive, energetic person who eventually became world famous. He wrote a bestselling textbook in the 1960s called Stupor and Coma, and he became one of America's leading neurologists. In the beginning I had no idea who he was or what I was getting into, but it turned out to be terrific. 

 

I started with a clinical project in coma. I learned how to do EEGs [electroencephalograms], and I was going around the hospital with a mobile EEG machine and doing EEGs on people in comas. So I got to see all kinds of neurology and I got to see it up close and personal. Then I started going to neurology grand rounds on a regular basis. I just became fascinated with the brain and with neurology. 

 

Back in those days, you didn't have to decide what you wanted to specialize in until you got into your internship. My internship was here at the University of Washington, and I already knew I was interested in neurology. In my internship, I had two neurology rotations and I just loved it and had a great time. The head of the neurology department asked me if I’d like to be in the neurology program. And I said, "Sure." And that was it. That was before the days of matching for residency programs and before the days of signing a contract. It was just on a handshake.

 

Robin Lindley: And you then further specialized in genetics and neurology.

 

Dr. Thomas Bird: So I was in neurology, which I really loved. Then, in my last year of the neurology residency, I learned of the Medical Genetics Clinic here that had been started by Arno Motulsky, who was one of the very earliest and most prominent medical geneticists in the country. He started medical genetics here at the University of Washington in 1957 and, along with Johns Hopkins, that was the first program in the country looking at the genetics of human disease with a medical orientation. He was very farsighted in doing that and initiated a clinic that saw adults with genetic diseases.

 

So, as a senior neurology resident, I started going to that Medical Genetics Clinic and seeing the kinds of patients that I'd never seen before. They were all considered rare diseases--back of the textbook experiences. I'd go to my general neurology clinic and I'd see migraines and back pain and stroke and things like that, which were kind of interesting but I didn't consider them fascinating. And I'd go to this genetics clinic and I'd see Huntington's disease and cerebellar ataxia, muscular dystrophy, and Charcot-Marie-Tooth neuropathy. Things I'd never seen before. And I was fascinated by it. I discovered they had a fellowship training program associated with the clinic. I applied for the fellowship and was able to get some funding.

 

After I finished my residency, I did two years of fellowship in medical genetics and that became my career. I spent the rest of my career specializing in genetic diseases of the nervous system. Nobody did that back then for adults. It was a new area and it turned out to be very timely. I didn't realize it, but I was right on the cutting edge of a revolution in human genetics.

 

Robin Lindley: I'm very impressed by your extensive background. You're a pioneer in the medical genetics of neurological diseases, including Huntington disease, the focus of your new book, Can You Help Me?

 

Dr. Thomas Bird: I started a clinic in adult neurogenetics as a fellow in 1974, 45 years ago. At that time, in the 1970s and 1980s, the most common neurological disease seen in the medical genetics clinic was Huntington's disease. I had no idea that was the case until I got to that clinic. It was considered a rare disease, and yet there were people coming in with it every week. Because it was so common in that clinic, it became something that I couldn't avoid. And I found it extremely fascinating. So I saw what eventually developed to be hundreds of families with Huntington's disease over the next several decades.

 

Robin Lindley: I realize it's complicated, but what is Huntington disease or Huntington's chorea, as it was once called?

 

Dr. Thomas Bird: It was called Huntington's chorea for a long time. In a nutshell, it's a degenerative disease of the brain that's genetic. Those are the key things to know. So it's a brain disease. It's degenerative, so it's progressive and causes a deterioration in the brain. And it's inherited in what's called a dominant fashion. So, if someone has it, each of their children has a 50/50 chance of inheriting it whether they're a boy or a girl, and that's each time they have a child. And it's so progressive that it eventually is fatal. But it's slow, so the typical duration of the disease is about 15 years.

 

The manifestations of the disease primarily fall into three categories. One is trouble with their coordination and [patients] develop movements that they can't control. They have these jumpy, jerky, uncoordinated movements, and it can affect the hands, or the arms, or the legs, or the face. It can affect their whole body. So they develop these jumpy, jerky movements, and when they're walking, they almost look like they're dancing in an uncoordinated way. And that's why it got called chorea. Chorea is the Greek root word for dance, as in choreography, and chorea means dancing. And these people sometimes look like they're doing a dance.

 

Second, they also can develop a kind of dementia. So their judgment and their ability to solve problems can become mildly to moderately to severely impaired.

 

And third, they can have problems with behavior with disinhibition where they're unable to inhibit socially inappropriate activity. And their thinking can become disoriented or disarranged. They can become manic or they can become depressed or they can have delusions or even hallucinations. Their behavior can become quite abnormal.

 

Any of those things can happen to a person with the disease. And someone can just have mostly one symptom or a combination of two or all three. The dramatic piece that people notice is the chorea, and that's why it was called Huntington's chorea. But it became clear to doctors and investigators that there were people who had the disease with little or no chorea and, to be more comprehensive in terms of the name, it was called Huntington's disease rather than Huntington's chorea because Huntington's chorea implied that everybody had chorea and not everybody with it has chorea.

 

Robin Lindley: How common is Huntington's? It's thought of as very rare.

 

Dr. Thomas Bird: Everything is relative. It is rare. If you relate it to Alzheimer's or Parkinson's or cancer or diabetes, it's much more rare than those diseases. It's on the order of 10 cases per hundred thousand population. So that's not a lot, but it's actually more common than ALS (amyotrophic lateral sclerosis), or Lou Gehrig's disease, which people have heard a lot about. And it's more common than some other genetic diseases. So, it's not common, but it's not as rare as you might think.

 

We've seen hundreds and hundreds of families in Seattle. In my career, I know I've seen more than a thousand people with the disease. I don't think of it as rare. I think of it is uncommon, which is somewhere between rare and common if that makes sense.

 

Robin Lindley: There is no cure for Huntington disease and it's fatal. 

 

Dr. Thomas Bird: Right. So a couple of things to put it in context. First of all, it's called a fatal disease because people with it have a shortened lifespan and they get worse, then they die with the disease, usually of the things that happen to people who can't take care of themselves. It's the same as what happens when you get end stage Alzheimer's or end stage Parkinson's or end stage ALS. It's not really the disease that kills you, but you can't walk, you can't talk, you can't swallow, and you develop malnutrition and pneumonia and that's what you die of. But there's no question that lifespan is shortened, and that's why it's called a fatal disease.

 

But I always remember a woman who is very famous in the world of Huntington's disease. Her name was Marjorie Guthrie. She was Woody Guthrie's wife. When people said, this is a fatal disease, she would get a little upset about that word fatal. She would look you in the eye and say “life is a fatal disease” because she didn't like HD being labeled that way. She said everybody dies of something sooner or later so let's not get too down about this disease. Let’s be more optimistic and move forward.

 

And there is not a cure for this disease. What does that mean? That means that once it starts, there's nothing that stops its progression and there's nothing that prevents it from developing, and people continually go downhill with it. So, in that respect, there is no cure.

 

Robin Lindley: As you detail in your book, genetic testing is available for Huntington’s. Understandably, people who are aware of ancestors with the disease are often reluctant to undergo testing.

 

Dr. Thomas Bird:  In 1993 medical science developed the ability to have a simple blood test to identify the mutation causing HD. That dramatically changed the field.  Now people at risk for the disease could actually find out if they had or had not inherited the HD mutation. Obviously that testing decision is fraught with all sorts of complications.  Because there is no effective treatment, most people decide not to be tested.  Those that do get tested may experience the gamut of emotional reactions from elation to serious depression.  In my book I have an entire chapter devoted to this amazing and often unpredictable range of responses. 

 

Robin Lindley: In looking at the history of the disease, Woody Guthrie is perhaps the most well-known Huntington disease sufferer, and a vivid example of a patient. What do we know about his disease course?

 

Dr. Thomas Bird: We know a lot about him for many reasons. One, he was famous so a lot was known about him and he wrote his own autobiography. I've tried to read that. I haven't read it cover to cover. It's a strange story. In the first hundred pages, he goes into great detail about growing up as a kid in Oklahoma. And he talks about his friends and he talks about the games they played, and about the town he grew up in, and about the tricks that they played. He talks about the trouble they got into and he goes on and on about those things.

 

Of course, he didn't know that his mother had Huntington's disease. But once you get into his book, he talks about his mother and how he loved her. And he didn't understand why she would behave the way she behaved. She would lose her temper and she would scream and yell and she would throw dishes and she would run out of the house. And she was clumsy and she was always breaking things. He had no idea why, and he loved her dearly, but he recognized that there was something wrong with her.

 

And so, you learn that as his background, and then he became this famous folksinger who was highly popular. Then he began to develop the disease and his behavior changed and his thinking changed and he developed chorea. It became obvious that he had the same disease that his mother had. He got the diagnosis of Huntington's chorea and he eventually was institutionalized. He died in a state institution.

 

Robin Lindley: You detail the history of Huntington disease, which was not identified until the late 19th century, although it certainly had affected humans for thousands of years.

 

Dr. Thomas Bird: Yes. It's called Huntington's because that's the name of the young doctor that first described it best. In terms of history, it's a fascinating story. George Huntington, who the disease is named for, grew up in a little country village, East Hampton, Long Island, in the middle of the 19th century. His father and his grandfather were family doctors in that town. He went around with his father on rounds to see patients. He described going with his father in a buggy, and riding to the homes of people who had this disease. There was a family of people that had chorea and it ran in their family, so he knew about them as a kid because he had seen them with his father. He had grown up knowing of the characteristics of this family.

 

Then Huntington went to medical school, just like his father and grandfather, and became a family doctor. After medical school in New York City, he briefly moved to Ohio to try out a practice. While he was there, the local medical society asked him to present a paper. That was a professional organization and probably every week or every month they had one of their members present a paper. They asked him to do it as a new member. He presented a paper on chorea and I think it's partly because he knew this family, so it was something he felt comfortable writing about.

 

So, he presented this paper on chorea to the local county medical society and it was so good, he wrote it up and published it. He talked about all different kinds of brain diseases that can produce chorea.  At the end, almost as an afterthought, he said he'd like to mention this family he’d known for decades in his hometown because they're so interesting. Then he goes through a very accurate, careful description of the family in terms of their movements, their behavior, their loss of judgment and dementia. He described the fact that the disease was progressive and fatal. The fact that they had an increased frequency of suicide. The fact that males and females both got it. The fact that it was passed down from generation to generation. And if somebody had a parent with it, but that child lived into late adulthood and never showed signs of it, then it didn't show up in their branch of the family. So, Huntington really had cued into the genetic piece of it before medical genetics, and [before geneticists] knew about dominance. He had described dominant inheritance and he didn't even know what he was describing.

 

He published this paper about chorea in 1872, when he was just 22 years old. Over the next 20 to 30 years, other people in the US and in Europe realized they were seeing similar families. And they would write them up. And when they referred to them, they would always say we've seen a family with chorea and it's like this family that Huntington described, and they would cite his paper. And so very quickly it got to be called Huntington's chorea because he was the one that first described it. And that was in 1872.

 

Robin Lindley: In discussing medicine and genetics, you mention a resurgence of interest in Gregor Mendel’s work in genetics a couple of decades later.

 

Dr. Thomas Bird: Yes. In 1900, biomedicine rediscovered Gregor Mendel's laws of inheritance. Mendel, the monk with his pea plants in what's now the Czech Republic, figured out inheritance of genes. Genes weren't actually known at the time, but that's what Mendel was studying.  He found out that things could be inherited in a dominant or recessive manner. And he very accurately and carefully described that and it was pushed aside and unrecognized and nobody thought anything about it for 30 years. And then in 1900, his papers were rediscovered, and people not only realized that it was relevant to the plant world, but they said, "Oh my goodness. Human diseases are inherited in the same way."

A scientist named Bateson in England was looking around for human diseases that he could say were dominant or recessive, like Mendel's pea plants. He came across the publications on Huntington's chorea and he looked at the pedigrees of families with Huntington's chorea and said, "Oh my goodness. This is autosomal dominant inheritance." This is what Mendel was describing in his pea plants as dominant inheritance occurring in the same way in a human disease.  Bateson started promoting that idea and Huntington's chorea suddenly moved to the front of the book in human genetic studies because it was considered a classic example of dominant inheritance. Even though it was rare, it became very well known in the human genetics field because it was very clear that it was an autosomal dominant disease. 

 

Robin Lindley: Was Parkinson's disease described by then too?

Dr. Thomas Bird: Parkinson's definitely was already described, but nobody thought it was genetic, so it wasn't part of the human genetics literature at all. Same with Alzheimer's. Alzheimer's was described about 1906 or 1907, and was a well-recognized disease, but no one really thought it was genetic. But Huntington's was special because it was clearly a genetic.

 

Robin Lindley: How this whole field has developed is fascinating. And the gene for Huntington’s wasn't identified until about 1993?

Dr. Thomas Bird: Yes, the gene was found in 1993.

 

Robin Lindley: It's incredible that Huntington was so far sighted.

Dr. Thomas Bird: The key advantage he had was realizing [the disease] was genetic, realizing that it was inherited. And he knew that because he had lived in the context of his family and his community for his whole life. He had seen several generations of this family and had no doubt it was inherited. As I mentioned he described the disease when he was 22 years old and never wrote another paper. He wasn't an academician at all. He never did research.

 

Robin Lindley: Huntington was more of a country doctor then?

 

Dr. Thomas Bird: He wanted to be a country doctor and he was a country doctor.

 

Then a couple of things happened with Huntington's that are of historical interest. One, a psychiatrist, I believe in Connecticut, saw families with this disease. He thought that the families in New England that had this disease were all related to each other, and that they had come over as migrants from England in the 1600s. And he thought he had evidence that they were persecuted as witches in New England in the 1600s and 1700s. He wrote about that, and that became a very popular theme about Huntington's disease that made people very uneasy.

 

Robin Lindley: Did this psychiatrist connect his view of Huntington’s with the Salem witch trials?

 

Dr. Thomas Bird: He thought so, but he wasn't able to quite make that connection. But it turns out he was wrong. He was wrong that they were all related to each other. He was wrong that they came over at the same time. And, as far as anybody can tell, he was wrong about them being persecuted as witches. But for years, Huntington’s had this context of being associated with witchery, for whatever that's worth. It was unfortunate and it also was not true.

 

Robin Lindley: And the eugenics movement wanted to rid the US of Huntington’s disease—by sterilizing patients. 

 

Dr. Thomas Bird: Yes. Because of the behavior these people have and because of how abnormal they look and how deteriorated they get, Huntington’s got tossed into this pot of diseases we want to get rid of, especially because it was hereditary. So it got thrown into the eugenics movement in the first half of the 20th century. The eugenics movement in this country was focused at Cold Spring Harbor on Long Island. It's still a very prominent biomedical research center even to this day. But back then, it was run by Charles Davenport, America's most prominent eugenicist. It's ironic that Cold Spring Harbor, Long Island, is only a few miles from East Hampton, Long Island, where Huntington lived.

 

Davenport's idea was that "bad diseases" are genetic. He said alcoholism is genetic. Mental illness is genetic. Mental retardation is genetic. Criminality is genetic. Prostitution is genetic. Dementia is genetic. And, Davenport said, we need to eliminate these from our society and we need to do it by sterilization. And he and the eugenics movement included Huntington's chorea as one of those diseases along with alcoholism and mental illness and criminality. So, Huntington's got a bad rap when it was thrown in with these diseases that were bad for the society, and they wanted to get rid of it by sterilizing people.

 

Robin Lindley: So eugenics was supposedly aimed at improving the health of the society, and there was also an element of class and racial discrimination.

 

Dr. Thomas Bird: Yes. And Huntington's became part of that. Being part of the eugenics theme and being part of the witch theme really gave Huntington's disease a very bad aura. It became a stigma for the communities. It became a stigma for the patients. It became a stigma for their families. So it was something they didn't want to face up to. Patients didn't want to talk about it; it was the kind of disease you'd hide in the closet. And it was really difficult to get a handle on this disease in the first half of the 20th century. Plus, because of their behavior and because the disease hits people in early and midlife, they'd lose employment and often become poverty-stricken. They frequently ended up in mental institutions. So, it wasn't surprising that they got thrown into this pot of mental disease that we wanted to get rid of.

 

Robin Lindley: Was there evidence that the Nazis in Germany euthanized people with Huntington's as part of their T4 eugenics program, Hitler’s program to "eliminate" the disabled, those labeled as "life unworthy of life"?

 

Dr. Thomas Bird: I don't know if any of them were actually euthanized. I don't know if that's documented or not, but when you look at the lists of diseases that the Nazis wanted to get rid of, Huntington's clearly appears on those lists.

 

Robin Lindley: You mentioned too, and this touches on our regional history, that there tend to be more West Coast cases of Huntington's disease than in other parts of the country. Do you think that has something to do with migration patterns?

 

Dr. Thomas Bird: I think so. I can't argue too strongly for that because the statistics just aren't there. But when you go back and look at the population numbers for states (and this was mostly done by death certificates which aren't terribly accurate), there are certain states that were noted to have more families with Huntington's than others.  If you look at those statistics HD seemed to be more prominent in Washington, Oregon and California.

 

When we started seeing families with Huntington's in the state of Washington, we wondered what was going on because we were seeing quite a few and we thought maybe they were all related. Maybe one family had moved to this area a hundred years ago, 120 years ago, and we were seeing the descendants of this one family. But we could look at the family trees of the families we were seeing and that clearly was not true. We were seeing very different families and hundreds of different families that were not related to each other. None of them were native western Washington people because there aren't very many native western Washington people. We're an area of migration, so it was clear that these families had come from the East and from Midwest.

 

We could see that the people with Huntington's that we saw were from families that had moved here from the East and the Midwest. When you looked at these death certificate reports, the Western states had a lot more Huntington’s than the Midwest states per population, so we thought that there was a migration factor and they were getting to the West Coast and couldn't go any farther, so they settled down.

 

Why would they migrate? In my mind, one of the reasons would be that people with Huntington's disease tend to be loners. They tend to want to go off by themselves. And they tend to be shunned by their communities because they look different, they act different, and they have social behavioral problems. They were not getting along in their local communities. So they moved. In our country when you move, more often than not, you move west. I think that's what happened to a certain degree.

 

Robin Lindley: That makes sense to me. I think Washington State has a reputation for attracting outcasts, misfits and loners. You also address the history of different approaches to treating Huntington's, including the use of lobotomy in the 20th century.

 

Dr. Thomas Bird: I didn't want to emphasize that very much, but it intrigued me because it became apparent to me that, particularly in the first half of the 20th century, it was common for people with Huntington's disease to get committed to state mental institutions. It's not so much now because there are fewer institutions and their populations have gone down but, up until the 1970s, it was very common for patients with Huntington's to be admitted to state mental institutions. And because they had a progressive disease that didn't get better, they tended to stay there for a long time, sometimes for the rest of their lives.

 

I ran into quite a few people with Huntington's disease in our state institutions and there was no good treatment for it then and there still isn't a good treatment for it. And, in the 1940s and 50s, Walter Freeman developed frontal lobotomy as a treatment for mental illness and it became extraordinarily popular. It was done mostly in mental institutions. Walter Freeman actually traveled around the country and showed psychiatrists and neurologists how to do frontal lobotomies. He went from state hospital to state hospital to state hospital doing that. And then [his trainees] would do it.

 

From 1938 to about the late 1960s, it was done on thousands of Americans in state institutions. I wondered, was it ever done on somebody with Huntington's disease? I actually had never seen anybody with Huntington's who had had that procedure, but it seemed to me, knowing they were in state institutions, knowing there was no good treatment for it, and knowing that this frontal lobotomy had become very popular in the fifties and sixties, the chances were that some people with Huntington's were getting lobotomies. I wondered if I could actually document such a case.

 

I went back to Walter Freeman's original textbooks on his procedure. He wrote two editions of his textbook and they [describe] the procedure. He [included] long lists of patients that he did the procedure on by number. He would give them a case number and then he'd just talk about them a little bit. I quickly realized that he didn't particularly use this procedure for certain kinds of mental illness. He thought lobotomy might be good for almost any mental illness, so he did it on all kinds of patients. He was doing it for schizophrenia and for depression. He was doing it on mania, he was doing it on dementia.  In essence anybody that misbehaved he thought was a prime candidate for a frontal lobotomy.

 

I looked in his index of one of his volumes of his books to see if he listed Huntington's disease. And sure enough, he did. So, I found a case in his records of a patient with Huntington's disease that he had done a frontal lobotomy on. In my mind, that confirmed in fact that this was being done on people with Huntington's disease. I have no idea how many, but knowing that he saw nothing wrong with doing it on Huntington's and knowing that he showed hundreds of doctors how to do it, and knowing that it was fairly commonly done all over the country, my guess is that probably at least a hundred people with Huntington's had that procedure done and maybe even more.

 

Robin Lindley: And Freeman’s lobotomy was such a crude procedure that was done with an icepick and a hammer.

 

Dr. Thomas Bird: Yes. It was very crude and it was not controlled and it was not done in any careful scientific manner. And Freeman was an evangelist for it and he was self-promoting both himself and his procedure. It definitely was out of control. I don't want to emphasize it, but I think it is part of the story of what happened to people who had this disease.

 

Robin Lindley: And you discuss the role of the asylum movement in the history of Huntington’s.

 

Dr. Thomas Bird: Yes. Asylums were built for that kind of person. Actually, the asylum movement was a positive, compassionate approach to help the community and to help the patients with severe mental illness. If you look at the people who were promoting asylums in the 19th century, they were trying to help by doing two things. Number one, they were trying to treat these very sad people who were very difficult to help. They also were clearly trying to remove these people from society so that they would be separated out. They thought that made society safer and it made the patients safer. The problem was, once you put them away, nobody paid much attention to them. They could be abused and no one would know it. There was no mechanism for getting them back into society once they disappeared. And nobody wanted to pay a lot of money to take care of them.

 

And that of course is still a problem today. It's very expensive to take care of people in institutions. And by and large, the states and the communities don't want to put a lot of money into it. They complain bitterly about both the patients with these diseases and the institutions, but they don't want to fund them to a level that that will actually be effective.

 

Robin Lindley: Terrible problems developed with deinstitutionalization, by the 1970s, I believe, and many patients who were discharged from institutions wound up on the streets without support.

 

Dr. Thomas Bird: Yes. And Huntington's is one of many diseases where they commonly put patients into state institutions. And when the deinstitutionalization happened, they were put back out in the community, but nobody was paying much attention and they didn't really get the care that they ought to have. And today even, there are people with Huntington's who are homeless and are not getting good care because they seem to misbehave and they have no financial resources. They have what's called denial. They don't think there's anything wrong with themselves so they often refuse treatment and they refuse to take medications and they flounder in society.

 

Robin Lindley: The issue of lack of insight in the disease seems prominent.

 

Dr. Thomas Bird: It's very common. It was called denial, but they're not really consciously denying it. They really are not aware of their disability and lack of insight is a good way to put it. It’s lack of awareness. They're not aware of their behavioral abnormalities and they're not aware of their incoordination and involuntary movements, so they don't think they need help.

 

Robin Lindley: As your book demonstrates, you're a master storyteller. Can you tell me about your interest in writing and telling stories? Is it a matter of course for you as a physician to write about your cases?

 

Dr. Thomas Bird: I see patients in my mind. I see patients as human stories. I always talk to my patients. I always find out from my patients, who are they? What kind of work did they do? What was their occupation? Where did they live? Where were they born? Where did they go to school? Did they play sports? Did they have hobbies? Did they have any kind of talents? Who were their parents? Who were their brothers and sisters? What has their life been like? What got them to the office today? I always think of my patients that way. They're all human stories.

 

Robin Lindley: Did you keep case notes with those kinds of detailed descriptions?

 

Dr. Thomas Bird:   In my clinic notes I always dictated the background of the patient. Where they were born, where they had lived, where they went to school, what their occupation was. I always thought that was part of their story and I thought of people that way. 

 

And of course, the Huntington’s people would often have these problems with their occupations and with their marriages and with their families and with their behavior. They often were doing surprising things that you didn't expect and that would become part of their story. And sometimes they were recurrent and, month after month, I would see them and they were always having one problem after another. And some of them you couldn't forget because their problems were so complicated and sometimes so strange and sometimes so unusual and sometimes so difficult to deal with that you just couldn't forget them. 

 

In fact, when I retired, I couldn't get these people out of my mind. I had seen dozens and dozens of them that I couldn't forget and I kept thinking about them. I thought one way to help me deal with that would be to get it down on paper. That's when I started writing their stories.

 

Robin Lindley: You vividly and compassionately describe your patients and share dozens of fascinating stories about them. You describe a man who had a compulsion to steal, and that seemed a part of his disease. He was in prison when he wrote you for help. Was his compulsion related to Huntington’s?

 

Dr. Thomas Bird: That individual’s story generated the name of my book. I got a totally unexpected letter in the mail. It was a handwritten letter and it began "Dr. Bird, Can you help me?" It turned out, as I read the letter, that it was from a prisoner in the state penitentiary in Walla Walla. He knew he had Huntington's chorea because his mother had died with it. He didn't say why he was in prison, but he asked if we had a clinic that followed people with Huntington's chorea. I wrote back to him and said that we did, and I just put it aside. I didn't think I'd ever hear from him again because I knew he was in prison and you don't go to the Walla Walla state penitentiary for minor crimes. I figured he would probably be there for decades.

 

And then a few months later, he turned up in my clinic. I was actually a bit surprised. It turns out that the prison then was badly overcrowded and he had complained about having this disease, which was obvious because of his movements. The warden had told him that, if he could document that somebody would follow him for his disease on the outside, they would release him. So as soon as he got my letter, he showed it to the warden and they released him from prison on parole. 

 

When he was out on parole, I saw him and followed him. I got a call one day from his parole officer who said he'd stolen a sweater from Nordstrom's. And the parole officer said,  “He's on parole and, if I report him for that, he'll go right back to prison.” And he said, “I like the guy and I don't want to send him back, and I know the pen is overcrowded anyway, so I'm going to let this go, but would you please tell him to stop stealing things?” And so, the next time I saw him, I did. I said, you know, don't do that or you'll end up back in jail. And he said he'd take that under consideration, but added that he couldn't help it. That was the way he put it. I think that's why he was in prison originally, because he had burglarized places over and over again.

 

A few months later, I got a call from the parole officer who said, “Sorry to tell you this, but he's back in the Walla Walla pen.” I asked, “Why?” And he said, “He burglarized a home, and the mistake he made was that he burglarized the home of a very wealthy, well known person.” It turned out to be the home of the owner of the famous Seattle restaurant, Rossellini's 410. And he was the brother of a former governor of the state. So it was a very prominent family. When they found out that the guy who had burglarized their house was on parole, they said he's got to go back. And so he did.

 

I talk in the book about whether his repetitive stealing had anything to do with his disease. That's sort of a leap. What's the proof that the two are related? Maybe he was just a burglar who happened to have Huntington's disease. But, if you look at the literature on Huntington's, it's very clear that one of the themes of the behavior problem can be obsessive compulsive illness. They can do things over and over again that they don't have any control over. They can become cigarette smokers. They can become alcoholics. They can become gamblers. They tend to be obsessive about a lot of things. Not always, but frequently. I think that's part of the disease. I suspect that this guy was a compulsive stealer because of his brain disease. I can't prove it, but I think it's very likely.

 

Robin Lindley: Doesn't this get into the neuroscience of addiction?

 

Dr. Thomas Bird: It gets into the organic brain foundation of mental illness. There’s this tendency to classify diseases as organic biologic diseases or mental diseases, and say they're not the same. If somebody has a mental illness, that's not like having cancer or diabetes. That's somehow different. But when you try and think that through, where does behavior come from? And that comes from their brain, right? It's not magic. Your language comes from your brain. Your vision comes from your brain. Your speech comes from your brain. Your walking, your talking, and your thinking come from your brain. So, doesn't mental illness, if we assume that there is mental illness, and I do, doesn't that come from your brain? And if you say that schizophrenia is a mental illness or manic-depressive or bipolar disease is a mental illness, doesn't that imply that it's a brain disease? So, if somebody is in a state institution or in a prison for abnormal behavior, how much of that is because their brains are not functioning properly, and is the right approach to that to just throw him in a cell and ignore him?

 

I think as a community we dropped the ball when we tried to deal with people with severe, difficult to control behavior. And I think we need to recognize at least some of that is being driven by abnormalities of the brain. Of course, environmental things are playing a role too. Your diet plays a role. Your parents play a role. Your peers play a role. Your occupation plays a role. Head trauma plays a role. All of those things are involved.

 

Robin Lindley: Are there a couple of other striking cases you'd like to mention?

 

Dr. Thomas Bird: I think people often don't realize that Huntington's has a juvenile piece. I have a case in the book that I call the "Princess in Pink" about this little girl who was in elementary school. She was a good student. She played soccer and kickball and she got along fine. She was just a really cute, nice kid. Then she started to have trouble. She couldn't run around as well anymore. And then she couldn't keep up with her peers in classwork, in reading and writing and arithmetic. She fell behind. And she was living with her grandmother because her mother died with Huntington's disease. Her family was aware of juvenile Huntington's, and they wondered, is that possible? Is our little girl developing juvenile Huntington's disease?

 

And it turned out to be exactly the case. When she was seven or eight years old, she actually began to deteriorate because she had developed this progressive disease. The nice thing about the story is that her teacher realized what was going on; she was particularly outgoing and kind to her and drew her classmates into caring for her. And so, her classmates realized that she had a disease, that she was getting worse, but she was still this really nice girl that they'd known for several years already. And so, the teacher and the classmates formed this very effective safety net and support group for this little girl. And they wrote a class book about her called "Princess in Pink.”  It's really a very compelling story.

 

Robin Lindley: You included much of the text of their lovely book for this girl, and it's very moving.

 

Dr. Thomas Bird: Yes, it was a lovely class project. An awful lot of it had to do with the teacher, Ms. Perry, who I have tremendous regard for. She stayed the girl's teacher for three years in a row. They developed a really good relationship and the students were really kind to the girl. She did very well for several years and stayed in school, but eventually became quite disabled and died with the disease, I think when she was 14. Her teacher and some of her students went to her funeral. So it's a sad disease, but it's a very nice story about what loving care she got from her social community.

 

Robin Lindley: That’s a touching chapter of your book. You also have the case of a man with Huntington’s who shot his roommate and he didn't know why.

 

Dr. Thomas Bird: I think that's another one of those lack of awareness kinds of things. He was just watching TV and he had a handgun and he pulled out the handgun and shot and killed his roommate and he didn't know why. He had Huntington's disease and he eventually went to prison.

 

And then I had the opposite case, where a man with Huntington's disease was killed by his roommate.  That's a striking example of how vulnerable people with Huntington's are. I think he got in with the wrong crowd. He had no idea what a miserable guy he got attached to as a roommate.  The roommate just decided to kill his friend who was obviously disabled with this disease. My guess is [the roommate] probably robbed him. So, on the one hand, a guy with Huntington's committed homicide, but on the other hand, a guy with Huntington's was very vulnerable and he was a victim of homicide. So, it can go either way.

 

So people with HD become very vulnerable. They can't take care of themselves and it's obvious to the community that there's something wrong with them either because of their behavior or their movements. And their judgment is very poor, so they can't figure out who's a good colleague and who's not a good colleague. And so, they often are abused by other people in the community because they're seen as disabled and vulnerable.

 

One of my favorite stories is about the young man who kept all his money in his shoe. He was homeless and he kept getting arrested. When he was in jail, he took the money out of his shoe and his cellmate noticed the money. His cellmate told him that he was a financial advisor and that, when they got out, if he turned his money over, he would invest it for him and make him a pile of money. And this guy with Huntington's was getting a VA pension, so he was getting monthly money. He met this "financial advisor" and started giving him all his money. He got the VA pension and of course his former cellmate was just stealing the money from it, and he ended up with nothing. Again, that shows how vulnerable he was.

 

Robin Lindley: You mention a tendency of Huntington patients to suffer head injuries. 

 

Dr. Thomas Bird: Yes. I also have a picture in the book of an MRI of a person with a subdural hematoma. When they fall or hit their head, they bleed into the area between the brain and the skull. People with Huntington's are also vulnerable to mild head trauma and tend to get these subdural hematomas. There's a story in the book about a woman who fell down the stairs and eventually died of subdural hematomas. We just recently had a man with HD who died last month with subdural hematomas. Falls are bad news with this disease. It's a real problem.

 

We think what happens is that, with Huntington's, the brain tends to shrink from the degeneration so this space between the brain and the skull gets enlarged. There's more space there and that puts stress on the veins. And when a patient just hits his or her head against the wall or has what we would think of as mild trauma, the brain bleeds and there's a hemorrhage into the space between their skull and the brain. They're more vulnerable to that because their brain is shrinking.

 

Robin Lindley: Thanks for explaining that brain anomaly. I thought the head injuries were from movement problems and falling.

 

Dr. Thomas Bird: Yes, there is that. They hit the head more often because they are falling and bumping into things. But the kind of bump that wouldn't bother us can be very serious for them.

 

Robin Lindley: And there’s the vexing problem of suicide.

 

Dr. Thomas Bird: Yes. It's not so surprising, particularly for those who have awareness of their disease and especially if they've seen some other family member go through the full brunt of the disease. They don't want that to happen to them. So they can become very depressed and suicidal.

 

Robin Lindley: How can you treat or otherwise help these Huntington’s patients?

 

Dr. Thomas Bird: There are things you can do to help people and improve their lives. If you look at that context, the thing that helps the most is the helping community that you put around those people. If their families help them, their friends help them, their doctors help them, their nurses help them, their social workers help them, they do better. So what you need is a team that's focused on helping these people live their lives as best they can. And that's what helps them the most.

 

If they have certain symptoms, sometimes there are treatments for those symptoms. So if somebody is depressed, you can treat them with an antidepressant or you can do talk therapy and try to help them that way. There are some things that improve the movements. Some drugs slow down the movements. Of course, they have side effects, and sometimes it's a tradeoff. You slow down the movements, but you develop side effects. The same if they have delusions or psychotic behavior. There are drugs that are antipsychotics, and that sometimes improves that behavior. If they have severe anxiety, there are things that can improve anxiety.

 

So, there are things you can do to help people with Huntington's, but you don't cure the disease.

 

Getting back to it being a fatal disease, that brings up the issue of when it develops and the fact that it has this huge range of onset. It usually develops in the thirties or forties--those are the typical ages when you get it. And you may live for decades--three, four, five decades with no symptoms at all, and then develop the disease, and then live for 15 or 18 years. But there is a juvenile variety where children develop the disease. And there's a late onset variety where people don't develop it until they're in their seventies. So, if you develop it when you're 70 and 10 years later you die of cancer, it wasn't really a fatal disease, right? So sometimes it is not as severe as it seems. As I say in my book, I've seen children who've had this disease when they were in elementary school, and I've also seen people in their early nineties with the disease. The age range is surprisingly large.

 

Robin Lindley: You describe how the health care system often fails people with Huntington’s, leaving them without the care they need. What changes would you like to see in our health care system?

 

Dr. Thomas Bird: So, people without financial and social resources need help that they can't provide for themselves. If we really want to care for them compassionately, we've got to provide some resources for them. And that depends on what their problems are. If it's a medical problem, they need doctors and nurses and medication. If it's a social problem, they need housing, they need an occupation, they need social workers, they need an appropriate diet. So, I think we need to put more resources into caring for people who, through no fault of their own, don't have those resources.

 

For people with progressive mental illness, particularly those who lack awareness of their condition, it's hard to provide care because they don't think there's anything wrong with them. But we have to be careful not to compromise their autonomy. People in our country should know that they're free to behave in a wide range of ways as long as they don't hurt other people. But if they're injuring themselves or the people around them in some way, we have to try to get better care for them.

 

And if we can't put them in a state institution, then we need facilities in the community that are easily available locally and can care for them. And that's not easy if they don't want care. But I think we need to provide as much as we can locally. For some people that means a controlled environment, at least for some period of time. It might be a week or a month or a couple of months, but there are some people who, for their own safety and health and for the safety and health of the community, need to be in a controlled environment for a while.

 

Robin Lindley: Wasn't the hope of those who advocated for deinstitutionalization that the mentally ill would have alternatives to institutions in their communities? 

 

Dr. Thomas Bird: Unfortunately, too much emphasis was placed on saving money. What everybody saw was that deinstitutionalization would save money. They could close state hospitals or reduce their size, and that would save millions of dollars. There was also the idea that new medications had been discovered that would treat these diseases so effectively that patients would not need close monitoring. You would just give them a pill and they would be fine. That idea turned out to be terribly naive.

 

Not enough resources were put into local facilities. There were some, but they weren't anywhere near what was needed to really care for these people. And not only do you need a physical facility and medications, you also need professionals who can care for patients and follow them and monitor them. That means doctors and nurses and medical staff and social workers who are trained and dedicated to care for these people, who are difficult to care for. They're not simple cases. And those professionals are expensive. The fact that you close down a state mental hospital doesn't mean that adequate care for these people is going to be cheaper.

 

Robin Lindley: What are we learning from recent genetics research? Will it be possible in the future that some of the degenerative diseases like Huntington's disease can be prevented or somehow addressed with gene editing?

 

Dr. Thomas Bird: Yes. That's the hope right now. There's a very strong hope that the disease can be attacked from the genetic therapeutic standpoint and that there are ways to shut down the effects of the abnormal gene and basically turn it off. Whatever it's producing that's abnormal, you would stop and shut down, and that would prevent the disease from progressing or even developing. There's a lot of enthusiasm about doing that for Huntington's disease, and for other genetic diseases as well.

 

There have been a couple of successes in other diseases using that kind of approach. There's a disease of children called spinal muscular atrophy, a very severe condition. They've used a genetic approach to turn off that abnormal genetic mechanism and have the correct one work properly. It has been a huge benefit to these kids who otherwise would have died. There's a lot of enthusiasm for that.

 

And there is a study of Huntington's going on right now where they're using that kind of therapeutic approach and hoping that it will work. Probably in a year or two we'll know the results of that study. They're doing the same thing for genetic forms of ALS.

 

Robin Lindley: Is Alzheimer's in that category too?

 

Dr. Thomas Bird: Alzheimer's is more complicated because there are so many different causes of Alzheimer's and most Alzheimer's is not purely genetic. Huntington's is always purely genetic. Alzheimer's is usually not purely genetic, but there are some rare forms of Alzheimer's that are genetic diseases caused by a single mutation in a single gene. So in those rare kinds of Alzheimer's, that sort of approach is being considered.

 

Robin Lindley: Do you have any other comments for readers or anything to add about Huntington's or your research?

 

Dr. Thomas Bird: When I think back about Huntington's, the things I like to emphasize are that, for a medical science kind of person, it's a fascinating disease. It's also a tragic disease. And it's an important disease. Even though it's uncommon, I think it has important implications for all degenerative brain diseases and for mental illness. So I think its importance is way out of proportion to its uncommon frequency in the population. So it's fascinating. It's tragic. And it's important.

 

Robin Lindley: Thank you for your words, Dr. Bird, and congratulations on your vivid and informative new book on Huntington's disease. Your compassion and devotion to your many patients with this perplexing and cruel condition are inspiring.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171788 https://historynewsnetwork.org/article/171788 0
Presidential Personality and Politics

 

 

America’s founders researched and valued the lessons of history. This is evident in The Federalist Papers, the series of essays by Alexander Hamilton, James Madison, and John Jay, written for New York City newspapers, that justified the new American Constitution and appealed for its ratification. Since it was common for copies of newspapers to be kept in taverns for reading by patrons, it should not be assumed that they were read only in elite circles.

 

Are we as thoughtful as that generation? In his latest book, Joseph J. Ellis, formerly the Ford Foundation Professor of History at Mount Holyoke College and the author of many books about that early period of American history, argues that the founding period offers many lessons for present-day America. Yet, for all its many virtues, no one will mistake his book for a present-day equivalent of The Federalist Papers.

 

Like so many modern books on history, this book is driven by a concern for personality and politics. Ellis argues that the approaches of Thomas Jefferson, John Adams, James Madison, and George Washington to political issues hold lessons for modern politics. The book focuses on these men’s lives and gives little attention to the societies that surrounded them, and Ellis offers little evaluation of what each man could have done differently politically, or of what mistakes they made that modern Americans should try to avoid, except in the most general sense of learning from their flaws in character. While personality and judgment are emphasized, Ellis does not really analyze how leadership functions in our society to foster discussions of policy.

 

For example, Ellis argues that Thomas Jefferson played an active role in trying to limit slavery, especially in the new territories but also in his native Virginia. Jefferson was rebuffed, and although he still opposed slavery, he was temperamentally uncomfortable with being at the center of controversy if he could avoid it. So he just gave up. In his old age he opposed abolitionist schemes in the North because he believed abolitionists would seek to produce mixed-race communities, which he thought were not feasible. In his view, abolitionism would just exacerbate sectionalism. The author claims that Jefferson’s racism illuminates present-day racism, but he does not explain whether he thinks racism is diminishing or increasing, nor does he adequately explain terms like “structural racism.”

 

His discussions of John Adams, James Madison, and George Washington follow the same pattern, discussing their personalities more than the choices the nation faced, then as now.   

For example, in retirement, Thomas Jefferson and John Adams exchanged friendly letters that rebuilt a friendship frayed during their earlier period as political rivals. In analyzing these exchanges, Ellis emphasizes the temperamental differences between the two: the naïve optimism of Jefferson (except regarding slavery, of course) and the fearful pessimism of Adams.

 

Regarding “Our Gilded Age,” he provides a quite good summary of the circumstances that led to severe maldistribution of income, not only in the United States but in places with similar economies, such as Europe. But by limiting the intellectual discussion of how to deal with this state of affairs to a recapitulation of the debate between John Adams and Thomas Jefferson, or in broader terms between the Federalists, who favored a strong federal government, and the anti-Federalists, who favored a weak one, he leaves out much detail about what particular actions would be useful at the local and national levels. Instead, the book reinforces the general mythology that all of American history is nothing more than a replaying of the debates between the Federalists and the anti-Federalists.

 

The author portrays James Madison as a skillful politician and highlights his role in organizing the Constitutional Convention and the ratifying conventions that followed. Madison changed his political alliances over time, moving away from the Federalist party and joining Thomas Jefferson when he felt that the Federalist administration had become arrogant and even abusive. More than John Adams, and even more than Thomas Jefferson, James Madison reacted to circumstances and changed his opinions. Ellis argues that Madison’s political skills clarify what the founding fathers considered the original intent of the Constitution. What Ellis really shows is that Madison had good political instincts, but Madison’s philosophy of government remains unclear.

 

Ellis seems to admire George Washington most of all. He details how Washington sought to conduct foreign affairs in an honorable and reasonable fashion. For example, Washington negotiated the Treaty of New York (1790), which would have transferred western Georgia, northern Florida, southern Tennessee, and most of Alabama to the Creek Nation. But a flood of settlers on the Georgia frontier refused to be bound by the treaty, and the Georgia legislature rejected it. Nonetheless, Ellis uses this example to argue that George Washington had a temperament for honorable foreign negotiation that many politicians throughout American history have lacked. He illustrates this by arguing that American foreign policy after the fall of the Soviet Union has been characterized by half-baked moral crusades against greatly exaggerated threats, a claim that seemingly undercuts his argument for the worthiness of our foreign policy establishment.

 

These are all interesting stories, but they offer few revelations and only very general lessons for the present. Ellis hopes that from Thomas Jefferson we can learn about American racism, from John Adams about economic inequality, from James Madison about understanding constitutional law, and from George Washington about foreign policy. Yet mostly what we learn is that a leader should be thoughtful and gracious and should understand the many ramifications of the problems under discussion, and that leaders tend to be limited by their own prejudices and the prejudices of their times. Whether or not we share the values of these Founding Fathers, their policy options are not necessarily the same as our policy options. World trade, the threat of international war, the ecological crisis, and automation in the workplace are all issues that have no 18th century equivalents. Yes, this book is a start, an enjoyable start, toward guiding citizens in their future political decisions. But those decisions require policy choices that cannot be handled merely by stories about the Founding Fathers. Nevertheless, we can take away good lessons, especially on personal character, from the start that Ellis provides.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171789 https://historynewsnetwork.org/article/171789 0
Do Those Closest to Trump Think He's Fit for Office?

 

Several times since 2016, I have criticized the performance of our current President, Donald J. Trump. This little essay is, one might say, “a horse of a somewhat similar color.” Like many observers in our concerned society, I have speculated on the “mental capacity” and the job performance of the man who will apparently be in the Oval Office until 2020—and, who knows, maybe longer.

So far, I have reviewed five books about Trump in the White House by individuals who have really studied “the nature of our President’s mentality.” I have gone back to see how those authors felt about the mental condition of the one who has appointed such weird and unsuitable individuals to high office. The authors are men and women, political novices and veterans, and all have received major media attention. What can we learn from these books? Let’s start.

The jacket of Omarosa’s 334-page “insider’s account” UNHINGED boasts, “Few have been a member of Donald Trump’s inner orbit longer than Manigault Newman.” An Assistant near the Oval Office, she ended her friendship with Donald in distaste. In a chapter entitled “I Think Our President Is Losing It,” she wrote openly about our President’s “level of paranoia” and found “… something real and serious was going on in Donald’s brain. His mental decline could not be denied….  I knew something wasn’t right.” She also revealed the make of three guns he proudly owned and said that at least once in the primaries he carried a gun.

Something of a pioneer in judging Trump is Michael Wolff, whose book FIRE AND FURY created a stir.  Boasting “deep access to the West Wing,” Wolff casually writes toward the end of the book that “staffers” were concerned that Trump’s rambling and alarming repetition of the same sentences had significantly increased. Further, his ability to stay focused, never great, had noticeably declined and the staffers worried this would be noticed by the general public. Wolff also makes frequent references to “Trump’s stupidity.”

At the time of reviewing, I found this short paragraph mid-book upsetting:  

“Trump’s extemporaneous moments were always existential, but more so for his aides than for him.  He spoke obliviously and happily, believing himself to be a perfect pitch raconteur and public performer, while everyone with him held their breath. If a wackadoo moment occurred on the occasions—the frequent occasions—when his remarks careened in no clear direction, his staff had to go into intimate  method-acting response.  It took absolute discipline not to acknowledge what everyone could see.” p. 137  

Moreover, Trump had “contempt for other people’s expertise that was understood by everybody in his billionaire circle.” He was openly contemptuous of both the Bush and Obama families, and he didn’t back down from his disgusting critique of the prisoner of war and later national hero John McCain. (That gross misconduct has been quite amazing!)

At more than 400 pages, Bob Woodward’s FEAR is “drawn from hundreds of hours of interviews with participants and witnesses to these events.” The President refused to be interviewed by the famous reporter and author. Maybe it was just as well. Woodward argues that President Trump has “anger issues” and trouble making apologies, and that he is erratic, impulsive, and “an emotionally overwrought, mercurial and unpredictable leader.” The executive power of the United States, he writes, has come to experience a “nervous breakdown” (p. xxii).

It can be downright frightening to read over 400 pages about life in the White House and the views of aides about their powerful leader. Rational lifelong leaders who had seldom shown fear now did. “The senior White House staff and national security team were appalled,” Woodward wrote. “They didn’t know what the president might say or do.” Staff Secretary Rob Porter said, “A third of my job was trying to react to some of the really dangerous ideas that he had and try to give him reasons to believe that maybe they weren’t such good ideas.” Lawyer John Dowd’s assessment of Trump’s characteristics (rooted in a 47-year legal career) is virtually unprintable: “he’s going to say ‘I don’t remember’ 20 times. And I’m telling you, Bob, he doesn’t remember.” “[T]hese facts and these events are of little moment in his life.” “I told you he was a goddamn dumbbell.”

After Kim Jong Un stated that a nuclear button was on his desk “at all times,” Trump responded: “Will someone from his depleted and food starved regime please inform him that I too have a Nuclear Button, but it is a much bigger & more powerful one than his and my Button works!”

What has happened to “patient diplomacy”? Those of us who fearfully speculate that a North Korean missile just might get steered toward San Francisco or Honolulu are jumping up and down. The prospect of an oddball individual in a redecorated Oval Office casually and happily goading someone like KIM into maybe firing off a missile if he is in the mood, free of any consequences, is much too much. Actions DO have consequences, no?

Which brings us to James Comey: tarred by a difficult political decision he made late in 2016, but a decent long-time leader of our FBI. To me, he has every right to place on his book’s cover words like its provocative title, “A HIGHER LOYALTY: TRUTH, LIES, AND LEADERSHIP.” I believe him when he says, in summary, that D.T. insisted on his loyalty. When Trump didn’t get that promise of loyalty in advance, he barged ahead and fired the FBI leader! Earlier, when Comey was invited to dinner by Trump, he hardly got a word in, as the realtor/builder/golf course creator from NYC dominated the conversation for the entire meal. Writes Comey, “None of this behavior, incidentally, was the way a leader could or should build rapport with a subordinate.” Agreement is easy.

Elsewhere, Comey resented the bizarre conditions placed on him at the time of his “release” from a lifetime with the FBI. He was 3,000 miles from his office and colleagues when Fired. He was given no chance to say “goodbye” to subordinates. Reading the Comey book arouses real sympathy for the stalwart FBI leader on the way out.

This writer finds that it is beyond his capabilities to mold into this short essay any account of the renderings—little more than educated guesses—about the mind of President Trump offered so far by physicians who lack firsthand interviews or examinations of him. In his book, the psychiatrist-author Justin A. Frank, at a major university, expresses apprehension over the nature of our President’s mind. (Of course, a book like that can be written about anybody.) I do have to say here, however, that the number of times since the Inauguration that I have heard—or said myself—“He must be crazy” is out of the ballpark. Donald J. Trump is “different” in so many ways: so often unapologetic and, in public, well, yes, disgusting.

On the other hand, these profiles of Trump don’t include what many perceive as his successes. His daughter contends, “He has the heart and mind of a leader.” His TV show The Apprentice was highly successful. He has magnified the wealth given him by his father. He made his name in tough NYC. His friends and acquaintances seem to include movers and shakers. He has not been thwarted by overseas ventures. Daunted by one, then a second, marriage, he sought and found another at “his level.”

Medically impaired or not, Donald J. Trump is engaged in splitting and impairing the United States of America. Something must be done about it. From his daily expression as he comes and goes in 2019, he may indeed be ailing. Or maybe he just doesn’t enjoy the life surprisingly granted him in our White House and prefers the tropics. And can’t help showing it? May one hope that some remarkable change is in the offing, so that before too long “everything will somehow work its way out all right”?

Surely, we all deserve a happy ending: a rational federal government in all three branches, and a respected place as an organized people, living contentedly in this ever-changing world of ours.  If Donald J. Trump, our President, can’t or won’t be part of that somewhat idyllic system and help meet our needs, something will have to be done about it.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171768 https://historynewsnetwork.org/article/171768 0
Roundup Top 10!  

When Slaveowners Got Reparations

by Tera W. Hunter

Lincoln signed a bill in 1862 that paid up to $300 for every enslaved person freed.

 

Why you don’t need to be French or Catholic to mourn the Notre Dame fire

by Kisha G. Tracy

The cathedral is an important part of our shared cultural heritage.

 

 

How historians got Nike to pull an ad campaign — in under six hours

by Megan Kate Nelson

The multinational corporation dropped its “Lost Cause” ads after historians pushed back.

 

 

Immigration, Race, and Women’s Rights, 1919 and Today

by Arnold R. Isaacs

The comparison couldn’t, in many ways, be grimmer or more telling.

 

 

How California is dumbing down our democracy

by Max Boot

It is a matter of national concern that the California State University (CSU) system is on the verge of further diluting its already inadequate history and government requirements.

 

 

Elizabeth Warren’s historically sound case against the filibuster

by Julian Zelizer

The Senate rule has long been used as a weapon against civil rights and other progressive legislation.

 

 

The return of ‘reefer madness’

by Emily Dufton and Lucas Richert

Both supporters and opponents of legalization are quick to use sensationalism to prove their points, stunting the pursuit of real research needed to determine cannabis’ social effects.

 

 

Why Democratic Presidential Candidates Should Make Climate Change Their #1 Issue

by Walter G. Moss

Nothing else—including medical care, the economy, income inequality, immigration, racism, or the gender or race of a candidate—is more important.

 

 

Why Trump Won’t Stop Talking About Ilhan Omar

by Jamelle Bouie

The president is following a Republican playbook that is now nearly two decades old.

 

 

Join my Nato or watch critical thinking die

by Niall Ferguson

A new red army is out to silence debate. We must rise up and resist it.

 

 

Niall Ferguson isn’t upset about free speech. He’s upset about being challenged

by Dawn Foster

Powerful people used to express their views on others unopposed. Now that their targets fight back, they find it intolerable.

 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171779 https://historynewsnetwork.org/article/171779 0
American Jews Versus Israeli Politics

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Knesset chamber

 

 

Benjamin Netanyahu just won a record fifth term as Prime Minister of Israel. He has dominated Israeli politics for ten years. His reelection shows the widening gap between the ideas and politics of American and Israeli Jews.

 

The Israeli Attorney General announced at the end of February that Netanyahu will be indicted for bribery and fraud. Just days before the election, Netanyahu said that Israel would annex Jewish settlements on land in the West Bank taken in the Arab-Israeli War of 1967. About 400,000 Israelis live in West Bank settlements. He said, “I will impose sovereignty, but I will not distinguish between settlement blocs and isolated settlements. From my perspective, any point of settlement is Israeli, and we have responsibility, as the Israeli government. I will not uproot anyone, and I will not transfer sovereignty to the Palestinians.”

 

Netanyahu’s electoral opponents were a new coalition of centrist and conservative Israeli politicians. Thus the choice for voters was between a continued hard line against Palestinians and Netanyahu’s even harder line. His victory demonstrates the preference of Israeli voters for an ethically dubious politician, who offers no path toward peace with Palestinians, but continued seizure of formerly Arab land.

 

In 2009, Netanyahu made the following programmatic statement about the most pressing issue in the Middle East: “I told President Obama in Washington, if we get a guarantee of demilitarization, and if the Palestinians recognize Israel as the Jewish state, we are ready to agree to a real peace agreement, a demilitarized Palestinian state side by side with the Jewish state.” Since then he has gradually been moving away from this so-called two-state solution. In 2015, he employed harsh anti-Arab rhetoric during the last days of the election campaign, for which he apologized after winning. He seemed to move away from support of the two-state idea, but said after the election that this idea was still viable.

 

The election of Donald Trump pushed Israeli politics further right. Although Trump repeatedly claimed to have a bold plan to create a peace settlement between Israelis and Palestinians, in fact, he has openly supported Netanyahu’s movement away from any possible settlement. A year ago, Trump announced that the US officially recognized Jerusalem as the capital of Israel. Trump announced last month that the US recognizes Israeli sovereignty over the Golan Heights, seized from Syria during the 1967 war. Netanyahu used giant billboards showing him shaking hands with Trump.

 

To support his election bid this time, Netanyahu offered a deal to the most radical anti-Arab Israeli parties, which had thus far failed to win enough votes to be represented in the parliament, the Knesset. He orchestrated the merger of three far right parties into one bloc, the “Union of Right-Wing Parties”, and promised them two cabinet posts if he wins. One of those parties, Jewish Power, advocates the segregation of Jews and Arabs, who make up 20% of Israelis, and economic incentives to rid Israel of its Arab citizens. Jewish Power holds annual memorials for Baruch Goldstein, who murdered 29 Muslims at prayer in 1994. Imagine an American politician allying with a party which celebrates the murderous accomplishments of Dylann Roof.

 

Netanyahu recently said, “Israel is not a state of all its citizens,” but rather “the nation-state of the Jewish people alone.” That makes a “one-state solution” impossible, because non-Jews would automatically be second-class citizens. Netanyahu’s victory shows that the creation of a Palestinian state is less and less likely, as the land for such a state is increasingly seized by Israel.

 

While most Israelis also say they support a two-state solution, their real politics makes this support meaningless. A poll of Israelis in 2017 showed Jews leaning heavily to the right and extreme right. A more recent poll showed greatly increasing support for annexation: 16% support full annexation of the West Bank with no rights for Palestinians; 11% support annexation with rights for Palestinians; 15% support annexation of only the part of the West Bank that Israel currently fully controls, about 60% of it. About 30% don’t know and 28% oppose annexation.

 

Meanwhile, the uprooting of Arabs and confiscation of their land continue as Jewish settlements expand. While the West Bank is highlighted in the news, the Israeli policy of expelling native Arabs from their homes has also been taking place for decades in the Negev desert in southern Israel. Bedouin communities, many of which predate the founding of the Israeli state, have been systematically uprooted as part of an Israeli plan of concentrating all Bedouins into a few towns, in order to use their land for Jewish settlements and planned forests. The Bedouin communities are “unrecognized”, meaning that the Israeli government considers them illegal. Illegal Jewish settlements in that region have been recognized and supported, while much older Bedouin communities have been labeled illegal and demolished or slated for demolition. Essential services, like water and electricity, have been denied to the agricultural Bedouin villages in order to force their citizens to move to the new urban townships.

 

American Jews are overwhelmingly liberal. Polls since 2010 show over two-thirds supporting Democrats for Congress, rising to 76% in 2018. This long-standing liberalism meant broad support among American Jews for the civil rights struggle during the 20th century. Now the open discrimination against Arabs by the Israeli state, which in some ways resembles the former South African apartheid system, reduces sympathy for Israel.

 

Surveys of American Jews have demonstrated a consistent support for a two-state solution. Since 2008, about 80% of American Jews support the creation of a Palestinian state in Gaza and the West Bank. 80% also agree that a “two-state solution is an important national security interest for the United States.” Many factors have been moving American Jews away from support of Israel. The close family connections between Jews in America and Israel after World War II have diminished over the past half-century. The continued dominance of Israeli politics by ultra-Orthodox religious policies has worn out the patience of more secular American Jews in Conservative and Reform congregations.

 

In fact, the greatest support for hard-line Israeli policies has not been from American Jews, as Ilhan Omar recently implied, but from evangelical Christians who support Trump. After Netanyahu talked about annexing West Bank land, nine major mainstream American Jewish groups wrote to Trump asking him to restrain the Israeli government from annexation, saying that “it will lead to greater conflict between Israelis and Palestinians.”

 

The drifting apart of American Jews and Israelis is a tragic development, but perhaps an inevitable one. As Jews gradually assimilated into American democracy, they congregated at the liberal end of the political spectrum, feeling kinship with other minorities which experienced discrimination. American Jewish religious politics affirmed the traditional Jewish ethical ideas of justice, truth, peace, and compassion. Israeli Jews have faced a radically different environment. Although many of the early Israeli settlers and leaders came from the leftist European labor tradition, decades of conflict with Arab neighbors, in which both sides perpetrated countless atrocities, have led to hardening attitudes of self-defense and hatred for the other.

 

Jews in Israel support politicians and policies that I reject as abhorrent. That is a personal tragedy for me. The larger tragedy is that there appears to be no solution at all to the Israeli-Palestinian conflict.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/blog/154204 https://historynewsnetwork.org/blog/154204 0
I Stuck with Nixon. Here’s Why Science Says I Did It.

Richard Nixon surrenders to reality and resigns, August 9, 1974

Rick Shenkman is the former publisher of the History News Network and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, January 2016). You can follow him on Twitter. He blogs at stoneagebrain. This article was first published by the  Daily Beast.

Will Donald Trump’s supporters ever turn on him? I think I know the answer. It’s partly because I’ve been in their place.

During Watergate I was a die-hard Nixon dead-ender. I stuck with him after the Saturday Night Massacre in the fall of 1973 and the indictments of Nixon aides H.R. Haldeman and John Ehrlichman in 1974. Not until two months before Nixon resigned did I finally decide enough’s enough.

What was wrong with me? I’ve been haunted by that question for decades. 

I can clear up one thing immediately. I didn’t support Nixon out of ignorance. I was a history major at Vassar during Watergate and eagerly followed the news. I knew exactly what he’d been accused of.

The fact is the facts alone didn’t matter because I’d already made up my mind about him. My fellow Vassar students—all liberals, of course—pressed me to recant. But the more they did, the more feverish I became in my defense. I didn’t want to admit I was wrong (who does?) so I dreamed up reasons to show I wasn’t—a classic example of cognitive dissonance in action. 

A pioneering study by social psychologist Elliot Aronson conducted in the 1950s helps explain my mental gymnastics. Young college women invited to attend a risqué discussion of sexuality were divided into two groups. One group was put through a preliminary ritual in which they had to read aloud a list of words like “prostitute,” “virgin,” and “petting.” The other group had to say out loud a dozen obscenities including the word “fuck.” Afterwards, the members of both groups were required to attend a discussion on sex, which is what had been the draw. But it turned out they had all been duped. The discussion wasn’t risqué. The subject turned out to be lower-order animal sexuality. Worse, the people leading the discussion spoke in a monotone voice so low it was hard to follow what they were saying. 

Following the exercise the students were asked to comment on what they had been through. You might expect the students who went through the embarrassing rite of speaking obscenities to complain the loudest about the ordeal. But that isn’t what happened. Rather, they were more likely to speak positively about the experience.

The theory of cognitive dissonance explains why. While all of the subjects in the experiment felt unease at being duped, those for whom the experience was truly onerous felt a more compelling need to explain away their decision to take part. The solution was to reimagine what had happened. By rewriting history they could tell themselves that what had appeared to be a bad experience was actually a good one. Dissonance begone.

This is what I did each time one of my Vassar friends pointed to facts that showed Nixon was lying. 

Neuroscience experiments in the 21st century by Drew Westen show what happens in our brain when we confront information at odds with our commitments. In one study, supporters of President George W. Bush were given information that suggested he had been guilty of hypocrisy. Instead of grappling with the contradiction they ignored it. Most disturbing of all, this happened out of conscious awareness. MRI pictures showed that when they learned of Bush’s hypocrisy, their brains automatically shut off the “spigot of unpleasant emotion.” (It’s not a uniquely Republican trait; the same thing happened with supporters of John Kerry.) 

In short, human beings want to be right and we want our team to win. But we knew all that, right? Anybody who’s taken a Psych 101 class knows about confirmation bias: that humans seek out information that substantiates what they already believe; and bounded rationality: that human reason is limited to the information sources to which we are exposed; and motivated reasoning: that humans have a hard time being objective. 

But knowing all this isn’t enough to understand why Trump voters are sticking with Trump.

What’s required instead is a comprehensive way to think about the stubbornness of public opinion and when it changes. Until a few decades ago no one had much of a clue what a comprehensive approach might look like. All people had to go on was speculation. Then scientists operating in three different realms — social psychology, neuroscience, and political science — began to delve into the workings of the human brain. What they wanted to know was how we learn. The answer, most agreed, was that the brain works on a dual-process system, a finding popularized by Daniel Kahneman, the Nobel prize-winning Princeton psychologist, in the book Thinking, Fast and Slow.

One track, which came to be known as System 1, is super-fast and happens out of conscious awareness, the thinking you do without thinking.

There are two components to System 1 thinking. One involves what popularly is thought of as our animal instincts, or what social scientists refer to, with more precision, as evolved psychological mechanisms. Example: the universal human fear of snakes. The other involves ways of thinking shaped by habit. The more you perform a certain task, the more familiar it becomes and the better you get at it without having to think about it.

Donald Trump likes to say that he goes with his gut. What he’s saying, likely without knowing it, is that he has confidence in his System 1. This is not exceptional. Most of us trust our instincts most of the time. What distinguishes Trump is that he seems to privilege instinct over reason nearly all of the time.

The second track, System 2, is slower and allows for reflection. This mode, which involves higher-order cognitive thinking, kicks in automatically when our brain’s surveillance system detects a novel situation for which we aren’t prepared by experience. At that moment we shift from unconscious reaction to conscious thinking. It is System 2 that we rely on when mulling over a difficult question involving multiple variables. Because our brain is in a sense lazy, as Kahneman notes, and System 2 thinking is hard, our default is System 1 thinking.

One thing that’s worth noting about System 1 thinking is that our brains are essentially conservative. While humans are naturally curious about the world and we are constantly growing our knowledge by, in effect, adding books to the shelves that exist in our mind’s library, only reluctantly do we decide to expand the library by adding a new shelf. And only very rarely do we think to change the system by which we organize the books on those shelves. Once we settle on the equivalent of the Dewey Decimal System in our mind, it’s very hard to switch to another system. This is one of the main reasons why people are almost always reluctant to embrace change. It’s why inertia wins out time and time again.

But change we do, thanks to System 2. But what exactly triggers System 2 when it’s our politics that are on the line? Social scientists finally came up with a convincing explanation when they began studying the effect of emotion on political decision-making in the 1980s.

One of the pioneers in this research is George Marcus. When Marcus was starting out as a political scientist at Williams College he began to argue that the profession should be focusing more on emotion, something they’d never done, mainly because emotion is hard to quantify and count and political scientists like to count things. When Marcus began writing papers about emotion he found he couldn’t find editors who would publish them. 

But it turned out his timing was perfect. Just as he was beginning to focus on emotion so were neuroscientists like Antonio Damasio. What the neuroscientists were learning was that the ancient belief that emotion is the enemy of reason is all wrong. Rather, emotion is the handmaiden of reason. What Damasio discovered was that patients with a damaged amygdala, the seat of many emotions, could not make decisions. He concluded: The “absence of emotion appears to be at least as pernicious for rationality as excessive emotion.” 

If emotion is critical to reason, the obvious question became: which emotion triggers fresh thinking? Eventually Marcus and a handful of other political scientists who shared his assumption that emotion is important to decision making became convinced that the one that triggers reappraisals is anxiety. Why anxiety? Because it turned out that when people realize that the picture of the world in their brain doesn’t match the world as it actually exists, their amygdala registers a strong reaction. This is felt in the body as anxiety.

Eventually, Marcus and his colleagues came up with a theory that helps us understand when people change their minds. It became known as the Theory of Affective Intelligence (later: the Theory of Affective Agency). The theory is straightforward: The more anxiety we feel the more likely we are to reconsider our beliefs. We actually change our beliefs when, as Marcus phrases it, the burden of hanging onto an opinion becomes greater than the cost of changing it. Experiments show that when people grow anxious they suddenly become open to new information. They follow hyperlinks promising fresh takes and they think about the new facts they encounter.

How does this help us understand Trump supporters? It doesn’t, if you accept the endless assertions that Trump voters are gripped by fear and economic anxiety. In that case, they should be particularly open to change. And yet they’re as stuck on Trump as I was on Nixon.

The problem isn’t with the theory. It’s with the fear and anxiety diagnosis. 

Humans can hold multiple feelings at odds with one another simultaneously, but research shows that only one emotion is likely to affect their politics. The dominant emotion characterizing so-called populist voters like those attracted to Trump is anger, not fear. This has been found in studies of populists in France, Spain, Germany, and Britain, as well as the United States.

If the researchers are right that populists are mostly angry, not anxious, their remarkable stubbornness immediately becomes explicable. One of the findings of social scientists who study anger is that it makes people close-minded. After reading an article that expresses a view contrary to their own, people decline to follow links to find out more information. The angrier you become, the less likely you are to welcome alternative points of view. 

That’s a powerful motive for ignoring Trump’s thousands of naked lies.

Why did I finally abandon Nixon? For months and months I had been angry over Watergate. Not angry at Nixon, as you might imagine, but angry at the liberals for beating up on him. Nixon fed this anger with repeated attacks on the people he perceived as his enemies. As long as I shared his anger I wasn’t prepared to reconsider my commitment to his cause. 

But eventually there came a point when I stopped being angry and became anxious. 

I would guess that what happened is that over time Nixon’s attacks came to seem shopworn and thin. Defending him became more of a burden than the cost of abandoning him.

If I am right about the circuitous path I took from Nixon supporter to Nixon-basher, there’s hope that Trump supporters will have their own Road to Damascus epiphany. Like me, they may finally tire of anger, though who knows. Right-wing talk radio and Fox News have been peddling anger for years and the audience still loves it.

It took me 711 days from the time of the Watergate burglary to my break with Nixon, when I resigned from a committee defending him, to come to my senses. As this is published, it has been 812 days since Trump became president. And there’s little indication that Trump voters have reached an inflection point.

Any of a number of disclosures could disillusion a substantial number of them. We have yet to read the full Mueller report. Nor have we yet seen Trump’s tax returns, which might prove politically fatal if they show he isn’t really a billionaire or if they prove his companies depended on Russian money. (As Mitt Romney suggested, the returns likely contain a bombshell.) 

If Trump’s disclosures suggest to his supporters that they were chumps to believe in him, his popularity no doubt would begin eroding. And already there’s evidence his support has weakened. In January, 51 percent of GOP or GOP-leaning voters said they considered themselves more a supporter of Donald Trump than of the Republican Party. Two months later the number had declined to 43 percent. If this slippage is because more supporters feel embarrassed to come out as full-blown Trumpies, he may be in trouble come election day.

In the end, politics is always about the voters. Until now, Trump has made his voters by and large feel good about themselves by validating their anger. But there remains the possibility that in the coming months disclosures may make them feel that they have been conned, severely testing their loyalty. If the anger they feel either wears off or is redirected at Trump himself their amygdala should send them a signal indicating discomfort with the mismatch between the known facts and their own commitments.

This presupposes that they can get outside the Fox News and conservative talk bubble so many have been living inside. Who knows if they will. It is worth remembering that even in Nixon’s day, millions remained wedded to his lost cause even after the release of the smoking-gun tape. On the day he resigned, August 9, 1974, 50 percent of Republicans still supported him even as his general approval dropped to 24 percent.

To sum up: Facts finally count if enough loyalists can get past their anger to see the facts for what they are. But people have to be exposed to the facts for this to occur. And we can’t be sure that this time they will be.

 

 

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/blog/154203 https://historynewsnetwork.org/blog/154203 0
The Sorrow of Watching Notre Dame Burn

 

On the 20th day of Brumaire during the Second Year of the revolutionary order, a spectacle was held inside the newly consecrated Temple of Reason. Upon the altar of what had once been the magnificent cathedral Notre-Dame de Paris, at the very heart of the greatest city in Christendom, the religious statues were stripped away (some decapitated like the heads of the overthrown order) and the whole building was turned over to a festival for what the most radical of Jacobins called the “Cult of Reason.” In the hope that this atheistic faith-of-no-faith would become the state-sponsored religion of the new regime, the revolutionaries staged their own observance, with young girls in tri-colored sashes performing a type of Morris dance about a statue of the Goddess Reason. Such was just another occurrence among the competing factions of the Revolution, which saw the dechristianization of France, including not just the iconoclasm of smashed stained glass and white-washed images, but the execution of perhaps 30,000 priests. Less than a decade later, Mass would once again be celebrated upon Notre-Dame’s altar.

Within the shadow of its spire – which as of today no longer stands – the great Renaissance essayist Montaigne would have walked. By the massive rose window which filtered natural light into a ring of cobalt blue and emerald green, solar yellow and fire red, Rene Descartes may have contemplated his Cogito. By its gothic flying buttresses and underneath its simultaneously playful and disquieting gargoyles, the novelist Victor Hugo both celebrated her stone walls and arches and advocated for her 19th century restoration. In 1323 the scholastic theologian John of Jandun wrote of the cathedral that she “deservedly shines out, like the sun among stars.” And through it all, over a millennium of Parisian history, the cathedral stood guard from its island in the Seine. Which is not to say that the cathedral hadn’t been damaged before, or that it wouldn’t be damaged again. Notre-Dame withstood the Wars of Religion which burnt across France during the sixteenth century and Hitler’s orders to leave not a stone of Paris standing when the Nazis retreated at the end of the Second World War, and yet the cathedral endured. Since the twelfth century Notre-Dame has survived, and while we watch with broken hearts as her spire collapses into the burning vaulted roof during this mournful Holy Week, we must remember that Notre-Dame will still be standing tomorrow.

Sorrow for the destruction of something so beautiful, so perfect, must not obscure from us what a cathedral is. A cathedral is more than the granite which composes her edifice, more than the marble which lines the nave. More than the Stations of the Cross and the statues; more than the Crucifix which punctuates the altar. A cathedral is all of that, but it is also an idea; an idea of that which is more perfect than this fallen world of ours. More mysterious, and more powerful, and more beautiful. When we see push notifications alerting us to the fire of this April 15th, when we see that tower which points to the very concept of God collapsing above her nave, it can feel as if civilization itself is burning. As if we are watching the Library of Alexandria be immolated on Facebook Live, or reading live tweets of the dissolution of the monasteries. In this age of uncertainty, of rage, of horror, and of violence; of the decline of democracy and the heating of the planet; it can feel as if, in Notre-Dame’s fire, we are watching the very world itself be engulfed. Which is why it’s so important to remember what a cathedral is, what Notre-Dame is.

Skeptics can reduce that which is associated with the phrase “High Church” to an issue of mere aesthetics, as if in our post-Reformation, post-secular world the repose of a cathedral is simply a mood or a temper and not a profound comment in its own right. An allegiance to the sacredness of silence, to the holiness of light refracted onto a cold stone floor. Minimalism makes its own offers and promises, and requires its own supplication, and the power of simplicity and thrift should not be dismissed. But a cathedral makes its own demands – a cathedral is beautiful. The intricacy of a medieval cathedral is not simply an occasion for art historians to chart the manner in which the romanesque evolved into the gothic, or for engineers to explicate the ingenuity of the flying buttress. Notre-Dame isn’t simply a symbol of Paris, nor a landmark by which a tourist can situate themselves. A cathedral is larger than the crowds which line up to take selfies in front of it; a cathedral is more significant than the gift shops and food trucks which line the winding cobble-stoned streets that lead up to it. A cathedral is an argument not only about God but also about humanity and the beauty we’re sometimes capable of.

Tomorrow the world will be less beautiful than it was this morning, and this in a world with precious little beauty to spare. That Notre-Dame should be burning this April evening is a calamity, a horror. It is the loss of something that is the common treasury of humanity, which belongs not entirely to the people of France, nor only to those who are Roman Catholics, but which rather sings of the yearnings of all women and men, living in a world not of our own creation but trying to console each other with a bit of beauty, a bit of the sacred. To find that meaning in the cathedral’s silence, in that movement of light and shadow upon the weathered wooden pews and the softness of the grey walls. The 17th century English poet George Herbert wrote of “A broken ALTAR… Made of a heart and cemented with tears,” words that may well describe the crowds gathering along the Seine and singing hymns to our burning cathedral this spring night. Herbert’s poem is an apt explanation of what a cathedral is. A cathedral is a person. Her spine is the nave, and the transept her arms; the window her face, and the spire her head – the altar a heart. And though a cathedral is as physical as our finite bodies, threatened by incendiaries and crowds, by entropy and fire, its soul is eternal.

If there is something to remember, it’s that in the era before steel and reinforced concrete an anonymous mason would begin work with his brothers on a cathedral that his children would most likely never see completed. Perhaps his grandchildren would never live under its full height either. To work on a cathedral was a leap into a faith that we can scarcely imagine in our era: to work towards a future you’d never see, and yet to embrace that which is greater, more sublime, more perfect than you are. Our attitude of disposable consumerism and exploitive capitalism makes such an ideology a foreign country to us, yet if we’re to solve any of those problems that face us today – from climate change to the restoration of democracy – it must be with the faithful heart of a medieval mason who toils with the knowledge that a spire will rise above Paris – again.

Sun, 19 May 2019 07:02:43 +0000 https://historynewsnetwork.org/article/171724 https://historynewsnetwork.org/article/171724 0
Benny and Joon and a Good Look at Schizophrenia

 

How do you stage a charming musical about schizophrenia? Was there ever a dimmer, sadder, more troubling topic for a play?

Just ask the folks who run the Paper Mill Playhouse, in Millburn, New Jersey, where the new musical Benny and Joon opened on Sunday. It is a delightful look at modern schizophrenia with three stars who are not only entertaining, but work hard to examine schizophrenia and talk about it on stage with candor, and with smiles, too.

Benny and Joon is the musical version of the 1993 movie of the same name that starred Johnny Depp. Schizophrenia was such a controversial topic in that year that the word schizophrenia was never mentioned in the script. Now, thankfully, it is.

Benny and Joon are brother and sister (he’s in his early twenties and she’s 20 or so). Joon suffers from schizophrenia and was a handful for her parents. They were killed in a car crash when Benny was 18. Now, with them gone, Benny, who runs a car repair shop, has to raise her. All of a sudden, after a bad night playing poker, Benny has to provide room and board for a kooky young man, Sam, who comes to live with them. Sam, who wears an odd-looking hat, sees himself as the reincarnation of Charlie Chaplin and Buster Keaton and mimics them. In one nice bit he uses dinner rolls as people and has them dance.

It starts with Sam’s arrival. Joon is relentless in her schizophrenic behavior and although Benny loves her to death, she drives him crazy. He faces the very real possibility of putting her into a group home with other mentally ill people. Joon, of course, wants to keep living with him and continue her amateurish career as a painter. He does not know what to do and consults Joon’s psychiatrist. 

The story of the play is Benny’s fear of putting Joon into a home or, later in the play, a mental institution. Throughout the story, Joon exhibits numerous schizophrenic tendencies. She is moody, very happy and then very sad, convinced people are trying to hurt her, fearful of what will happen to her. She’s impulsive. She doesn’t listen to people. She’s argumentative and possessive. There is no typical schizophrenic, but Joon exhibits the qualities of many people seen as such.

Yet, through all of this, you love her.

Kirsten Guenther wrote the book; the music and lyrics are by Nolan Gasser and Mindi Dickstein. They use their words and songs to suggest that while Joon might need help, she may not need all of the help that others insist on. They also get you to root for Joon. Isn’t she like so many quirky people we all know? Don’t put her away, people will say, just put up with her.

Sam, as he bops around the stage in a very goofy way, starts to admire, and then love, Joon. It’s an improbable relationship, to be sure, but so what? Where will they live, Benny asks his sister? The answer, as she frets, is well, who knows. We’ll get by.

Big brother Benny is scared to death. He is so, so worried about his sister and needs to protect her. What’s he going to do?

The success of the play is the work of the three stars, Claybourne Elder as Benny, Hannah Elless as Joon and Bryce Pinkham as the slightly nutty but thoroughly adorable Sam. They play their characters as lovable people trying to ward off schizophrenia.

The story is not just about schizophrenia, but about how it affects the families of its victims. It is the story, too, of how all mental illnesses affect families. We need more of these stories. There are, and have been throughout history, tens of thousands of moms and dads, brothers and sisters who have to live with and care for mentally ill loved ones. It is a struggle, and Benny and Joon shows that in a majestic way. You need to love and support the victims of mental illness, not just toss them into a group home.

The music in Benny and Joon is OK, but none of the songs are memorable. Together, though, they create a nice atmosphere for the story. Some of the songs are painful, as they help to tell the story of the brother and sister and their wacky friend Sam.

The show’s director, Jack Cummings III, gets fine work from his stars, Elder, Elless and Pinkham, but also gets fine performances from the other actors in the play - Colin Hanlon, Paolo Montalban, Conor Ryan, Natalie Toro, Jacob Keith Watson, and Tatiana Wechsler.

Schizophrenia is a relatively recently named illness; doctors did not coin the term until 1908. Despite the popular notion of a “split personality,” medical specialists describe schizophrenics much as Joon appears here: in general, wildly eccentric, convinced other people are trying to get them to do things, slightly paranoid, and embattled against just about everybody.

Benny and Joon, in the end, is both a sobering look at schizophrenics and a wonderful look at a pair of siblings who fight and feud, with the troubles of schizophrenia added, but, through it all, love each other.

We need more plays like this one. And more Bennys and Joons in this world, too.

 

PRODUCTION: The play is produced by the Paper Mill Playhouse. Scenic and Costume Design: Dane Laffrey, Sound: Kai Harada, Lighting: R. Lee Kennedy, Choreography: Scott Rink. The play is directed by Jack Cummings III. It runs through May 5.

The Electoral College and the Myth of a Proslavery Ploy

 

As the New York Times at present lacks the proper format for a debate over an Op-Ed of mine that it published on the origins of the Electoral College, “The Electoral College Was Not a Pro-Slavery Ploy,” I am grateful to History News Network for giving me the opportunity to reply to Akhil Reed Amar and his Op-Ed, “Actually, the Electoral College Was A Pro-Slavery Ploy.” I look forward to continuing the debate on slavery and the Constitution, which certainly includes the inception of the Electoral College but also involves the larger and, I believe, more important historical issues raised in my recent book, No Property in Man: Slavery and Antislavery at the Nation’s Founding.

    

Earlier this month, I wrote an Op-Ed piece for the New York Times disputing claims that the Electoral College originated as a slaveholders’ ploy at the nation’s founding in 1787. The issue has become important recently as part of a larger debate about the Electoral College. In the wake of two presidential elections, in 2000 and 2016, where the electoral system overruled the popular will, many Americans, especially inside the Democratic Party, have declared that the system ought to be seriously amended, if not abolished in favor of direct popular election of the president. I have long believed the Electoral College at least needed fixing. The point of my Op-Ed, though, was a different one, having to do with history.

Last September, I published a book that challenges the prevailing wisdom about the role of slavery at the Federal Convention in 1787. Almost as an aside, the book briefly discusses how the framers created the Electoral College and argues that, on this matter at least, the prevailing wisdom is correct: the Electoral College, I wrote, arose out of Southern delegates’ efforts to give as much extra power as they could to the Southern slaveholding states. Since then, in part because of the current public debate over the Electoral College, I closely re-examined the issue and concluded that, like most of my fellow American historians, I have been wrong about slavery and the Electoral College. 

In advance of the publication of a paperback edition of my book this fall, I duly prepared a new preface that explained my change of mind. As the general debate about the Electoral College heated up, though, and as the Electoral College’s opponents began decrying its origins in slavery, I thought I ought to write an essay on the subject in a more public venue, lest anything I wrote in my book be taken to support a claim I no longer believe.  I wrote that essay and the Times published it – and very quickly thereafter, the Times published another Op-Ed by Akhil Reed Amar disputing and dismissing my piece. 

Not surprisingly, I think that Amar’s response is mistaken. His account turns the origins of the Electoral College into a simple story that fits well with commonplace current views of slavery and the U.S. Constitution. Disarmingly straightforward, it seems almost self-evidently true. But in order for his claims to hold up, Amar’s story has to ignore a great deal that either does not fit or that flatly contradicts him: for example, that the Electoral College emerged as an alternative to another proposed system that also protected slavery; or the fact that the leading pro-slavery states actually voted against what would become the Electoral College, even though it promised to give more protection to slavery than the system they supported.  

On closer examination, though, Amar’s response is even more deeply flawed and in fundamental ways. But before I get into those defects, let me provide a basic chronological narrative of how the framers created the Electoral College.

 

*************************************************

 

On June 2, 1787, shortly after the convention to frame the new Constitution opened in Philadelphia, the delegates endorsed the creation of a national executive, the president, who would be elected by the national legislature and serve a single seven-year term. In mid-July, though, when the delegates began to favor making the executive eligible for re-election, the consensus for congressional election of the president faltered. 

From the early days of the convention, some delegates had argued in favor of having the executive elected directly by the people at large – a “people” then restricted to white men who met certain property qualifications, but still many orders of magnitude larger than the as-yet only vaguely envisaged Congress. Previously marginalized in the convention’s debates, the advocates for direct election returned to the fray, including such distinguished delegates as Gouverneur Morris, who was representing Pennsylvania. American democracy, these men insisted, had reached a point where ordinary voters could and should choose the president. The Congress would be much more vulnerable to bribery and other forms of corruption than what Morris called “the people at large…the freeholders of the County.” 

The convention debated the matter sharply.  At one point, Hugh Williamson, a non-slaveholding delegate from North Carolina, mentioned that direct election would hurt Virginia, the state with the most slaves, because enslaved men “will have no suffrage.” The preponderance of the objections, though, from southerners as well as northerners, echoed the disdainful observation of George Mason – another distinguished delegate and an actual Virginian -- that having the people at large choose the president would be like referring “a trial of colours to a blind man.”  On July 17, the convention defeated a motion for direct election by 9 states to 1.       

A third group of delegates, however, supported an intermediate elector plan, as a middle ground that would provide a more attractive alternative to legislative selection than direct election.  Their basic idea was to give the authority for electing the president to independent electors, possibly chosen by the people, possibly by the state legislatures. The germ of such a plan had appeared in the convention debates much earlier, when James Wilson of Pennsylvania realized that a direct election proposal he was offering would fail. But after the dismal defeat of the direct election proposal on July 17, the supporters of an electoral system, including some like Wilson who had previously spoken in favor of direct election, found their collective voice. 

On July 19, William Paterson of New Jersey, who happened to be a critic of slavery, offered a proposal “that the Executive should be appointed by Electors to be chosen by the States”; Wilson, who was more keenly antislavery than Paterson, chimed in that it was now the “unanimous sense” of the convention that the executive be chosen by “an election mediately or immediately by the people.” James Madison of Virginia, the slaveholder later known as the father of the Constitution, gave a speech that, while it praised the defeated idea of direct election, instead backed an electoral system. A direct system, he observed, for all of its strengths, would weaken the slaveholders’ power in a way that the electoral system under consideration would not. Momentum for independent electors grew. A motion to replace legislative election of the president with an electoral system passed easily, 6 states to 3, with one state divided.  

It is important to pause here and explain the place of slavery in the different proposals for choosing the president. The congressional selection plan gave the slaveholding states a singular advantage as the convention had already approved the notorious three-fifths clause. This clause stated that apportionment of the seats in the lower house of Congress would be calculated by including three-fifths of the total of each state’s enslaved population. As the three-fifths addition would apply if Congress was selected to choose the president, the congressional mode had a powerful appeal to the South. By contrast, as Madison pointed out in his speech, a direct election system would mean that the non-voting slave population would count for nothing in selecting the president, which Madison said would prevent any Southerner from winning the presidency. The electoral system under consideration, however – a detail that Madison did not mention -- would count all of each state’s inhabitants, slave and free, toward apportioning electors, which in principle would give the slaveholding states even more additional votes than the congressional system gave them with the three-fifths rule.

The really crucial point to remember here is that the electoral system began gaining support as an alternative not to direct election but to the congressional plan, which also protected slavery. Because, except to its strongest supporters, direct election seemed dead and buried, the convention did not face a choice between a system that favored slavery and one that did not, but between two systems that favored slavery in different ways. This fact is essential to understanding the convention’s decision to adopt the electoral system and everything that followed. 

Although it was momentarily the convention majority’s choice, the electoral system still had some formidable foes, chiefly in the three most ardently proslavery states, North Carolina, South Carolina, and Georgia. That these states were so opposed to an electoral system is, to say the least, ironic given today’s conventional historical wisdom, but those states had their reasons. Lower South delegates instead favored what had been the approved system, selection of the president by Congress. Their calculations had nothing to do with protecting slavery; indeed, the electoral system promised to offer them additional votes for the president above and beyond what their favored congressional system did. Rather, they scorned the electoral system on elitist grounds, charging that the independent electors, unlike congressmen, as Hugh Williamson put it, “would not be the most respectable citizens” but men of inferior rank, open to bribery and other forms of corruption. And when the convention approved the electoral system, the only three states that voted against it were North Carolina, South Carolina, and Georgia. The lower South states, though, would not easily abide their loss, and they joined in mounting a counterattack. 

Five days after it approved the electoral system, the convention, with the full support of the lower South, reversed itself and rejected the electoral system -- seemingly for good – and restored the choice of the executive to Congress. The day after that, James Madison, who now believed an electoral system was forever doomed, dissented from the convention’s switch and strongly endorsed direct popular election, with hope that the system crushed a few days earlier might yet be revived. 

For more than a month thereafter, the convention continued to support congressional election of the president, but a major dispute broke out over procedure; and so, in the convention’s waning days, a special committee of eleven, appointed to settle the convention’s still unfinished business, offered a comprehensive plan of its own. It was this committee that revived the electoral system idea and effectively invented the Electoral College we know by proposing, for the very first time, apportioning the electors according to each state’s combined representation in the House and Senate. 

Slavery would seem to have been irrelevant to the special committee’s concerns, as its proposal offered the South an inflated proportion of electors just as the southerners’ preferred congressional system did. As Gouverneur Morris, a member of the committee, explained to the convention, the group was motivated by alarm at “the danger of intrigue & faction” should the legislature be authorized to choose the president. Yet Morris also remarked that, with nobody “satisfied with an appointment by the Legislature,” “many” committeemen “were anxious even for an immediate choice by the people.” (These members almost certainly included Morris himself.)  

Morris’s speech raises intriguing questions and possibilities about the committee’s private discussions. With appointment by Congress on the ropes, did Morris seize the opportunity to resuscitate his arguments in favor of direct popular election? Might James Madison, who was also on the committee, have replied that, although direct election was the “fittest” system, its disadvantages to the Southern slave states recommended an electoral system instead? Might another committee member, the New Yorker Rufus King, have restated his reasons, shared in by William Paterson, for endorsing direct election? In the absence of more detailed evidence, it is impossible to know. What is clear is that, as a majority of the committee opposed direct election, it was no more in the cards now than it ever had been. 

At all events, the convention at last approved the committee’s recommendations, with modifications, eleven days before the convention completed its work. The only opposition came from North Carolina and South Carolina, fighting to the bitter end for election of the president by the Congress. 

 

*************************************************

 

Akhil Amar’s response to my Op-Ed evades virtually all of the substantive points I made about this history. Instead, inside a brief space, Amar offers three assertions to support the view that the Electoral College was a proslavery ploy. First, he says, James Madison explained to the convention that a direct popular vote for president was a “non-starter” for the South because, “as slaves couldn’t vote,” the South would lose every time. Second, Madison’s “political calculation” is why the convention rejected a direct vote system. Third, in lieu of a direct vote, the framers considered an indirect electoral system which counted slaves – a system, in Amar’s words, that “might sell in the South.” “Thus were planted,” Amar claims, “the early seeds of an Electoral College system.”

Practically everything in this account, however, is either illogical, false, invented, or factually incomplete. Let’s start with Amar’s first assertion that James Madison supposedly described a popular vote system as a “political nonstarter.”  Amar is referring to the speech that Madison gave on July 19, the day the convention approved – temporarily, as it turned out – an electoral system. In that speech, Madison indeed recommended an electoral system, noting that the defeated direct election system he still deemed the “fittest” would hurt the slave South “on the score of the Negroes.”  “The substitution of electors,” Madison said, would correct for these problems.  

But to pluck that speech out of context, as Amar does, is to distort not just Madison’s thinking but the purport of what he said. Recall that five days after Madison delivered that speech, the convention reversed itself and rejected the electoral system. Then recall that, a day later, Madison, now believing the convention would never approve an electoral system, dissented from the convention’s switch and strongly endorsed direct popular election, still hoping that it had a chance -- the system that Amar says Madison ruled out completely because it was unacceptable to the South. 

Slavery, as it happened, was not the only thing or even the main thing on Madison’s mind.  Having Congress elect the president might give the slaveholding South extra votes for president, but he believed it would also invite corruption, which overrode whatever benefits the system might afford the slaveholding states. By contrast, he believed a direct vote system was preferable, even if it diminished Southern power. If it came down, in his mind, to a choice between the health of the republic and power considerations for the slaveholding states, he would choose the former; or, as he put it, “local considerations must give way to the general interest.” As a southerner, he concluded, “he was willing to make the sacrifice.”

Amar tells a different tale. According to him, Madison dismissed the direct voting system as a “nonstarter” because it hurt the slaveholding South. The evidence shows this is false. Although he would have preferred an electoral system for reasons having to do with slavery, Madison hardly rejected direct voting because it was a “nonstarter” for the slaveholders or for any other reason. As soon as an electoral system seemed to be off the table, he returned to supporting a direct system, despite the long odds against it, rather than support a congressional system that was favorable to slavery but also vulnerable to corruption.

Amar’s second assertion is more consequential but equally wrong; indeed, it is an invention. By his account, Madison’s speech of July 19 explained to the convention why direct election would have been a “dealbreaker” for the slaveholding South, and the convention subsequently rejected direct election of the president on that basis. The trouble is, as the basic narrative shows, by the time Madison delivered this speech, he could not have been warning the convention against direct election. The reason is simple: the delegates had already soundly defeated direct election two days earlier.   

Here’s what really happened. On two separate occasions, the convention crushed proposals for direct election: the first time, on July 17, by nine states to one; the second much later, on August 24, by nine states to two. On the first of these occasions, the North Carolinian Williamson made his stray remark about how a direct system would hurt the largest slaveholding state, Virginia, but this was the only time on either occasion that the issue of slavery arose. There is no evidence that the northern states which voted “nay” did so out of deference to any slaveholder’s “dealbreaker” objections, explicit or perceived. On the other hand, as we have seen, there is plenty of evidence that the northerners – and many if not most southerners as well -- regarded direct election, in the words of Elbridge Gerry of Massachusetts, as a “radically vicious” system, in which an uninformed people “would be misled by a few designing men.” 

Amar’s third assertion is illogical, incomplete, and invented, all at the same time. He asserts that, at Madison’s prompting, the delegates or some of them, began wondering: “if slaves could somehow be counted in an indirect system, maybe at a discount (say, three-fifths), well, that might sell in the South.” Here, Amar claims, begins the real story of the inception of the proslavery Electoral College. Yet the scheming that Amar imputes to unnamed delegates about an indirect system is pure fiction. Moreover, as we have seen, the proslavery lower South actually rejected the idea of an electoral system despite its relative advantages to slaveholders, preferring a system of congressional election based on a formula that would have provided the slaveholding states a smaller number of electors. 

In all, Amar wants us to believe that the delegates sowed the proslavery seeds of the Electoral College when Madison explained to them that the South would never agree to a direct election system – a system the convention had, in fact, already defeated. He would have us bypass the fact that the most vociferous proslavery states, instead of rallying to that plan, resisted it in favor of another. And he would have us overlook that the plan the proslavery states favored promised to give them a smaller proportion of the vote for president than the electoral plan they opposed.     

To be sure, there are some traces of truth in Amar’s argument. First, because the framers tolerated slavery where it existed from the very beginning of their deliberations, slavery touched and often distorted many aspects of the new federal government. In the case of electing the president, the framers’ toleration led to the Southern slave states getting extra power, derived from the three-fifths compromise struck early on in the proceedings. Second, during the debates over the mode of electing the executive, two Southern delegates did note that a system of direct election of the president would hurt the slave states. But all of this put together is still a far cry from demonstrating that the Electoral College originated as a proslavery ploy. 

A good way to summarize what actually happened inside the convention is to recall the place of slavery in the different plans that the convention considered about electing the president.  The delegates weighed three options: the president would be selected by direct popular vote, by Congress, or by electors who would be chosen either by the people or the state legislatures. Direct election failed, but not because it was intolerable to the slaveholders, as Amar maintains. It failed because it enjoyed little support in the convention, for reasons that had nothing to do with slavery. The real choice for the framers was between the congressional method and the electoral method. Both methods gave the slave states an extra measure of power in selecting the president; so once direct election was scrapped, the convention was bound to grant the slave states some sort of bonus. But this was because the great majority of the convention did not trust in the people at large to choose the president. There was no slaveholders’ ploy.

Another way of putting this is to concede that the full story of the framers, the Electoral College, and slavery shows that proslavery concerns did indeed arise at the crucial point when the convention decided to reject direct popular election of the president. But the role they played amounted to twelve words in an insignificant speech by Hugh Williamson, and, at a great stretch, some remarks by James Madison after the direct voting plan had been defeated. Beyond that, proslavery concerns had nothing to do with the convention’s debates over what became the Electoral College; two northern critics of slavery, William Paterson and James Wilson, opened the debate that led to the temporary adoption of an electoral plan; proslavery delegates resisted that plan and helped get the convention to abandon it; and when the convention finally settled the issue, it agreed at the last minute to scrap what had long been the lower South’s preferred arrangement, selection by the Congress, in favor of a system of electors. The claim that the Electoral College originated as a proslavery ploy is a myth that can be sustained only by misreading the evidence or by simplifying in order to manipulate it.

Amar offers some additional criticisms of my Op-Ed’s discussion of the effects of the Electoral College after 1787. These chiefly involve a paragraph on the election of 1800-01, which argues that the Federalists’ interference with the electoral vote in Pennsylvania offset the extra votes that Thomas Jefferson received as a result of the three-fifths compromise. My point was simply that it is badly mistaken to say that the compromise unfairly handed Jefferson the presidency. Amar interprets this as an attempt on my part “to erase the ugly fact that the South had extra seats in the Electoral College because of its slaves.” His imputation is offensive as well as mistaken. To describe how the evil of slavery prevented the outrageous theft of a presidential election is not to evade or apologize for slavery and the three-fifths clause. It is to describe a terrible irony.           

In the 21st century, the Electoral College has twice thwarted the popular will. The debate over its future cannot be helpfully advanced by distorting its complex origins, in which slavery’s role was not central but incidental.

President Donald Trump, HIV/AIDS, and Black Lesbian and Gay Activism

ACT UP Protestors in New York

 

 

In his State of the Union address on February 5th, 2019, President Donald Trump surprisingly included a plan to eliminate HIV/AIDS in his budget: “My budget will ask Democrats and Republicans to make the needed commitment to eliminate the HIV epidemic in the United States within 10 years. Together, we will defeat AIDS in America.”  The inclusion of HIV/AIDS in his address came as a surprise to many because one of President Trump’s first actions upon arriving at the White House was firing all 16 members of the Presidential Advisory Council on HIV/AIDS.

Though President Trump reinstated this council 15 months later, his initial actions were indicative of his longer record on HIV/AIDS. The AIDS Coalition to Unleash Power (ACT UP)--New York held many direct-action protests, including one at Trump Tower in October 1989. Roughly 100 protestors gathered to protest the $6.2 million in tax abatements Trump received to build the mixed-use, high-rise property at a time when those stricken with AIDS were increasingly vulnerable to homelessness. Protestors saw Trump Tower as a symbol of corporate greed and argued that state monies could have been used to build more housing facilities for those impacted by AIDS.  

Creative writer, activist, and scholar Sarah Schulman has written that the rise in sudden deaths of gay men during the early era of AIDS hastened gentrification in New York City—their absences from rent-controlled apartments and their partners’ lack of access to inheritance claims accelerated the conversion of these apartments to market-rate rents. The early AIDS crisis facilitated changes in the constitution and character of New York City neighborhoods, linking it to larger trends in gentrification that have shifted the racial demographics of inner cities from ethnically and class diverse to more homogenous, middle-class, and increasingly white enclaves. 

Trump’s plan to end AIDS within this decade also came as a surprise given his abandonment of his mentor Roy Cohn after rumors spread publicly that Cohn was dying of AIDS.  It was Cohn’s ruthless business tactics and genius maneuverings around legal loopholes that helped Trump secure the tax abatements to build Trump Tower. Cohn had cut his teeth in politics as Senator Joseph McCarthy’s chief counsel during the Army-McCarthy Hearings in 1954.  Cohn became a power broker in local New York City and federal politics, and in 1973 represented Trump when he was accused of violating the Fair Housing Act in 39 of his properties. Trump’s organization was accused of quoting different rental terms and conditions and asserting false claims of “no vacancy” to African Americans looking to rent apartments in his Brooklyn, Queens, and Staten Island properties. Under Cohn’s direction the Trumps countersued the government for $100 million for defamation, and were able to settle the lawsuit against the Trump corporation by agreeing to stipulations that would prevent further discrimination, thereby not having to admit guilt.

 

 

Trump’s record on AIDS and racial and sexual discrimination makes his 10-year plan even more surprising since the face of the U.S. AIDS epidemic is primarily black and Latina/o, especially gay, bisexual, and transgender blacks and Latina/os. In January 2019, the Black AIDS Institute (BAI), a Los Angeles-based, national HIV/AIDS think tank focused on black people, expressed their dismay when the Trump Administration proposed a change in “protected class status” under Medicare, which has allowed people living with HIV to access better medical care. In their response to his State of the Union address, BAI questioned President Trump’s intentions, since he has repeatedly sought to cut the President’s Emergency Plan for AIDS Relief, better known as PEPFAR, a multi-billion-dollar initiative which has been credited with saving 17 million lives around the world. Moreover, they indicted the President for his racist and homophobic rhetoric, which has fueled an increase in violence against black and LGBTQ communities. One of the suggestions BAI made to move Trump’s plan from words to action was to center leadership from communities most impacted by HIV.

Some of the earliest leadership from communities impacted by HIV/AIDS emerged from black lesbian and gay artists and activists during the early era of AIDS. Beginning in the late 1970s, black lesbian and gay arts and activist movements—which political scientist Cathy Cohen has identified as the first stage of AIDS prevention efforts in black communities—centered collectivity, self-determination, creativity, and radical love as central to their political practice. They saw the elimination of racism, homophobia, and economic inequality as essential to the elimination of AIDS in black communities. In 1986, the Philadelphia-based black gay journalist, creative writer, and activist Joseph Beam published the editorial “Caring for Each Other” in Black/Out magazine, the official publication of the National Coalition of Black Lesbians and Gays. The essay is a meditation on placing community responsibility ahead of reliance on the state. Beam believed that the state had never been concerned about the lives of black people. State apathy, he argued, extended to black gay men and IV drug users dying of AIDS, stating that “it would be a fatal mistake if we were to relinquish our responsibility for AIDS in the black community to such an external mechanism.”  

Indeed, Trump’s proposal to end AIDS by targeting geographic and demographic “hot spots” in seven states, 48 counties, Washington, D.C., and San Juan, Puerto Rico, comes as part of a budget plan that would eliminate funding for global AIDS programs and slash expenditures on the Centers for Disease Control and Prevention, while transferring the management of Medicaid through block grants to states, amounting to an overall cut to spending on health and human services. This plan proposes to end health inequalities at the local level while threatening to reproduce broader social inequalities at the state, national, and global levels. 

Though Trump’s plan of action challenges Beam’s narrative of state apathy by continuing the contradictory record of state action that began with President Ronald Reagan when AIDS first appeared, Beam’s caution suggests that our efforts to end HIV/AIDS in poor communities and communities of color across the globe must not depend solely on federal or state bureaucracies. Instead, this history suggests that plans to eliminate HIV/AIDS must be centered on community care and responsibility, and political action aimed at transforming the conditions of structural inequality that President Trump has perpetuated throughout his career.

Health Care for All – A Cautionary Tale from the 1970s

 

With the 2020 presidential election around the corner, both parties appear headed, once again, for a train wreck on health care.  While scores of Democrats in Congress and on the presidential campaign trail advocate a single-payer health care system for all Americans immediately, other Democrats embrace the idea of universal coverage as the ultimate goal, but believe it should be achieved incrementally.  To some this seems like a repeat of the late 1970s, when Democrats allowed the perfect to become the enemy of the good, and nothing was done on health care---for another 30 years. Meanwhile the unrelenting opposition of Republicans to the Affordable Care Act suggests that the GOP has no serious interest in offering an affordable health care plan. The voters punished them for it last year.  “Those who cannot remember the past are condemned to repeat it,” George Santayana famously said, offering an immutable truth that should be embedded in the mind of every member of Congress.  

 

Health care coverage in the United States has had a compelling but sometimes fraught history that is essential to understand before it is reconsidered. Theodore Roosevelt first proposed national health care in his 1912 platform but he lost that election. Subsequent Democratic presidents including Franklin Roosevelt, Harry Truman and John Kennedy supported the idea but it was Lyndon Johnson who achieved Medicare for seniors with the Medicare Act of 1965.  At last every American 65 and over became eligible for federal health insurance regardless of income or medical history; it also included coverage for low-income Americans in the form of Medicaid. It was a landmark achievement, made possible by a unique moment in history and the tenacity of Democratic presidents in keeping the Republican Roosevelt’s 1912 idea alive. 

 

The next Democratic president, Jimmy Carter, was in step with his predecessors as he wanted to extend health care to all Americans, but the economic conditions of that time were very different from those of 1965.  While both houses of Congress were Democratic in 1977-78, inflation was out of control and the economy as a whole was weak, straining the resources of the federal budget.  Carter had been a progressive governor of Georgia but a fiscal realist; he believed the country couldn’t afford such an enormous cost at that time without serious economic consequences.            

 

While Carter embraced universal coverage as the ultimate goal, he believed it should be achieved incrementally, not only for affordability but also for feasibility. An incremental approach, Carter contended, would aid the federal government’s ability to digest and administer such a huge and complex new system. Additionally, proposing a stepped approach would make it more likely to attract bipartisan support, which he believed was important for its long-term sustainability. 

 

Not everyone agreed. Eight years after Johnson’s Great Society was enacted, there were still pent-up demands among congressional Democrats for new federal spending.  Senator Edward M. Kennedy (D-MA) was the most vocal spokesman, and he was also, many suspected, planning to challenge Carter for the Democratic presidential nomination in 1980, using national health care as a defining issue.   

 

In 1977 Carter’s White House reached out to Kennedy to find a middle ground.  It became clear early on that there was a significant difference between the two camps. Over many months, the two parties tried to compromise, but the talks eventually faltered over the specific phasing-in of Carter’s proposal. The unbridgeable gaps were fully revealed at the final meeting between Carter, Kennedy, and their staffs in the Oval Office on July 28, 1978.  As the meeting began, Carter, according to one participant, told Kennedy, “It will doom health care if we split . . . I have no other place to turn if I can’t turn to you . . . I must emphasize fiscal responsibility if we are to have a chance.”  Kennedy left the White House and soon announced he couldn’t support whatever the Administration offered on health care and he would write his own comprehensive bill, which he unveiled on May 19, 1979.  

 

A month later, Carter delivered a message to Congress calling for catastrophic coverage for all Americans so that families who incurred severe and costly injuries or illnesses would not be financially destroyed. He also called for “comprehensive” coverage of 16 million low-income Americans (Medicaid). It was a thoughtful, generous and responsible proposal, and it won significant early support on Capitol Hill, not least because many Democrats saw it as an essential step toward universal coverage.

 

In the fall of 1978, Kennedy had addressed the Democrats’ mid-term convention in Kansas City and thrown down the gauntlet to Carter: “There are some who say we cannot afford national health insurance . . . But the truth is, we cannot afford not to have national health insurance.”  Tensions between the two men, already high, came to a boil when Kennedy formally announced his candidacy for president on Nov. 7, 1979. With no major issues dividing the candidates -- save for the timing but not the goal of universal coverage -- Kennedy’s campaign got off to a faltering start.  It was apparent he needed strong support from the more liberal trade unions, and some unions did sign on with Kennedy, including the United Auto Workers, which had been a long-time supporter of national health care. The UAW’s leadership pledged it would use its clout to see the plan enacted. Even after Carter captured sufficient delegates to win the nomination following a brutal series of primaries, the UAW would not back down from its all-or-nothing position. Neither would Kennedy. 

 

The hard-fought contest took its toll on both candidates and, tragically, on the issue of health care. In short, the dynamics of the 1980 primary campaign inevitably precluded the kind of legislative process that might have enabled universal catastrophic coverage to become law.  An important opportunity was lost; the American people would have to wait another 30 years for major health care reform. 

 

It finally arrived in 2009 when President Barack Obama unveiled the Affordable Care Act as his highest legislative priority. The ACA or, as it became known, Obamacare, bore a striking resemblance to Carter’s proposal three decades before. New to the presidency, Obama was sometimes hesitant in his leadership and failed to articulate a strong and consistent public case for his proposal, an omission that made passage more difficult. At a joint session of Congress in September 2009, the president read an endorsement from Senator Kennedy, written before Kennedy’s death the month before. Obama rallied the congressional Democrats and, with the indispensable help of Speaker Nancy Pelosi, the ACA finally became law in 2010.  It was an historic achievement, representing the most significant regulatory overhaul and expansion of coverage since 1965. 

 

With few Republicans supporting Obamacare, GOP leaders made its repeal their rallying cry for nearly a decade. Yet, they failed even when Republicans controlled both houses of Congress and the White House.  With Democrats now in control of the House of Representatives, the ACA finally appears secure--except that President Trump’s Justice Department is trying to overturn the ACA altogether.  

 

Republican control of the Senate and White House makes this an inauspicious time to attempt any major expansion of health care.  There is nonetheless an opportunity for Democrats -- and hopefully Republicans -- to prepare for the future by working together during the next two years to fix and strengthen the ACA so that it actually delivers the care it is meant to deliver.  They should also come together to significantly reduce the cost of medications, for which there is an undeniable bipartisan public mandate. Who knows where this could lead?  If led by serious people on both sides, it could yield yet more success stories like criminal justice reform and conservation of public lands.  Whatever the result, it would be better than polarized stalemate.

 

Thus, if the ultimate goal is to expand affordable health care to every American, history offers important lessons. It tells Democrats that in the next two years they must be politically savvy, and in some instances uncharacteristically restrained, if they want to be poised to offer a viable form of expanded health care in 2021. They must be honest that 2021 is the first time a plan realistically can be considered.  Before then, they must avoid the public perception of “over-reach,” a deadly political sin that exacts a price from politicians who appear to offer grand proposals that are hugely expensive, complex and unwieldy. “Medicare for All” comes to mind as something many people already see as over-reach. Voters have finely attuned antennae, and most can tell when they’re being played by a slogan.  

 

On the other hand, Americans will respond favorably to reasoned proposals even for aspirational goals, as they did in 2018. They will do so again if a plan is couched in language they can understand, such as supporting a proposal for 2020 that offers “affordable health care for every American regardless of income or existing conditions.” At the same time, liberal Democrats should resist the siren song of ideological purity and embrace instead a pragmatism that will assure ultimate success.  The run-up to 2020 will be better than the 1970s unless Democrats take their eye off the ultimate goal and again allow a deep division within the party to preclude the outcome most Americans seek.

 

As for Republicans, history tells them that if they want to help shape America’s health care of the future, they should 1) accept the legitimacy, if not every detail, of the ACA, which is, after all, a direct philosophical descendant of the thinking of the conservative Heritage Foundation, as well as the first cousin of Republican governor Mitt Romney’s plan for Massachusetts, and 2) abandon their blind opposition to any expansion of health care. They should engage in a constructive and serious conversation with Democrats so that by 2021 we will have something approaching a national consensus on how to care for our health. 

Ilhan Omar is a Reconstruction Reformer

 

I stand with Ilhan Omar. As a historian of Reconstruction, I must. 

 

Omar embodies the best of Reconstruction-era reformers. She articulates a robust and inclusive vision of civil rights. She is a vocal advocate for the dispossessed and an outspoken opponent of racism and bigotry. She opposes Donald Trump’s nativist and Islamophobic “Muslim ban” and supports paid family leave and raising the minimum wage. In fact, she even co-sponsored the “Never Forget the Heroes Bill” that would permanently authorize the September 11th Victims Compensation Fund.

 

I did not run for Congress to be silent. I did not run for Congress to sit on the sidelines. I ran because I believed it was time to restore moral clarity and courage to Congress. To fight and to defend our democracy.

— Ilhan Omar (@IlhanMN) April 13, 2019

 

That last part might come as a surprise to those who know Omar primarily from the wave of race-baiting unleashed by conservative politicians, press, and agitators. Indeed, the president himself has repeatedly tweeted lies about Omar paired with images of the 9/11 attacks, obviously designed to make Omar out to be a terrorist. 

 

WE WILL NEVER FORGET! pic.twitter.com/VxrGFRFeJM

— Donald J. Trump (@realDonaldTrump) April 12, 2019

 

But we should recall that this has been a Republican strategy for quite some time now. The Republican Party of West Virginia implied that Omar was a terrorist last month, suggesting that Americans, by electing a Muslim, had “forgotten” the 9/11 attack. Again, this wasn’t some far-Right website. It was the WV state Republican Party. 

 

Nor is Omar the first woman of color to be targeted by Trump. Last year, Trump launched similar attacks against California Congresswoman Maxine Waters. These and other racist and Islamophobic attacks on Omar and Waters have inspired death threats against both women.  

 

As a scholar of Reconstruction, this recent surge in racist propaganda has me worried. It is precisely the tactic that conservatives used to subvert Reconstruction-era reforms. They publicly targeted politicians in their newspapers and incited violence as a tool to regain political power after having been defeated during the Rebellion.

 

I recently wrote for the Journal of African American History about an eerily similar campaign of terror against Victor Eugène Macarty, an Afro-Creole politician. Like Omar, Macarty was an outspoken advocate for equality. He had attended the voting rights convention on July 30, 1866, at the Mechanics Institute in New Orleans when it was attacked by police. He escaped death by hiding under the porch while New Orleans police officers, at the head of an angry mob of whites drummed up by the local press, attacked members of the convention and mangled their corpses.

 

I became interested in Macarty while researching his time as a member of the Orleans Parish School Board as part of a project examining the impact of racial science on state institutions after slavery. But the more I read about Macarty—who was singled out by the white-supremacist New Orleans Bulletin as “extremely offensive to the white people of this city”—the more I became intrigued by his story. During an era when the white press was reluctant even to print the names of African Americans, the Anglo papers in New Orleans routinely targeted Macarty, almost begging readers to attack him. They did.

 

After he confronted a white woman fired from her teaching position for supporting the White League—a white supremacist terrorist organization—the Bulletin repeatedly called for Macarty’s head. When the woman’s brothers attacked and left him for dead on September 16, 1875, the paper cheered the outcome and warned that the other Black school board members should “rememb[er] the fate of Macarty.” His attackers pleaded guilty and were “sentenced to each pay a fine of Ten Cents or one minute in the Parish Prison.” The court system in New Orleans functioned as an institution of racial control, letting Macarty’s attackers off the hook while signaling to African Americans that they would find no justice before the law. The continued media campaign and threats against Macarty played an outsized role in his political life and eventually led him to leave the city.

 

Macarty was not alone as a victim of media-initiated racist attacks. The white press regularly named targets for white vigilantism. White elites pioneered this form of racist terrorism after emancipation as a means of controlling African Americans and subverting working-class politics.

 

The consequences of the media campaign against Macarty should give us pause as the president and large portions of our national media engage in blatant race-baiting against Ilhan Omar and Maxine Waters. Indeed, it is hardly a coincidence that following this highly public, racist coverage, both Omar and Waters received death threats. As an activist and citizen, I find it terrifying to see the resurgence of this Reconstruction-era tactic of racial oppression today.

 

What frustrates me as a scholar is that we’ve created a historiographic landscape in which African American contributions to American history are overlooked. We too often take a teleological approach to Reconstruction and spend too little time allowing ourselves to be surprised by the profound commitment to equality made by many of the era’s reformers. This act of intentional mis-remembering strengthens the foundation of white supremacy in our country. As we’re seeing right now, that’s incredibly dangerous. 

 

Macarty was a revolutionary figure about whom little was known until my recent article, despite his having brought the first lawsuit against segregated seating in federal court in 1869. In fact, the same few lines had been written and rewritten about Macarty since James Trotter’s 1880 Music and Some Highly Musical People, published the year before Macarty’s death. 

 

We need to better remember the stories of African American reformers and visionaries to counterbalance a field that remains plagued by Lost Cause categories, periodization, and imagery. We need to know more about those who led prior movements for equality. We need to celebrate their martyrs and understand the cause we inherit from them. And perhaps most crucially at this moment, we must become intensely aware of the tactics that their white supremacist opponents used to subvert equality.

 

Biography helps us accomplish these ends and we should pursue it vigorously and unapologetically. My friends and family are consistently surprised when they learn about my research into Macarty and his contemporaries. This should not be the case, at least not if we hope to live in a society that values justice and equality.  

 

Biography is a key pillar of historical instruction from grade school through high school. It helps students recognize themselves in historical figures large and small. Well-executed biographies allow them to better understand the debates of the past and relate them to those of the present. They also enable students to approach the past with humility and to see that our forebears grappled with many of the same issues we face today. This is one of the central “lessons of history” and among the most important that we can offer. 

 

Further, biographical approaches to historical actors not only show African American resistance to white supremacy, but also avoid flattening African Americans into vehicles of resistance. Indeed, the view that African American liberty implies a rejection of (white) authority is a core belief of white supremacists. By telling the stories of African American men and women as whole persons, we can combat this racist lie.

 

In researching Macarty, I realized the need for more African American biographies in Louisiana and, I suspect, throughout the 19th-century U.S. At least in south Louisiana, I came across many prominent African Americans about whom little or nothing is known. Take T.M.J. Clark, who, after having been enslaved, taught himself to read and became the president of the State Insane Asylum. Or John Gair, who helped write the Louisiana Constitution of 1868 and survived numerous threats and an assassination attempt before being gunned down while in police custody in 1875. Our histories have either completely ignored these radicals or, in cases where they’ve been mentioned in passing, gotten them almost entirely wrong.

 

Moreover, like Macarty, Gair and Clark were subjected to race-baiting coverage in the media that effectively ended their careers. The white press slandered and vilified both men and each of them suffered brutal attacks by white supremacist vigilantes. Like Macarty, Gair and Clark demanded equality. It was the cause for which Gair was martyred and Clark forced to flee for his life, a permanent exile from his hometown.

 

This wave of media-inspired white supremacist violence effectively ended Reconstruction. No one was ever held accountable for the massacre of voting rights activists in New Orleans in 1866. Macarty’s attackers, after nearly beating him to death, faced no consequences. And though Gair was assassinated while in police custody in 1875, none of his attackers were ever charged. It was this failure to hold the race-baiting press, politicians, and vigilantes responsible that undermined any semblance of equality for more than 100 years. 

 

Politicians like Macarty, Gair, and Clark took incredible risks and made enormous sacrifices to fight for equality 150 years ago. Their contemporaries failed to hold their attackers responsible. We cannot make that same mistake.

 

 

Oh, What a Beautiful Piece of American History

 

Oklahoma!, one of the great musicals of show business history, often loses its own history amid all of those gorgeous Richard Rodgers and Oscar Hammerstein songs. The play is a straightforward, and yet very complex, story of ranch hands and their women on farms in the bustling Oklahoma territory in 1906, just before Oklahoma became the 46th state. The simplicity and beauty of that life is the basis for the marvelous new and different version of the play that opened last week in New York at the Circle in the Square Theater at 1633 Broadway.

The play starts with ranch hand Curly, played superbly by the multi-talented Damon Daunno. Curly, a star among the cowboys of the Oklahoma territory, is desperately infatuated with farm girl Laurey. He stands up and, with a gorgeous voice, sings one of the signature songs in the musical, Oh, What A Beautiful Morning. It kicks off a play that is full of new romances, busted romances, patched-up romances, a lot of violence, dark conversations, threats and a wild and wooly battle for the middle of America in a very divided country (sound familiar?). It is the men vs. the women, the good vs. the bad and the cowboys vs. the ranchers, all scrambling for a piece of the Oklahoma territory just after the turn of the century, in 1906, and all of the promises and dreams within it.

This new version is pretty much the same as all the other plays and movies (the 1955 film version won three Oscars) and yet, at the same time, it is distinctly different. The others were grand sprawling sagas with lots of props, such as the time-honored surrey with the fringe on top, farmhouses and barns. There are none of them in this new play, majestically directed by Daniel Fish. All the director gives the audience here is an empty stage with chairs, some spectators on the periphery, a small orchestra (all happily wearing cowboy boots) placed carefully in a shallow pit and that luscious music that drifts through the air and soothes the hearts of everyone in the theater.

The story (Hammerstein also wrote the book) develops nicely. Curly wants to take Laurey to the local dance but she has already promised to go with Jud Fry, a menacing, malevolent cowboy whom nobody likes. She only did it, she tells friends, to spite Curly. This sets off a battle between Curly, Jud and Laurey, in addition to the fight between cowboy Will Parker and traveling salesman Ali Hakim for the hand of the boisterous cowgirl Ado Annie. There is a lot of back and forth and the plot is told with the wonderful songs as well as dialogue. Those tunes include Oh, What a Beautiful Morning, The Surrey with the Fringe on Top, People Will Say We’re in Love, Kansas City, I Can’t Say No, and the rousing, burn-down-the-barn title song, Oklahoma!

Even though this is a barebones show, it has some marvelous special effects. At one point, Curly and Jud are arguing over Laurey with some pretty dangerous and threatening dialogue. Curly even suggests that Jud hang himself. The whole scene is presented in the dark, so that you hear only the voices of the two men. Part of that confrontation is a huge, haunting, slightly out-of-focus film of Jud talking. It fills the stage wall.

Many of the conversations in the story are done with dark lighting and stirring music to add a sense of foreboding to the drama. There is some gunplay, pretty authentic for the era. An anti-gun theme is evident around the walls of the theater, where over a hundred rifles stand in wall racks, ready to be fired at any moment if there is trouble somewhere in the territory of Oklahoma.

The story of the land and the people battling over it, the tale of yet another new frontier in U.S. history, is absorbing, and it is the same story that played out in every other U.S. territory, whether Arizona, Alaska or Oklahoma. The play tells the tale of an America that, out there in the cornfields, is bursting at the seams. And, at the same time, it tells the story of Oklahoma, ranchers, cowboys and city folk.

In the play you learn about all the hard work the cowmen and ranchers put into making their ranches successful, the social customs of Oklahoma and the Midwest in 1906, the dances, the dating, the generational battles, and the marvel of country folks at city folks, told so well in the tune Kansas City.

Amid all of this history is the story of the young people, helped and guided by the older ones, as they try to find their place in Oklahoma, America, and the world. It is a nicely told saga, carried along by all of those memorable tunes.

Stetsons off to director Fish for not just re-staging, but re-inventing this classic musical. He used all of his genius to create a sensational new play out of an equally sensational old one. He gets significant help from a gifted group of actors, including Daunno as Curly; Mary Testa as Aunt Eller, who holds the chaotic life of the prairie together through all of its storms; Rebecca Naomi Jones, a fine singer and whirling dervish of a dancer, as Laurey; James Davis as the stoic, hunkered-down Will Parker; Ali Stroker as his beloved girlfriend Ado Annie; Patrick Vail as the villain Jud Fry; Anthony Cason as Cord Elam; and Will Brill as salesman Ali Hakim.

The play started its journey in 1931 as Lynn Riggs’s drama Green Grow the Lilacs. It wound up with Rodgers and Hammerstein, who in 1943 made it into the first of their many shows together. In 1944 it won a Pulitzer Prize. The play was a huge commercial hit and ran on Broadway for nearly seven years. Revivals over the years have won numerous Tony Awards. The 1955 movie, starring Gordon MacRae, Shirley Jones and Rod Steiger, garnered three Oscars.

The folks connected to the original play really should have taken some time to give people in the audience a little history about the sprawling, ever-green and inviting Oklahoma that was so central to the show. The big push for statehood started with the 1889 Oklahoma Land Rush, in which 50,000 energetic settlers raced across the territory’s plains in wagons, carriages and on horseback to claim two million acres of free land, a race into history sanctioned by the U.S. government as a way to populate the huge piece of Midwestern landscape. As the new settlers developed it, the need for statehood grew. Ironically, after the success of the play, the state of Oklahoma named the title song of the musical as its official state song.

I’m sure they voted for it on a beautiful morning at the start of a beautiful day.

PRODUCTION: The play is produced by Level Forward, Eva Price, Abigail Disney, and others. Scenic Design: Lara Jellinek, Costumes: Terese Wadden, Lighting: Scott Zielinski, Sound: Drew Levy, Choreography: John Heginbotham. The play is directed by Daniel Fish. It has an open-ended run.

   

When Women Ran Hollywood

 

Hold on. When did women—who produced only 18 percent of the 100 top-grossing movies of 2018, whose screenplays constituted a mere 15 percent, and who directed a microscopic 4 percent—ever run Hollywood?

 

Here’s how I found out about this little-known history. While researching a novel set in 1919 about vaudeville, the live variety shows that were America’s favorite form of entertainment at the time, I learned its demise was caused in part by the growing success of silent movies. The obvious question was, what could make silent movies—with their melodrama, bad acting and, you know, silence—more desirable than a live performance of, for instance, a regurgitator who could swallow and then upchuck items in the order the audience determined? (Audiences loved regurgitation, by the way; also sword swallowing and fire breathing. In addition to a wide variety of acts, there was a lot of ingesting stuff that one just shouldn’t.)

 

What was so great about silent movies? I soon found myself wandering down a succession of internet rabbit holes. (When it comes to research, most writers can get so rabbit-y we practically sprout long floppy ears.) First, I sampled a few films and found that they were more complex, better acted, and more creatively filmed than I’d expected. Mary Pickford’s movies or Gloria Swanson’s, for instance, are surprisingly subtle.

 

But far more fascinating was the fact that women were a driving force in early filmmaking. Up until about 1925, “flickers” weren’t considered terribly respectable, so if you could get a real job, you avoided a career based on these flights of fancy. Conversely, if you were shut out of most employment because of, say, your gender, Hollywood beckoned. 

 

Consider the following:

  • Women worked in almost every conceivable position in the industry, from “plasterer molder” (set construction) to producer.
  • There were a few popular actors, but actresses were the stars.
  • In 1916 the highest salaried director was female.
  • In 1922 approximately 40 production companies were headed by women.
  • An estimated half of all screenplays produced before 1925 were written by women. 
  • For over twenty years, the most sought after and highest paid screenwriter was female. 

 

Why had I never heard of any of this? I was familiar with directors D. W. Griffith and Cecil B. DeMille, producers Sam Goldwyn and Jack Warner, writer-director-actor Charlie Chaplin, but had heard of almost none of the following sample of brilliant and powerful women.

 

Studio Chiefs

Alice Guy-Blaché began as a secretary for a French motion picture camera company. Fascinated by the medium’s possibilities, in 1896, she asked her boss if she could make a short story film—the first ever!—to promote the camera. He agreed as long as she didn’t shirk her secretarial duties. By the time she moved to America in 1907, she had produced 400 such films. She founded a new studio, Solax, served as president, producer, and chief director, and produced 300 more films by the end of her career. Her feature-length films were quite sophisticated, focusing on subjects of social import such as marriage and gender identity.

 

Mary Pickford, best known as “America’s Sweetheart,” was the most successful and highest paid actor of her time. She was also a shrewd businesswoman. Along with D. W. Griffith, Douglas Fairbanks, and Charlie Chaplin, she co-founded United Artists in 1919, and was arguably the most financially astute among them. Chaplin recalled that at a meeting to form the studio, “She knew all the nomenclature: the amortizations and the deferred stocks, etc. She understood all the articles of incorporation, the legal discrepancy on Page 7, Paragraph A, Article 27, and coolly referred to the overlap and contradiction in Paragraph D, Article 24.”

 

Screenwriters

Known for her sharp wit and snappy dialogue, Anita Loos’ career as a screenwriter, playwright, and novelist spanned from 1912 to the late 1950s. Douglas Fairbanks, as much an athlete as an actor, relied upon her to accelerate his career and devise an ever-expanding list of “spots from which Doug could jump.” Most famously, she adapted her bestselling novel Gentlemen Prefer Blondes as a silent film in 1928, which was the basis for the 1953 version starring Marilyn Monroe.

 

Frances Marion is hard to top even by today’s standards of output and success. Until 1935, well after women’s influence in Hollywood had waned, she remained the most sought-after and highest-paid screenwriter in America, male or female. She acted, directed, produced, and is the only woman to have won two Academy Awards for Best Original Screenplay. She was mega-star Mary Pickford’s preferred writer (and best friend) and saved many careers in the tumultuous years when the industry was converting to sound. Above all, Frances was a generous collaborator, hosting famous “hen parties” at her house as a sort of support group for Hollywood’s female filmmakers.

 

Directors

Lois Weber was also a studio head, producer, screenwriter, and actress, but as a director she was as well-known as D. W. Griffith and Cecil B. DeMille. In 1916 she was the highest-paid director, male or female, earning an unprecedented five thousand dollars a week. She was the first woman member of the Motion Picture Directors Association, with 138 films to her name. Her films were often morality plays on social issues such as birth control, drug addiction, and urban poverty, particularly as these affected the plight of working-class women.

 

Dorothy Arzner, the most prolific American female director of all time, started in the scenario department in 1919 typing up scripts. In the fluid Hollywood work environment, she quickly progressed to cutting, editing, and writing, and by 1927 had directed her first film. With the advent of sound, she invented the boom mic to allow actors to move about the set without bumping into sound equipment. Arzner was gay and fairly open about her personal life, wearing men’s clothing, and living with choreographer Marion Morgan for 40 years. Despite her gender and orientation, she was able to work steadily as a director until she retired in 1943.

 

Renowned early film historian Anthony Slide has said, “Women directors were considered equal to, if not better than, their male colleagues.”

 

Actresses

Florence Lawrence is credited with being the world’s first movie star. In 1908 she was making 100 flickers a year with D. W. Griffith at Biograph, the world’s top studio at the time. However, she was known only as “the Biograph Girl” because the studio didn’t want to increase her burgeoning fame, and thus her ability to demand a higher salary, by naming her. She moved to the upstart IMP studio and was involved in perhaps the first wide-scale publicity stunt. The studio quietly fed the papers a story that she’d been killed by a streetcar, then took out ads declaring “WE NAIL A LIE,” claiming that other studios were trying to ruin her career. Fans went crazy for the story, and a public appearance shortly thereafter resulted in mayhem as a huge throng rushed her, pulling buttons from her coat and the hat from her head.

 

Mabel Normand was a brilliant comic actress, starring in approximately 200 films, most at Mack Sennett’s Keystone studio, and was the first actor to be named in a film’s title (e.g., Mabel’s Lovers in 1912). She also directed many of her own films, including those in which she was featured with a young Charlie Chaplin. Though Chaplin erroneously claimed directorship of several of them, Mack Sennett said that Chaplin “learned [to direct] from Mabel Normand.”

 

Every one of these women was a multi-talented powerhouse, committed to the success of her films, the industry, and other female filmmakers. And for each of those named above there were many, many more.

 

Unfortunately, with a few notable exceptions their careers were generally over by the end of the 1920s. As Hollywood historian Cari Beauchamp said, “Once talkies arrived, in the late 20s, budgets soon tripled, Wall Street invested heavily, and moviemaking became an industry. Men muscled into high-paying positions, and women were sidelined to the point where, by the 1950s, speakers at Directors Guild meetings began their comments with ‘Gentlemen and Miss Lupino,’ as Ida Lupino was their only female member.”

 

Their names may no longer be widely recognizable, but these were among the many women who built and ran early Hollywood, shaped the industry in myriad ways, and influenced what we see on the silver screen even today.

 

Will women ever “run” Hollywood again—or even advance to relatively equal numbers as studio heads, producers, directors, and screenwriters? That remains to be seen, of course. But powerful leaders like executive producer, showrunner, and director Shonda Rhimes; Amazon Studios head Jennifer Salke; Disney TV Studios and ABC Entertainment chair Dana Walden; producer-director Ava DuVernay; and director Patty Jenkins, among many others, offer hope. 

 

“Demanding what you deserve can feel like a radical act,” Rhimes has said. Radical, perhaps, but not new. All it would take is a return to the good old days of early Hollywood.

 

Trump’s War on Civil Rights and Beyond: A Conversation with Acclaimed Political Analyst and Civil Rights Historian Juan Williams

 

 

Republican presidential candidate Donald Trump urged black voters to ditch the Democratic Party and “try Trump” at a campaign rally on August 19, 2016, in the predominantly white suburb of Dimondale, Michigan. He said of black Americans: "You're living in poverty. Your schools are no good. You have no jobs. Fifty-eight percent of your youth is unemployed.” Trump then asked, “What the hell do you have to lose?"

            

As it turned out, African Americans—among others—are losing a great deal under President Trump, as acclaimed commentator, journalist and historian Juan Williams argues in his timely and illuminating new book, “What the Hell Do You Have to Lose?”: Trump’s War on Civil Rights (Public Affairs). 

 

Mr. Williams contends that Trump’s now infamous campaign speech and other statements on race have conveniently ignored African American history and progress in the decades since the passage of the 1964 Civil Rights Act and the 1965 Voting Rights Act. He denounces the president’s ingrained tendency to intentionally distort history to fuel racial tensions for his political advantage.

 

In “What the Hell Do You Have to Lose?” Mr. Williams deftly weaves the remarkable story of the struggle for civil rights into his account of how the Trump Administration has been bent on turning back the clock and undoing or threatening advances in voting rights, school integration, equal employment, fair housing, and other areas. He describes the unprecedented threat to civil rights under Trump as he chronicles the president’s personal and family history of discriminating against people based on race and his record of hostility to African Americans, including President Barack Obama.

 

In describing the losses for African Americans under Trump, Mr. Williams also provides glimpses from the struggles of heroic pioneers who fought for civil rights and for a better life for all Americans. He shares the stories of activists such as Bob Moses of the Student Nonviolent Coordinating Committee, who braved the violent Jim Crow South to register African American voters; James Meredith, a US Air Force veteran who became the first black student to enter the University of Mississippi in 1962 in the wake of bloody riots at “Ole Miss”; A. Philip Randolph, a union leader who made strides for equal employment rights in the Jim Crow era; and Robert Weaver, who championed fair housing programs and became the first black cabinet secretary as the head of the Department of Housing and Urban Development. 

 

Mr. Williams takes pains to explore the past in the belief that knowledge of history is the key to understanding the present and to shaping the future as he explains how the principles of equality, tolerance, and justice today are at stake for all citizens.

 

Mr. Williams is an award-winning journalist, political analyst and historian who has covered American politics for four decades. He has written several other books, including Eyes on the Prize: America’s Civil Rights Years 1954-1965; Thurgood Marshall: American Revolutionary; This Far by Faith: Stories from the African American Religious Experience; My Soul Looks Back in Wonder: Voices of the Civil Rights Experience; and Enough. His articles have appeared in the New York Times Sunday Magazine, Time, Newsweek, Fortune, The Atlantic Monthly, Ebony, Gentlemen’s Quarterly, and The New Republic. Mr. Williams is currently a columnist for The Hill, and was a longtime correspondent for The Washington Post and NPR. He also cohosts the Fox News Channel’s debate show The Five, and appears on other Fox shows where he regularly challenges the orthodoxy of the network’s right-wing stalwarts. 

Mr. Williams generously spoke by telephone about his new book, his work, and his commitment to sharing historical context when discussing current events. Following our conversation, he added this opening update for readers on his historical perspective and recent events.

 

Juan Williams: I want to thank Robin for the opportunity to talk to history lovers on the History News Network. When I wrote “What the Hell Do You Have to Lose?”: Trump’s War on Civil Rights, my goal was to answer the question that then presidential candidate Donald Trump posed to Black America: ‘What do we have to lose from a president who doesn’t care about African Americans?’

 

My book dissects Trump’s unprecedented assault on everything America has achieved over the last half century to move forward on race relations--from voting rights to integrated schools to equal opportunity in employment and fair housing. These changes were achieved by people who made sacrifices: they put themselves at risk of being expelled from school, of losing jobs and mortgages, and of constant threats of violence, and some even faced death.

 

I tell stories of these courageous civil rights heroes so that we can better understand that progress came at great cost. Starting from that baseline helps the reader to understand how much the nation has gained, and how much we have to lose from Trump’s effort to return to the past or, in his infamous words, “Make America Great Again.”

 

Since I finished writing What the Hell Do You Have to Lose? in 2018, very little has changed. The president continues to tell lies about blacks, Latinos, and immigrants. He makes racial minorities and immigrants out to be a threat to America; we become the enemy, all lumped together as barbarians who commit crimes, take advantage of social programs, and abuse affirmative action policies.

 

These lies are aimed at the ears of white America at a time when pollsters report that large numbers of older whites are anxious about the growing number of black and brown people, and immigrants of all colors, in the USA.

 

Trump’s most frequent refrain is that life is better for minorities with him as president. He dismisses talk about increasing racism and anti-Semitism as overwrought. Even FBI reports on the increase in hate crimes since he has been president are waved away as liberal nonsense. Instead, he frequently tells interviewers, for example, that the black unemployment rate is currently “the lowest in the history of the country.”

 

This is a distortion.

 

First, black unemployment under Trump has never reached its lowest point in history. Though it did hit 5.9 percent last May, Labor Department data indicates that black unemployment dropped to 4.5 percent in 1953. According to the Washington Post Fact-Checker, this distortion was worth giving Trump three out of four Pinocchios for his unfounded claim.

 

In addition, the president fails to mention that black unemployment has been increasing. As recently as February 2019 it reached 7 percent. And throughout, black unemployment has remained more than double white unemployment.

 

Unfortunately, these are the kinds of distractions from the truth about race relations that Americans--black and brown Americans in particular--have come to expect from our president.

 

He’s a man who couldn’t condemn the unique horrors of white supremacy that resulted in the death of a woman in Charlottesville last summer.

 

Trump won’t talk about the white supremacy that led to the death of Heather Heyer in Charlottesville, eleven Jews in Pittsburgh, and fifty Muslims in New Zealand. But he couldn’t be happier to talk about Congresswoman Ilhan Omar, whose recent treatment by Trump and the Republican Party has less to do with condemning anti-Semitism than with a political ploy to silence an immigrant, black and Muslim woman who dares to wear a hijab in Congress and speak her mind about controversial subjects.

 

He’s a man who, hours after it came out that a white supremacist in New Zealand slaughtered fifty Muslims during their Friday Prayers, said that white nationalism was “not really” a major threat, even as the killer’s manifesto described Trump’s 2016 victory as “a symbol of renewed white identity and common purpose.”

 

Indeed, even Chicago Mayor Rahm Emanuel, who condemned the courts for dropping the charges against disgraced actor Jussie Smollett, slammed Trump for speaking on the issue, ordering him to “stay out” because “the only reason Jussie Smollett thought he could get away with this hoax is because of the environment President Trump created.”

 

Previous Republican administrations made good-faith efforts to improve relations with African Americans.

 

Presidents Reagan and Bush made a point of speaking at the NAACP, seeking out advice from prominent black intellectuals, and appointing African Americans to the highest positions in government. And under President Obama black and white members of both parties were willing to start having the messy, yet necessary conversations about issues that continue to prevent us from moving forward on race as a nation.

 

On the other hand, President Trump has just one African American in his Cabinet. Despite agreeing to some criminal justice reform measures, Trump has failed to deal with issues of police brutality that have led to persistent tensions with black America and the creation of the Black Lives Matter movement. Instead, he ran a campaign, and now a government, fueled largely by white American fears that the country is being stolen from them by ungrateful African Americans, undocumented immigrants, and radical Muslim terrorists.

 

According to Trump, the problem is not the harsh, unfair reality of high levels of segregation in neighborhoods, schools, and jobs. The problem in his eyes is a football player, Colin Kaepernick, kneeling in protest during the playing of the national anthem.

 

Trump is also quick to anger when prominent black people challenge his policies. He will go out of his way to tongue-lash black critics, including insulting LeBron James, Steph Curry, Jay-Z and other black celebrities. He regularly disparages black women in Congress who disagree with his policies. 

 

To get away from the day-to-day static around Trump’s mishandling of racial issues, the American people need to know about the civil rights heroes like Bob Moses, James Meredith, A. Philip Randolph, and so many others, because we need to understand how much blood, sweat, and tears it took to create the thriving Black America of today and protect us from those who, like President Trump, couldn’t care less.

 

 

Robin Lindley: Congratulations Mr. Williams on your powerful new book on Trump’s war on civil rights. You take pains to weave history into your reporting, and you are a historian in your own right with your acclaimed books such as Eyes on the Prize, a study of the Civil Rights Movement, and your renowned biography of Justice Thurgood Marshall. In your new book you share the story of civil rights advances that are now threatened under Trump. Your efforts as a journalist and historian are refreshing in this era of fake news. 

 

Juan Williams: I love history. I find it eye-opening because it tells me so much about the present and allows me a structure for thinking about the future. For me, history has always been a revelation. Even when I was a child, when I learned about the past, I thought, Oh, my goodness. Who knew?

 

Robin Lindley: Did you have training in history when you went to school or did history just naturally come into your writing when you were a reporter?

 

Juan Williams: No, my love of history is an extension of my interest in the news, a fascination I had from my days as an immigrant child in a city with close to a dozen newspapers, New York. I found newspapers and daily journalism on radio and television to be a reason to look into history. The rest of the story, the back story if you will, was the history of the characters and events, and the ideas that animated the politics of the day. I would see something that happened in a prior period in American life and I would go to the library in New York City, where I grew up, and I’d read a book to investigate the story and to understand how we came to the point where we were then and how that article that I was reading in fact was representative of a larger and longer vein of history.

 

Robin Lindley: There's a new twist in the news every day concerning our history, and particularly about race. Attorney and Trump “fixer” Michael Cohen called Trump a racist, a con man, and a cheat at a public Congressional hearing. I don't think that was news to many of us, including the Republicans on the committee. You certainly delve into the history of Mr. Trump's racial insensitivity as well as his lack of historical knowledge as he attempts to erase the past.

 

Juan Williams: I write of the reality of the sacrifices, even people giving their lives, to accomplish racial justice in this country. I’m not suggesting this book is a complete telling of the civil rights movement; I structured the book to include the history as an introduction to the background for young people and a reminder for people who may have forgotten the past. My premise is that we have traveled such a distance on race, going back to our origins as a nation with legal slavery and then government-enforced segregation that extended well into the 20th century.

 

I had written some of that story in my first book, Eyes on the Prize. More of it is in my second book, a biography of former Supreme Court Justice Thurgood Marshall.

 

That brings me to this book and why I was offended by Trump telling white audiences that black people had nothing to lose by voting for him. The quote, "What the hell do you have to lose?" came from him during the 2016 election campaign. He argued to whites that these black people live in such bad neighborhoods in terms of the violence and crime, with bad schools and a lack of jobs. And, some white person driving through a troubled black neighborhood might say, "Well, it looks like he has a point."

 

There's so much missing context in terms of that distorted picture of black American life. First, no fake news, just the facts: The nation’s black population is doing better than ever before by so many measures in terms of income, education, business ownership, occupying political office, and the like. I could go on. But Trump doesn’t seem to plug into that part of the story. Instead, he takes a perverse delight in poverty and crime among blacks, Latinos and immigrants. Again, this is why I think the history is so important. 

 

The history of progress for American minorities is needed to inform someone hearing Trump’s indictment so they are not fooled. With history in mind they will know what the hell striving minorities in this country have to overcome and a history lover knows how far minorities and immigrants have come despite those obstacles.

 

That indictment of black people by Trump is undermined by the history of all the struggle and sacrifices made to bring black people to this point. And also, it opens eyes to the idea that the African American community is not all poor and poorly educated. In fact, black America in 2019 is at historic heights in terms of income and education. Almost 40 percent of black people earn between $35,000 and $100,000 per year. Another 11 percent earn between $100,000 and $200,000. So that's half of the black population living in the American middle class. And then you have the reality of black executives who have led very successful American companies like Time-Warner, Merrill Lynch, American Express, and Xerox. 

 

Those stories of black achievement are not part of Trump telling whites that blacks have nothing to lose. An informed listener will know they are being misled by Trump because they know the history of black trailblazers, beating the odds to make new paths in American society, a society that not only enslaved black people but legally segregated them and still discriminates against them.