History News Network - Front Page
Mon, 03 Aug 2020 08:48:57 +0000
https://m.www.hnn.us/site/feed

The Mississippi Flag and the Shadow of Lynching



The Mississippi flag, which has now seen its inglorious end, first flew its Confederate design in 1894, a busy time in the south for mythmaking about white supremacy.  


The year 1894 was also near the peak of one of the most evil and grotesque practices in American history: lynching. During a 14-year stretch from 1886 to 1900, more than 2,500 people, mainly Black men, were tortured and killed, mostly by southern white individuals and mobs that faced almost no consequences. The tortures were horrific; mobs burned some men alive, others castrated them.  One woman in Texas was “boxed up in a barrel with nails driven through the sides and rolled down a hill until she was dead,” wrote Ida B. Wells, the leading anti-lynching crusader of the time. 


As lynching rose, the white south was busy creating monuments and flags that asserted white supremacy and white innocence.  The myth of black culpability in the crimes against them was so pervasive that nearly everyone, including Frederick Douglass, believed it, according to the autobiography of Ida B. Wells.  If the great Douglass believed the myths, you can rest assured that they were almost universally shared. Even Ida B. Wells believed them, until a mob lynched three Black grocery store owners in 1892 near her home in Memphis. 


Thomas Moss, who Wells knew well—she was his daughter’s godmother—was one of the three Black owners of the People’s Grocery Company.  A competition and tension grew between that store and a white-owned one.  The People’s Grocery workers were increasingly harassed by their white neighbors and scuffles broke out, culminating in a mob of whites surrounding the store and shooting at it.  Someone inside the store shot back and three whites were injured.  


The three Black men and a few others were arrested. Egged on by the white-owned media, a lynch mob stormed the jail and lynched the three shopkeepers. Moss, understanding that Memphis was no longer safe for his people, made one last request to his killers: “Tell my people to go West,” he said. “There is no justice for them here.” The lynching and these prophetic words led many Black Memphians to leave the city in 1892.


The lies told about Black men were repeated over and over across the south and dutifully reprinted in the northern press.  Black men were lynched, the myth went, because they were raping white women. “The crime for which negroes have frequently been lynched and occasionally been put to death with frightful tortures,” wrote The New York Times in an 1894 editorial, “is a crime to which negroes are particularly prone.”


Wells, through her investigations, discovered a very different reality that can be summarized by two key findings.  First, rape was not usually the stated cause, and when it was, it was often not charged until after the lynching had occurred. Second, when an actual relationship between a Black man and a white woman existed, it was generally a consensual one. 


“Nobody in this section believes the old thread-bare lie that Negro men assault white women,” wrote Wells in an unsigned editorial in her newspaper, The Free Speech and Headlight.  “If Southern white men are not careful they will over-reach themselves and a conclusion will be reached which will be very damaging to the moral reputation of their women.” 


“The black wretch who had written that foul lie should be tied to a stake,” wrote The Memphis Commercial Appeal, “a pair of tailor’s shears used on him and he should then be burned at a stake.” A mob destroyed the newspaper’s presses and Wells fled the city.


As the Mississippi flag was going up in 1894, Ida B. Wells was working as a one-woman force to change the narrative. She had spent years traveling around the south investigating lynching, and had narrowly escaped death herself. In 1894, she was wrapping up her speaking tours. “I found myself physically and financially bankrupt,” she wrote.   


Wells had concluded, backed up by evidence, that Black lawlessness was a myth.  Instead, Wells wrote, the real reason behind lynching was white terrorism due to economic competition, just like in the case of Thomas Moss and his grocery store.  The sacking of Tulsa’s Black Wall Street years later in 1921 can be seen as economics-driven terrorism too. 


At this point in the story, it would be satisfying to hear that Ida B. Wells convinced the nation and changed the narrative.  But this would be only half true. The Black press told the story and the truth about lynching spread widely among people of color. 


But the white press doubled down on the lies. In 1894, the year that the Mississippi myth-making flag was going up, The New York Times called Wells “a slanderous and nasty-minded mulattress, who does not scruple to represent the victims of black brutes in the South as willing victims.”  Politicians, newspapers, and historians painted the picture of a heroic south needing to rein in lawless Blacks, a racist view echoed by the film Birth of a Nation in 1915, and persisting even in our own time, when Trayvon Martin, Eric Garner and other victims of violence are demonized.  


But in 2020 the Mississippi flag is coming down and so are the myths. Nikole Hannah-Jones won a Pulitzer Prize for her central essay in the 1619 Project, which seeks to change the national narrative on race.  Ida B. Wells herself won a posthumous Pulitzer Prize this year, more than a century overdue, but better late than never.  Lynching helped to raise the odious flag in 1894.  But in 2020, hundreds of thousands of marchers protesting the lynching of George Floyd brought the flag down.  Maybe, just maybe, this will be remembered as an era of change. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176621
Did the Atomic Bomb End the Pacific War? – Part I

Many historians and most lay people still believe the atomic destruction of Hiroshima and Nagasaki ended the Pacific War.


They claim with varying intensity that the Japanese regime surrendered unconditionally in response to the nuclear attack; that the bomb saved a million or more American servicemen; that Hiroshima and Nagasaki were chosen chiefly for their value as military targets; and that the use of the weapon was, according to a post-war propaganda campaign aimed at soothing American consciences, ‘our least abhorrent choice’.


The trouble is, not one of these claims is true.


That such denial of the facts has been allowed to persist for 75 years, that so many people believe this ‘revisionist’ line - revisionist because it was concocted after the war as a post-facto justification for the bomb – demonstrates the power of a government-sponsored rewrite of history over the minds of academics, journalists, citizens and presidents.


The uranium bomb dropped on Hiroshima, code-named ‘Little Boy’, landed on the city center, exploding above the main hospital and wiping out dozens of schools, killing 75,000 people, including tens of thousands of school children.


‘Fat Man’, the plutonium bomb used on Nagasaki, incinerated the largest Catholic community in Japan, obliterating the country’s biggest cathedral along with a residential district packed with schools and hospitals. It missed its original target, the city center.


Zealous apologists for the bomb will have started picking holes: Hiroshima held troops? Yes, a few enfeebled battalions. Hiroshima had military factories? Most were on the outskirts of town, well clear of the bomb.


Nagasaki hosted a torpedo factory and shipyards? Yes. The factory was deep underground and untouched by the weapon; the bomb missed the shipyards, which were not functioning in any case.


Of the five intact cities set aside for nuclear destruction by the Target Committee – a secret group of US military and scientific personnel – only Kokura contained a large weapons arsenal. In any event, bad weather diverted the second atomic run from Kokura to Nagasaki.


And yet, it mattered little to the Target Committee if the targeted city held civilians or soldiers, arms-makers or sushi restaurants, kimono-clad women or children. The ideal city should, according to the committee’s Minutes, “possess sentimental value to the Japanese so its destruction would ‘adversely affect’ the will of the people to continue the war; … and be mostly intact, to demonstrate the awesome destructive power of an atomic bomb.”


Kyoto met those criteria but was grudgingly struck off the list for aesthetic reasons: War Secretary Henry Stimson had visited the beautiful heart of Japanese culture with his wife in 1926, and insisted on preserving it. Tokyo was rubble, so there was no point in “making the rubble bounce,” to appropriate Winston Churchill’s famous remark about the nuclear arms race.


In other words, the target should show off the awesome power of the bomb not only to the six leaders who ruled Japan from a bunker under the ruins of the Imperial Palace in Tokyo but also – and just as importantly - to Joseph Stalin, whose massed forces were being deployed to the border of Japanese-occupied Manchuria. Stalin himself was aching to be “in at the kill,” to seize a communist foothold in Asia. The old Bolshevik fancied Hokkaido.


In this light, the use of the atomic weapon must be seen as a continuation and a start: the nuclear continuation of the conventional terror bombing of Japanese civilians, and the start of a new “cold war” waged by a superpower equipped with a weapon that would, as James Byrnes said on May 28, 1945, a few weeks before he was appointed US Secretary of State, “make Russia more manageable” in Asia.




Let us revisit the scene of the world back then; let us try, briefly, to unravel the confluence of events that led to the use of the weapons.


By the start of 1945, Japan had lost the racial war it had started in the Pacific. Allied – chiefly US – military power had utterly defeated it. In fact, the Japanese had lost the war as early as the Battle of Midway, fought between June 4th and 7th, 1942, when US forces destroyed the bulk of the Japanese navy – “the most stunning and decisive blow in the history of naval warfare,” as historian John Keegan described it – rendering Japan incapable of mounting another major offensive.


By July 1945 Japan possessed about 3,000 fighter planes and 1,500 bombers, but few functioning airfields. They lacked sufficient ammunition for their remaining artillery, machine guns and rifles. They had no effective navy (their most lethal sea weapon being some 2400 “suicide boats”: little high-speed craft used to ram the bellies of enemy ships). Untrained kamikaze pilots still dared to take off in planes made partly of wood, so dire were supplies of steel. Indeed, Japan was desperately short of all commodities, chiefly fuel, food and steel. The people were compelled to hand over any household steel items to be melted down for ammunition. Most civilians were malnourished or slowly starving.


No doubt Japan could still draw on a large pool of men: some 65 divisions had returned home earlier in the year, and every one of the 350,000 troops (about 900,000, if you include support units, teenage soldiers and troops with little training) assigned to the defense of Kyushu was determined to honor the fierce exhortation of the Bushido military code: “To die!”


Yet Japanese “spirit” on the ground meant little without any effective air defense: the Pacific War was won in the air, and by mid-1945, American aircraft carriers and warships ringed the Japanese archipelago in an impenetrable blockade, and US aircraft were in complete control of the skies over Japan.


By then, 67 major Japanese cities (including Tokyo) lay in ruins, the result of General Curtis LeMay’s terror firebombing campaign, in which millions of incendiary (proto-napalm) canisters created huge firestorms that tore through Japan’s papyrus homes like a bushfire in hell – killing at least 100,000 civilians in Tokyo in a single night, on March 9-10, the deadliest bombing raid in history.


LeMay’s goal was the same as Allied terror-bombing of Germany: to break civilian morale. It failed: the Japanese and German people hardened in response to terror bombing - as had the British, of course, during the Blitz, offering empirical evidence of civilian mental toughness the Allies failed to heed.


Yet, if the destruction of most of Japan’s cities was not enough to make them surrender, throughout July 1945 Admiral William Halsey’s Third Fleet was busy finishing off what LeMay’s napalm sorties had failed to destroy or even target: Japan’s remaining infrastructure, such as airfields, 12 giant coal transports, and the naval base at Kure.


Something else sustained the Japanese that defies easy explanation to westerners: the ‘divine’ presence in their midst, in the form of Emperor Hirohito, the ‘Sacred Crane’ who, the people believed, was descended from the Sun Goddess Amaterasu, and whose presence fortified their extraordinary psychological resilience. The western, Christian equivalent would be the return of the Messiah during total war.




Nowhere was the deference to Hirohito so palpable, so forceful, so weighed down by the dreadful burden of history, as in the concrete bunker under the ashes of Tokyo, where Japan’s War Council of six old Samurai rulers refused to utter three magic words: “We surrender - unconditionally.” Somehow those words had to be extracted from their mouths like an especially stubborn tooth.


Three hardliners – War Minister Korechika Anami, Army Chief of Staff General Yoshijiro Umezu and Navy Chief of Staff Suemu Toyoda – dominated the Six, and pressed every Japanese to fight to the death - commit, in effect, national “seppuku,” or ritual suicide - to defend the Emperor and the homeland.


Three moderates – Prime Minister Kantaro Suzuki, Foreign Minister Shigenori Togo, and Navy Minister Admiral Mitsumasa Yonai - wavered and vacillated, by turns secretly pursuing peace and openly supporting war. 


Through 1945, as their country crumbled before their eyes, the Big Six continued to press for a conditional peace – unacceptable to the Allies - that would at least deliver Japan’s chief condition: the preservation of the life of Hirohito and the Imperial dynasty.


For the Japanese regime, Hirohito’s life was non-negotiable – a condition of surrender that Japan experts in Washington, notably Joseph Grew, US ambassador to Japan from 1932 to 1941, well understood and about which they had warned the Truman administration. To it, Tokyo would stick to the bitter end: no Japanese leader could, or would, bear responsibility for serving up the Emperor to the Americans to be tried and hanged as a war criminal.


In Hirohito’s name, then, the Japanese regime would refuse to surrender, and nothing, not even the annihilation of the Japanese people, would deflect these grim old men from saving their divine monarch, a minimum condition for peace.


That day of reckoning was fast approaching. The Japanese regime was expecting and preparing for a US land invasion. The hardliners, Anami, Umezu and Toyoda, welcomed this prospect: every Japanese must prepare to martyr themselves in defense of the homeland. There was method in this madness: from the depths of their delusion the Japanese hawks believed high American casualties would compel the US to sue for a negotiated, conditional peace.




Meanwhile, in Washington, President Harry Truman was determined to avoid a land invasion, despite the advanced planning for “Operation Downfall,” the two-pronged attack on the Japanese homeland, at Kyushu and Tokyo Bay.


The appalling casualties of Okinawa (April 1-June 22), the bloodiest battle in the Pacific, in which an estimated 12,520 Americans were killed in action and up to 55,000 wounded, preyed on Truman’s mind.


With such terrible premonitions, Truman called a critical meeting of the Joint Chiefs of Staff on June 18, 1945, to discuss the invasion plan – a month before the atomic bomb was scheduled to be tested in the New Mexico desert.


The Joint Chiefs were asked to estimate likely American losses – dead, missing and wounded – in a land invasion. General George Marshall calculated that during the first 30 days, casualties “should not exceed the price we have paid for Luzon,” where 31,000 were killed, wounded or missing (compared with 42,000 American casualties within a month of the Normandy landings).


Several caveats qualified this low “body count”: the invasion of Kyushu and Tokyo Bay would take even longer than the allocated 90 days, and the figures did not include naval losses, which had been extremely heavy at Okinawa. Nor did the meeting reckon on the unknown menace of Japanese civilians, all of whom were expected to fight to the death armed with bamboo spears and knives or whatever weapons they could find.


The Joint Chiefs agreed on the politically palatable figure of 31,000 battle casualties in the first month, implying about 10,000 killed in action. Other estimates placed the figure far higher: Admiral Chester Nimitz reckoned on 49,000 dead and wounded in the first 30 days; Admiral William Leahy predicted a 35% casualty rate, implying 268,000 killed and wounded in total. Major General Charles Willoughby, General Douglas MacArthur’s Intelligence Chief, and no stranger to hyperbole, warned of between 210,000 and 280,000 battle casualties in the first push into Kyushu. At the extreme end, some feared half a million dead and wounded.


That the estimated casualties of a land invasion ranged from tens of thousands to half a million should have sounded alarm bells: nobody really knew. In any case, Marshall insisted “it was wrong to give any estimate in number” (after the war he privately offered Truman “as much as a million” as the likely casualty number).


To put these figures in context: the American combat force slated to invade Japan numbered 766,700. So it was an obvious fiction – or, if the Joint Chiefs actually believed it, a dismal reflection of their faith in the quality of the American soldier – to claim after the war that Japan’s ailing divisions (only half of whom were sufficiently supplied with ammunition) and “home guard” - mostly civilians carrying knives and bamboo spears - would have wiped out the entire US invasion force.


In short, in June 1945 nobody seriously believed casualties of an invasion would be a million or several million. So the claims promoted after the war, and repeated ad nauseam to this day, that the atomic bomb avoided a land invasion and “saved up to a million American troops” were grotesque fictions, used as post-facto justifications for the weapon in the face of mounting ethical objections to its use.


The crucial question, however, is what impact these shocking figures had on Truman’s mind. Winding up the 18 June meeting, the president asked the Joint Chiefs: so the invasion would be “another Okinawa closer to Japan?” They nodded. And the Kyushu landing – was it “the best solution under the circumstances?” the President wondered. “It was,” the Chiefs replied.


Truman was unpersuaded, and after deep consultation, energised by the prospect of Russia joining the Pacific war, he decided in early July to shelve – i.e. postpone, if not actually cancel – the invasion plan, two weeks before the atomic “gadget” was due to be tested in New Mexico.


Why risk thousands of American lives attacking a defeated nation? Why grant the old Samurai their dying wish, to martyr themselves and their people? Why not involve the Russians or use the US blockade to force Japan to surrender? Those questions fairly reflected Truman’s thinking at the time, and reflect the fact that he was determined to avoid a land invasion.


In this light, it was never a question for Truman of either the bomb or an invasion: the bomb hadn’t been tested. It was a question of: why invade Japan at all?




Fast forward to “Trinity,” the atomic bomb test conducted on July 16th in the Jornada del Muerto desert, 35 miles south of Socorro, New Mexico. Its success fulfilled the wildest dreams of the Manhattan Project, the secret organization charged with building the weapon.


The first man-made nuclear explosion detonated at 5:29 that morning. Radiation waves fled the bomb casing at the speed of light. Billions of neutrons liberated billions more in conditions that “briefly resembled the state of the universe moments after its first primordial explosion,” wrote one scientist. A bell-shaped fireball rose from the earth, whose “warm brilliant yellow light” enveloped physicist Ernest Lawrence as he stepped from his car. It was “as brilliant as the sun ... boiling and swirling into the heavens” – about a kilometer and a half in diameter at its base, turning from orange to purple as it gained height.


The nuclear dawn was visible in Santa Fe, 400 kilometers away. A partially blind woman later claimed to have seen the light. The blast was variously compared to “Doomsday” and “a vision from the Book of Revelation,” inspiring the scientific leader of the Manhattan Project, Robert Oppenheimer, to summon a line from the mystical Hindu text, the Bhagavad Gita - “Now I am become Death, the destroyer of worlds” – after which he strutted around like a cowboy who had just acquired the fastest weapon in the west.


The successful test certainly gave Truman the biggest weapon in the West - and a great boost to his confidence before the coming Potsdam conference, convened in late July to carve up the post-war world and to send an ultimatum to Japan to surrender.


The resulting Potsdam Declaration (or Proclamation), signed on July 26th by the United States, Britain and China, ordered Japan to surrender unconditionally or face “prompt and utter destruction.”


The nature of that destruction, by an atomic weapon, was not revealed to the Japanese or, ominously, the Russians. Stalin knew of the weapon’s development through his spies in Los Alamos, and drew his own conclusions.


But something else set Stalin’s rage boiling: Russia had not been invited to sign the Potsdam ultimatum to Japan. He had been pointedly ignored.




On 27th July, Japan’s Big Six read the Potsdam ultimatum. The three “moderates,” Suzuki, Togo and Yonai, noted with relief that the Soviet Union was not a signatory.


Why had Russia been excluded? Russia was then a US ally and Stalin “a disgusting murderer temporarily on our side,” as George Orwell had described the Soviet dictator. Why not use Russia’s name to help end the war, as Truman had earlier that month intended?


For one thing, Truman was now armed with a nuclear weapon, and the president understandably felt Russia’s help might no longer be needed to force Japan to surrender.


For another, James Byrnes, the US secretary of state and master political manipulator, had persuaded Truman to strike Russia’s name from the ultimatum. Byrnes himself had put a line through the Soviet Union on one draft, signed the amendment “JB” and added the word: DESTROY. The clerk responsible failed to heed Byrnes’ wishes; I found a copy of this remarkable document in a box in the Truman Presidential Library in 2009.


By persuading Truman to remove Russia as a joint-signatory on the ultimatum, Byrnes effectively prolonged the war because, at a single stroke – surely the deadliest pen stroke in history – he deleted one of the greatest incentives for Japan’s surrender (avoiding a communist invasion) and reassured the Japanese leaders that Stalin remained neutral.


Byrnes thus handed Tokyo’s hardliners a powerful justification to continue the war effort. The US Secretary of State’s motives were threefold: to buy time for the bomb to complete its journey across the Pacific; to deny Stalin a claim on the spoils of victory; and to give America a crack at using the bomb and emerging as sole Pacific victor. In short, nuclear power was now guiding US strategy, not combat troops on the ground or Russia’s support.


In the event, Byrnes’ delaying tactics worked: the Big Six dared to hope that Russia remained neutral - as agreed under the Russo-Japanese Neutrality Pact. And so Tokyo’s fantasies were allowed to persist: they would continue to press Moscow to mediate a conditional peace with America – which Stalin had no intention of offering – even as he accelerated the mass deployment of his forces to the border with Japanese-occupied territory.




There was an olive branch in the Potsdam ultimatum, which the moderates seized on. One clause appeared to offer the Japanese people, of their “freely expressed will,” the chance to choose their post-war government. That implied the retention of the Imperial system, or at least the emperor as figurehead. 


Yet it was wide open to interpretation, and the three hardliners (Anami, Toyoda and Umezu) drew the darkest interpretation of another clause, which insisted that “the authority and influence of those who have deceived and misled the people of Japan into embarking on world conquest must be eliminated for all time.”


In their eyes, this meant the Emperor, and his probable execution as a war criminal – tantamount to the destruction of the soul of Nippon.


The Potsdam ultimatum must therefore be firmly rejected, they concluded. To surrender the national godhead would condemn them forever as the most reviled figures in Japanese history. The hawks prevailed: none of the Big Six were willing to sign a paper they interpreted as the Emperor’s death warrant.


And so, on 28 July Prime Minister Suzuki was persuaded to officially “mokusatsu” – or “kill [the Potsdam ultimatum] with silence” – a Japanese negotiating tactic that treated an offense with silent contempt.


Prime Minister Suzuki obliged at once: “The government does not think that [the Potsdam statement] has serious value,” he told the Japanese press. “We will do our utmost to fight the war to the bitter end.”


Like monks cloistered with their myths, the Big Six resolved to fight on, locked in the fantasy of Soviet-sponsored peace negotiations from which Japan would emerge with “honor” intact, oblivious to the fact that, in the eyes of the world, the Japanese regime had nothing left to negotiate - and much less honor.




On the morning of 6th August, as a bomber called the Enola Gay flew towards Hiroshima with an atom bomb in its belly, the Japanese leaders were still waiting, and hoping, for a Soviet reply to their peace feelers.



Part II of this essay will appear next week on HNN. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176631
Learning from Lincoln: Meeting Crisis with Action



In the spring of 1861, the United States was on the verge of becoming a failed state. All that was needed was Lincoln’s recognition of the existence of the Confederacy. In writing a history of how a strategically placed minority of slaveholders maneuvered eleven states out of the Union and created the short-lived Confederate States of America, I gave little thought to the possibility that my America might soon find itself on the brink of a comparable crisis of governance and national purpose. 

Yet, here we are. The nation is staggering from the shattering effects of the Coronavirus pandemic and its response is among the worst in the world. As mass protests on behalf of racial justice rock the nation’s cities, the president and federal law enforcement agencies respond with a show of brutal force that frightfully resembles the strong-arm tactics of Nazi Brownshirts in the 1930s. As unemployment soars to depression-era levels, the toxic partisanship in Congress stymies any consistent policy of national aid and relief.

In retrospect, the combination of malevolence and paralysis that has characterized our national leadership should come as no surprise. Beginning in the late 1970s and accelerating after 9/11, the means of effective governance have been systematically undermined by tax cuts that further enrich the few at the expense of the many, deregulatory policies that give free rein to corporations regardless of the consequences to the environment or the public good, and the privatization of healthcare, prisons, education, infrastructure, pensions, low-income housing, and social services. Only a free market economy directed by individuals pursuing their own self-interest unfettered by government interference can be trusted, we’ve been told, with producing the greatest good for the greatest number.

The result has been an economy hobbled for decades by deindustrialization, dead-end jobs in the midst of endemic underemployment, rising levels of linked racial and economic inequality, and smoldering alienation and resentment by those who once knew or dared to hope for better.

Lincoln met the crisis of secession by demonstrating that the United States indeed had a government whose claims to national sovereignty received broad public support. Working hand in hand with Congress and with support from pro-war Democrats, Lincoln oversaw a restructuring of the federal government to meet the demands of the war. In a burst of path-breaking legislation unmatched until the New Deal, Congress created the first national currency, fashioned a new banking system around nationally chartered banks, imposed the first national taxes on individual incomes and businesses, provided for land grant universities to advance agricultural education, and made good on homestead legislation. Federal expenditures exceeded all the costs of running the government since its inception in 1789 down to the outbreak of the Civil War. Business interests certainly received their share of federal subsidies and then some, but the reform program of the Republicans was broad and generous enough to garner the support of the party’s core constituencies. More significantly, the Republicans’ campaign for emancipation, however belated, set the stage for a commitment to racial equality written into the Constitution that would have been unthinkable before the Civil War.

As Lincoln recognized, the Union’s cause was the cause of Western liberalism. Carried forward by a new middle class and workers seeking basic democratic rights of suffrage and political participation, liberalizing currents swept Europe into the revolutions of 1848 against the hierarchical old order of the landed aristocracy. Though suppressed, the revolutionaries held fast to their liberal agenda. This was the international context of rising liberalism the Confederacy sought to reverse with its bid for a reactionary slaveholding republic ruled by a landed elite. Its defeat in the Civil War was a setback for conservative regimes in Europe intent on further repression at home and the imposition of new imperial regimes in the Americas, such as Emperor Napoleon III’s designs on Mexico and Spain’s on Santo Domingo.

The challenge facing democratic governments today is even more daunting than that confronted by Lincoln. To date, tightly regulated, centralized regimes in the East led by China, Singapore, and Vietnam have been far more successful in dealing with the Coronavirus crisis and in providing social programs insulating their populations from the worst effects of the crisis. More open nations in the East like Japan, Taiwan, and South Korea have also coped relatively well with the assistance of mandatory controls on individual behavior and programs of public education and health services.  Lagging far behind are the flagship nations of Western liberalism and individualism, the United States and the United Kingdom.  In the increasingly contentious battle for global leadership between China and the United States, China’s model, for all its repressive features, is pulling ahead as a governing system for other nations to follow. 

At risk is the very survival of democratic governance and free market capitalism as the exemplar of how progress and human rights are to be achieved, a legacy already weakened by the rise of authoritarian populism. Much more is needed than pouring money into the struggle to maintain or regain military, technological, and economic superiority and to throw a lifeline to the most disadvantaged. Such measures will be at best stopgap palliatives without a fundamental redefinition of just what constitutes national security and a reckoning with the poisonous legacies of slavery, systemic racism, and grotesque levels of wealth inequality. The United States is at a crossroads. One path leads to closed borders, bludgeoning protesters, and repeating the same policies of the past that have worsened the problems of today. The other leads to major cultural and economic shifts brought about by a fundamental re-examination of who we are as a people and what legitimate demands for social and economic justice we have a right as citizens to make on our government. The path chosen will determine whether contemporary America resumes its role as a beacon of hope and progress to the rest of the world or joins the Confederate slaveholders of the past among history’s losers.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176681
Free Speech and Civic Virtue between "Fake News" and "Wokeness"

Alexander Meiklejohn, Time, 1928



Harper’s Magazine recently published an open letter defending public inquiry and debate against “severe retribution [for] perceived transgressions of speech and thought.” The letter refers to “an intolerant climate that has set in on all sides,” and while it mentions Donald Trump and authoritarian regimes, its real target seems to be an increasingly illiberal segment of the American Left.


Critics have cried foul. Some reactions have exhibited exactly the tendencies that the letter describes, but others thoughtfully critique power imbalances between the celebrity signers and members of various marginalized groups, or question the motives or consistency of those signers who (in their eyes) have not supported free speech in the past. Some even claim the mantle of liberalism for themselves, arguing that public pressure to suppress unpopular opinions is itself a form of collective speech or association, the very “marketplace of ideas” that many liberals celebrate.


Unfortunately, none of these arguments reaches past adversarial notions of democracy. They all characterize free speech as a matter of conflicting rights-claims and competing factions. Indeed, some critics of the Harper’s letter seem eager to reduce all public debate to a form of power politics. Trans activist Julia Serano merely punctuates the tendency when she writes that calls for free speech represent a “misconception that we, as a society, are all in the midst of some grand rational debate, and that marginalized people simply need to properly plea our case for acceptance, and once we do, reason-minded people everywhere will eventually come around. This notion is utterly ludicrous.” As long as political polarization precludes rational consensus, she argues, we are left to “[make] personal choices and pronouncements regarding what we are willing (or unwilling) to tolerate, in an attempt to slightly nudge the world in our preferred direction.” Notably, she makes no mention of how we might discern the validity of those preferences or how we might arbitrate between them in cases of conflict.


To paraphrase the philosopher Alexander Meiklejohn, one could say that critics of the Harper’s letter take the “bad man” as their unit of analysis. By their lights, all participants in public debate are prejudiced, particular, and self-interested, “idiots” in the classical sense of the word. This is what allows many of the critics to assert a moral equivalence between free speech and the suppression of ideas. Free speech advocates are hypocritical or ignore some extenuating context, they claim, while those stifling disagreeable or offensive views are merely rectifying past injustices or paying their opponents back in kind, operating practically in a flawed public sphere.


It is telling, however, that the letter’s critics focus on speakers and what they deserve to say far more than the listening public and what we deserve to hear. Indeed, their arguments seem to deny the very existence of a public (at least in a unitary sense), and that is their fundamental shortcoming.


In Free Speech and Its Relation to Self-Government (1948), Meiklejohn challenges us to approach public discourse from the perspective of the “good man”: that is to say, the virtuous citizen. For Meiklejohn, if political debate is nothing but an exercise of “self-preference and force,” oriented toward no “ends outside ourselves,” we have already lost sight of its purpose (pp. 78-79). One cannot appreciate the freedom of speech, he writes, unless one sees it as an act of collective deliberation, carried out by “a man who, in his political activities, is not merely fighting for what…he can get, but is eagerly and generously serving the common welfare” (p. 77). Free speech is not only about discovering truth, or encouraging ethical individualism, or protecting minority opinions—liberals’ usual lines of defense—it is ultimately about binding our fate to others’ by “sharing” the truth with our fellow citizens (p. 89).


Sharing truth requires mutual respect and a jealous defense of intellectual freedom, so that “no idea, no opinion, no doubt, no belief, no counter belief, no relevant information” is withheld from the electorate. For their part, voters must judge these arguments individually, through introspection, virtue, and meditation on the common good. 


The “marketplace of ideas” is dangerous because it relieves citizens of exactly these duties. As Meiklejohn writes:


As separate thinkers, we have no obligation to test our thinking, to make sure that it is worthy of a citizen who is one of the ‘rulers of the nation.’ That testing is to be done, we believe, not by us, but by ‘the competition of the market.’ Each one of us, therefore, feels free to think as he pleases, to believe whatever will serve his own private interests. We think, not as members of the body politic, of ‘We, the People of the United States,’ but as farmers, as trade-union workers, as employers, as investors.…Our aim is to ‘make a case,’ to win a fight, to make our plea plausible, to keep the pressure on (pp. 86-87).


Of course, this is precisely the sort of self-interested posturing that many on the Left resent in their opponents, but which they now propose to embrace as their own, casually accepting the notion that their fellow citizens are incapable of exercising public reason or considering alternative viewpoints with honesty, bravery, humility, and compassion. 


As at many points in our history, the United States today has its share of hucksters, conspiracy theorists, and hate-mongers. It would be a mistake, however, to conceive of either our democratic practice or ideals as if these voices were central or unanswerable. In practice, curtailing public speech is likely to worsen polarization and further empower dominant cultural interests. As an ideal (or a lack thereof), it undermines the intelligibility and mutual respect that form the very basis of citizenship.


The philosopher Agnes Callard points out that political polarization has induced Americans to abandon “truth-directed methods of persuasion”—such as argumentation and evidence—for a form of non-rational “messaging,” in which “every speech act is classified as friend or foe… and in which very little faith exists as to the rational faculties of those being spoken to.” “In such a context,” she writes, “even the cry for ‘free speech’ invites a nonliteral interpretation, as being nothing but the most efficient way for its advocates to acquire or consolidate power.” Segments of the Right have pushed this sort of political messaging to its cynical extremes—taking Donald Trump’s statements “seriously but not literally” or taking antagonistic positions simply to “own the libs.” Yet none of that justifies the countervailing impulse to cloak messaging in sincerity of conviction.


While the language of the Harper’s letter has been criticized as vague, the responses to it could not be clearer. Critics have endorsed an adversarial politics based on the “marketplace of ideas” and selective elements of liberalism. Signers have put forward robust liberal principles that align with republican virtues. When they warn that restriction of debate “makes everyone less capable of democratic participation,” they affirm civic equality and public duties. When they refuse “any false choice between justice and freedom,” they underscore that freedom is hardly “intellectual license” (p. 87) but a shared commitment to realizing our society’s ideals. Rather than assuming the supremacy of our own opinions or aspersing the motives of those with whom we disagree, our duty as Americans is to think with, learn from, and correct each other. If we are to decide what we are willing or unwilling to tolerate, we must do so with individual integrity and collective concern.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176633
Better Than Silence: The Need for Memorials to the Manhattan Project




Seventy-five years ago, the United States detonated the first atomic bomb in Alamogordo, New Mexico. Given the enormity of that event, and the ensuing swift conclusion of World War II, it would be reasonable to expect a substantial physical manifestation of the place’s importance. A park, perhaps, or a museum. 


Instead the site of the Trinity Test is closed. Yes, it was designated a National Historic Landmark in 1975. But people can only visit on two days a year -- one in spring and one in fall. Visitors may drive to the site to see the nearly barren landscape and the distant hills that, before dawn on July 16, 1945, were illuminated brighter than one hundred suns. A black stone marker stands about twelve feet tall with its dull brass plaque, surrounded by security fences and a vast expanse of sky. That is all. 


When the bomb detonated, atop its 100-foot steel tower, it heated the sandy soil below into glass. The tower itself was vaporized. Should you find a piece of the glass, keeping it is against the law. No artifacts, please.


If Alamogordo’s minimal memorial indicates America’s uncertain relationship with the bomb’s history, then the city two hundred and fifty miles away where the weapon was built reveals an even deeper ambivalence. In a time when monuments are under nationwide debate, Los Alamos may set the standard for what not to do.

Aug. 11 will mark the third anniversary of the “Unite the Right” rally in Charlottesville, VA. Its purported aim was to prevent removal of a statue of Confederate Gen. Robert E. Lee. The real intent surfaced when the rally turned violent, with many people injured and one woman killed.


Since then the debate over monuments has intensified. Cities as different as Lexington, KY, Baltimore and New Orleans removed statues of Confederate leaders. Protestors weighed in too, for example tearing down the statue of Confederate president Jefferson Davis in Richmond. According to the Southern Poverty Law Center, 114 Confederate statues have come down since Charlottesville. 


In other words, America is having a heated argument about how to represent its past. From Christopher Columbus forward, some public historic symbols have become controversial. The bomb is no exception.  


Granted, this nation is fully capable of memorializing painful or complex times. A visitor to Pearl Harbor National Memorial can stand at the window, and see the USS Arizona sitting right there on the bottom. The most casual tourist at Gettysburg can easily find a pamphlet detailing how many boys died in Pickett’s Charge – on both sides. Even above the beaches of Normandy, there is no flinching from the brutal reality of what liberating France required. The cost of valor is tallied in tombstones. 


Los Alamos is not the same. I went there to do research for my book Universe of Two. All those years after the Trinity test, I had not expected a stone wall. 

The book is a novel, loosely based on the life of Harvard-trained mathematician Charles B. Fisk. He worked on the Manhattan Project, first at the Metallurgical Laboratory at the University of Chicago, and then in Los Alamos. After the war he received a full scholarship from Stanford University to get a PhD in physics.


Apparently the program’s emphasis on developing new bombs diminished his enthusiasm. Fisk dropped out after less than a semester, taking a part-time job with a company that repaired church organs. When he died in 1983, Fisk was considered one of the greatest cathedral organ builders ever. The company he founded continues to make premium instruments for colleges, universities and churches around the world. 


I learned none of this from the two paid historians on staff at the Los Alamos National Laboratory. All they would do was confirm that Fisk had worked there, on the detonator team.


When I reached Los Alamos, I found a similar reluctance to disclose. Such fundamental tools as maps – of the work sites, of the barracks’ locations -- were not available. I tried the Historical Society, which directed me to the National Parks office, which was closed in the middle of the day – for several days.


I pressed on. A geologist whose job was to remediate radioactive areas loaned me maps he’d used to organize the various extraction sites. They were helpful for understanding the scale of the Project’s operation, but couldn’t tell me where Fisk might have lived. Ultimately I relied on the hand-drawn maps in self-published memoirs by the wives of Manhattan Project scientists – hardly an exacting record.


To be fair, Los Alamos is home to the Bradbury Science Museum, a clean place with enthusiastic displays. But it lacks even one mote of dust about the consciences of those who built the bomb, who questioned its moral fitness. Hundreds of project scientists repeatedly petitioned the War Department, the State Department and the president, arguing that the bomb should be demonstrated, not used on civilians. You won’t find that information in the exhibits. If you search the museum’s digital document archive under the word petition, there are zero matches.


Los Alamos’ ambivalence about its history is perhaps most physically expressed at Fuller Lodge. In the 1930s, when the area was home to a rough-riding boys’ school, the lodge’s large wooden edifice, with a stone patio facing Ashley Pond, served as the campus center. After the U.S. government acquired the school and surrounding lands, Fuller Lodge continued to be the hub of activity: dances, speeches, concerts, dinners, plays. 


But for those who notice subtleties, the history represented there is selective. Fuller Lodge bears a brass plaque declaring it to be on the National Historic Registry. Inside, Native American rugs hang on the walls. Local potters’ work sits in glass cabinets. 


But the photos on the walls are revelatory in what they exclude: Here are schoolboys playing basketball. Here’s a graduating class. If there is a picture of Manhattan Project director Robert Oppenheimer, or any of the scientists who labored to build the bomb, it hangs somewhere out of sight. 


Is this place special because it was a school? Or because thousands of scientists, engineers and technicians made incredible leaps in knowledge and technology there, harnessing the power of the atom as a tool of war that would change international relations forever? According to the photos, the school’s history wins. 


For as long as mankind has warred, new technologies have been put swiftly to use: the trebuchet, gunpowder, the airplane. But each of these tools, in general, was aimed to defeat enemy combatants -- with the intention of sparing innocent civilians. Atomic bombs make no such distinction. They kill everyone within reach, the criminal and the child, the murderer and the monk. Creating the bomb was both a milestone achievement, and a profound expansion of the limits of warfare. 


That is not to say that memorials about complex matters cannot exist. The Vietnam Memorial in Washington, DC is a stunning example, simultaneously questioning the war and honoring those who made the ultimate sacrifice. Likewise the 168 empty chairs in the plaza beside the Oklahoma City bombing memorial both symbolize the lives that were lost, and bear mute witness to the crime committed there. The endlessly falling water in the Ground Zero monument in New York City represents the countless tears shed by friends and families and a nation for people whose only error was in going to work on Sept. 11, 2001.


What might a proper memorial for the creation of the atomic bomb look like? If Japan can build the Hiroshima Peace Memorial, the artists and architectural geniuses of this nation can find a suitable answer. Both triumph and tragedy deserve a permanent place in the public sphere. Almost anything is better than silence. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176679
Conventional Culture in the Third Reich

Wilhelm Furtwängler conducts the Berlin Philharmonic in a Berlin Factory, March 1942. 

© Deutsches Historisches Museum, Berlin.



Thinking of culture in the Third Reich conjures up images of mass rituals, swastika flags, and grandiose buildings. Makers of television documentaries and designers of book covers (admittedly including that of my own new synthesis) tend to look for visual material that is instantly recognizable as Nazi. However unconsciously, this reflects the ambition of the Third Reich’s leaders to bolster their rule through a clear cultural profile – an ambition that was only partially fulfilled. No one would doubt that public architecture by Albert Speer or the Nuremberg Party Rallies, enhanced by Speer’s light installations and prominently filmed by Leni Riefenstahl, mattered a great deal. But in other realms, a distinctive cultural profile proved far more elusive. 


The careers of extreme-right composers, playwrights, and film directors often stalled owing to their cantankerous personalities, limited popular appeal, or works that were deemed too shocking for a wider public (such as antisemitic dramas featuring rape scenes). Others fell short of Adolf Hitler’s standards, which were as high as they were vague. In January 1936, his faithful propaganda minister Joseph Goebbels noted impatiently: “We don’t have the people, the experts, the Nazi artists. But they must emerge in time.” Behind the scenes, Hitler was unhappy with the heroically proportioned bodies, monumental landscapes, and idealized peasant scenes on display at the 1937 “Great German Art” exhibition in Munich. His opening speech consequently dwelled on nineteenth-century Romanticism and the nefarious influence of Jewish art dealers rather than elaborating on what “true new German art” was supposed to entail.


Should the Third Reich’s efforts at transforming German culture thus be regarded as a failure? This would be to distort the picture, for much of what was performed, printed, or exhibited after 1933 was not, and did not aim to be, specifically Nazi. As early as the 1920s, Hitler and his followers had posed equally as bold innovators and as staunch defenders of a tradition that was supposedly under threat from cosmopolitan Jews and left-wing modernists. Such overlaps between extremist and conservative beliefs increased their support among those sections of the German middle class that upheld nineteenth-century cultural tastes. During the Third Reich, this ensured a sense of continuity for a public that appreciated conservative interpretations of Beethoven’s symphonies, Schiller’s plays, and Wagner’s operas. In turn, many a theatre actor, orchestra musician, or opera singer benefitted from the generous flow of direct subsidies and the activities of the leisure organization Strength through Joy, which arranged tens of thousands of special performances.


Popular culture during the Third Reich had a similarly conventional outlook. Most of the costume dramas and screwball comedies shown in cinemas evinced few, if any, traces of Nazi ideology. Germans consumed them as harmless entertainment, much like they did with low-brow novels and the light music that predominated on the radio channels. While they favored domestic offerings, before World War II they did not have to feel cut off from international developments. Walt Disney’s Mickey Mouse series and Margaret Mitchell’s novel Gone with the Wind were widely popular. The Third Reich’s own movie stars included the Hungarian Marika Rökk and the Swede Zarah Leander alongside domestic idols such as the cheerful comedian Heinz Rühmann and the ruggedly masculine Hans Albers.


If so much of culture in the Third Reich was conventional rather than identifiable as Nazi, then where does its political significance lie? Obfuscation is an important part of the answer. When seeing a trivial comedy or listening to a nineteenth-century symphony, few appear to have thought about those cultural practitioners who were defined as Jewish and were consequently eliminated from movie casts and symphony orchestras. Audiences were well aware that the fringes of culture had changed, in favor of pseudo-Germanic plays and paintings and to the detriment of the ambiguity that had been at the heart of Weimar’s most fascinating art, music, and literature. But the ready availability of conventional fare made it easier not to care, in Germany as well as abroad: When the antifascist and modernist Kurt Weill performed a composition in Paris based on texts by Bertolt Brecht, the audience reaction was negative, in stark contrast to the enthusiastic welcome which the French capital gave to Wilhelm Furtwängler, the conductor of the Berlin Philharmonic and one of the Third Reich’s cultural figureheads.


Beyond obfuscation, conventional culture in the Third Reich stood out for the ways in which it was marshalled politically. During World War II, when Germany occupied much of Europe, it promoted its own film industry by excluding Hollywood imports and those prewar movies in which Jews had had any involvement. While the occupiers paid respect to the culture of France, they despised that of Poland, citing even the most mediocre theatrical or musical performance as evidence of German superiority. The Nazi grandees were special not in their appreciation of Western European art from the Middle Ages to the nineteenth century but for their habit of looting widely and unashamedly, thereby treating museums and private collections in the occupied countries as personal hunting grounds.


All this backfired, inasmuch as the allies became increasingly disinclined to distinguish between ‘German’ and ‘Nazi’ culture. Nowhere did this become more apparent than in the British and American bombing campaigns that were inflicted on major city centers with their time-honored churches and town halls. This allowed the Nazi leaders to declare themselves the defenders of German culture against a lethal threat from the outside. After the Third Reich’s demise, conventionality once again ensured continuity. Now that the war was over, American, British, and Soviet occupiers gave ample room to an established version of German culture, out of long-standing respect and in an effort to win over a defeated people. To Germans, attending a conservatively interpreted Beethoven symphony or seeing an entertaining movie seemed apolitical. At the same time, it allowed them to preserve a sense of national identity in a situation of national disempowerment. The fact that specifically Nazi elements were marginal to the post-1945 cultural landscape made it all the easier to dissociate oneself from the Third Reich – a necessary step, but also a self-exculpatory one.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176674
The Battle of the Atlantic Has Lessons for Fighting COVID-19

Dixie Arrow after being torpedoed off Cape Hatteras, March 1942



Tom Hanks's “Greyhound,” the historically accurate film about a World War Two convoy under attack from German U-boats, depicts the harrowing early years of the war when merchant shipping faced near-constant threat of attack. The attacks took place in the mid-North Atlantic where, until later years of the war, air coverage could not protect the ships. But in the very first months of the war, the U-boats converged on shipping within sight of the United States’ coastline. The bungled response to the U-boats lurking off the Eastern seaboard offers some surprising lessons for responding to the current pandemic.


With Germany’s December 11, 1941 declaration of war on the United States, Hitler’s navy launched what was known as Operation Paukenschlag, or Drumbeat, the cross-Atlantic attack on shipping plying the Eastern seaboard lanes. Five long-range U-boats departed in the third week of December from their pens in occupied France. They came within just miles of the Eastern seaboard early in January 1942.


What they saw shocked and delighted them. It was the start of what the U-boat sailors called the “Second Happy Time,” after an earlier period of great success against Allied shipping.

Lighthouses flickered, headlights beamed, signs glowed, and household and business windows illuminated the night. Arriving off New York City, “I found a coast that was brightly lit,” one U-boat captain recalled. “At Coney Island, there was a huge Ferris wheel and roundabouts —could see it all. Ships were sailing with navigation lights. All the light ships, Sandy Hook and the Ambrose lights, were shining brightly. To me this was incomprehensible.”


Other Germans arriving along the East Coast shared his astonishment. They had traveled from blacked out Europe. But surfacing off American cities and resort towns from Portland, Maine, to Miami, observed another U-boat officer, “we could distinguish equally the big hotels and the cheap dives, and read the flickering neon signs…. Before this sea of light… we were passing the silhouettes of ships recognizable in every detail and sharp as the outlines in a sales catalogue…. All we had to do was press the button.”


It was a turkey shoot. Those first five U-boats bagged 23 ships within days. The German navy couldn’t afford to dispatch more than a dozen U-boats at a time to the happy hunting ground that spring. But with so many targets silhouetted against the lights glaring out from 1,500 miles of coastline, they wasted no torpedoes. Three ships on average went to the bottom every day that spring.


From Canadian to Caribbean waters, those scant few U-boats on patrol at any one time sank a total of more than 360 merchant ships and tankers--about 2,250,000 gross tons--in the first half of 1942. An estimated 5,000 lives, mostly merchant seamen, were lost.


Americans on shore and passengers on flights gaped at the horrifying spectacle of ships exploding and sinking. Vacationers watched tankers burn to the waterline and found debris and bodies washed up on shore. With oil and food supplies suddenly lost in vast quantities, rationing began.


But those same tourists and beach goers shared some of the blame for the carnage and economic toll. The resort towns and other coastal sites where they flocked refused to dim the hotel, restaurant, shop, boardwalk and carnival lights. They were beacons that drew people—customers—from far and wide.


And they nearly lost the nation World War Two.


Historian Samuel Eliot Morison wrote in his official postwar naval history, 


“One of the most reprehensible failures on our part was the neglect of the local communities to dim their waterfront lights, or of military authorities to require them to do so, until three months after the submarine offensive started. When this obvious defense measure was first proposed, squawks went up all the way from Atlantic City to southern Florida that the ‘tourist season would be ruined.’ Miami and its luxurious suburbs threw up six miles of neon-light glow, against which the southbound shipping that hugged the reefs to avoid the Gulf Stream was silhouetted. Ships were sunk and seamen drowned in order that the citizenry might enjoy business and pleasure as usual.”


The U.S. government also shared in the blame. Fearing a demoralized public in the early months of what was sure to be a long and costly war, federal authorities blocked reports about the catastrophic shipping losses and the Navy’s inability to stop the U-boat onslaught. People didn't realize that they could help stop the carnage.


Sound familiar? Instead of refusing to turn out the lights, Americans today are in large numbers refusing to mask and social distance. Many share those earlier Americans’ disbelief in the early months of the war that their actions harmed the country and put their fellow citizens at grave risk. Businesses now, like then, worry about losing customers. Unwilling to undermine the economy, some state and federal officials have blocked public access to infection data and deny the gravity of the public health risk.


Like the global war in early 1942, we are just in the first months of the coronavirus pandemic. But in late spring of 1942, American authorities finally wised up to the need to come to grips with the U-boat onslaught.


Public campaigns urging voluntary dimming of the lights weren’t working. Finally, on April 18, U.S. military and federal officials responsible for protecting the Eastern seaboard ordered a shoreline blackout and a dim-out of coastal cities.


With reluctance from some quarters, headlights and windows were masked, hotels darkened their signs, lighthouses dimmed their beacons, businesses shut off the lights at dusk. Penalties were enforced against those who refused to turn their lights off.


The benefits came quickly. The U-boats were deprived of their silhouetted sitting ducks. Coastal shipping losses dropped immediately, to 23 that month. The “curve” of sinkings flattened. Convoys with naval escorts such as the one depicted in “Greyhound” further reduced losses until, in July, there would be only three sinkings, and then none inside U.S. coastal waters for the rest of the year.


As officials and individuals consider their responsibility for halting the coronavirus outbreak, we would do well to recall this earlier failure to act to defeat a common enemy. We should learn from the past and act promptly and decisively to enforce masking before the coronavirus sinks the nation.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176634
30 Years Later: Saddam Hussein's Fateful Decision to Invade Kuwait

Oil Well Set Fire during Iraqi Retreat, Kuwait 1991


In the summer of 1990, Iraq was on the verge of bankruptcy and already in arrears on payments to several international creditors. It responded to its predicament by invading its southern neighbor, Kuwait. Thus started a long international crisis. It ended only after the onset of a US-led military campaign in mid-January 1991 to liberate Kuwait. 

The 1991 Gulf War brought an ignominious defeat to Iraq, which diminished its regional status and weakened the regime of Saddam Hussein, the Iraqi President. Contemporaries could only surmise that this chain of events was the act of an irrational dictator. Indeed, Saddam was often compared to Hitler. Yet once oil is introduced into the picture, it turns out there was logic, after all, to Saddam's madness.

Saddam's fate was intertwined with that of oil from the very beginning. The Iraqi Baath party, which Saddam led, came to power in 1968, just as oil prices started climbing. In the preceding decade, industrialized countries had come to rely on oil to meet their energy needs; world demand was surging while global capacity remained the same. The first to spot an opportunity was the Libyan junta, which demanded in 1969 a 55 percent share of the profits on its oil (up to that point, profits were divided evenly between the host government and the corporation that pumped the oil out of the ground). In 1972 Iraq nationalized its oil industry and gained full control over its revenue. A year later, the Organization of Petroleum Exporting Countries (OPEC) reacted to another round of Israeli-Arab fighting by cutting production quotas and raising prices. The revolution in Iran in 1979 temporarily removed Iran's oil from the market, causing panic in Western countries. Lines stretched for hours at gas stations across the developed world.

As a result, during the 1970s the price for a barrel of oil shot up from $3 to $30. The impact on Iraq was dramatic. State revenue from oil rose from $487 million in 1968 to $12.2 billion in 1979. All this came at an opportune time for Saddam. He needed the money to overcome several thorny issues. There was an on-off Kurdish rebellion in the north, where major oil fields were located. Iraq's other oil fields lay in the south, which was dominated by Shiites, the country's majority population, who were also suspected of scheming against the government. Sunnis such as Saddam resided in the center of the country, yet Saddam did not truly represent even them: most of the senior officials and officers in his administration hailed, like their president, from the same tribe residing in the environs of the city of Tikrit. How could one man control such a divided country?

Saddam used the enormous resources at his disposal to either bribe or intimidate the Iraqi population. To those who were willing to support the regime, the state would offer full employment with decent pay by expanding the state bureaucracy and investing in a large military-industrial complex, petrochemical industry, and huge infrastructure projects. If that was not enough, the regime also provided low-cost loans for housing and higher education scholarships. The Kurds in the north, finally subdued in 1975 after a military operation, were offered social welfare handouts. Those who chose to resist the regime would face a vast secret police and a formidable army.

Trouble began in 1980 when oil prices started to fall. After a decade of high oil prices, industrialized countries shifted to other sources of energy such as coal, gas, and nuclear power. In 1986 alone the price plummeted from about $28 to $12 a barrel. OPEC was unable to stop this trend: it tried to dictate production quotas but had no mechanism to punish cheaters. Countries worried about losing market share kept selling their oil no matter how low the price fell.

For the time being, Iraq was shielded from the effects of the downturn. Throughout the years 1980 to 1988, Iraq received lavish subsidies from Saudi Arabia and Kuwait. This was the way in which these countries supported Iraq in its war against fundamentalist Iran whose attempts to export its revolution frightened governments across the region. However, when the Iran-Iraq war ended, Iraq had to face the music on its own.

By 1990 Saddam's patience was running thin. Another year of low oil prices could force him to cut the budget. He and his ministers had no doubt that should he do so, his regime would collapse. Saddam was nothing without the benefits he offered to his people.

In Saddam's view, the main culprit was Kuwait. Indeed, Kuwait was a consistent quota-cheater, selling every year about 1.5 million barrels of oil more than OPEC wanted it to. Unlike Iraq, for which oil was the sole export, Kuwait was also a processor and marketer of the black gold. It owned three refineries in Europe and 6,500 service stations across the continent under the logo "Q8." Low oil prices helped attract customers to buy the products of its petrochemical industry and fuel at its gas stations.

Throughout the first six months of 1990, Iraqi officials tried in a series of conferences and summit meetings to huff and puff Kuwait into submission. But the oil sheikhdom remained defiant. By June, Saddam had ordered the Republican Guard to prepare for an invasion, and two months later Iraqi tanks rolled into Kuwait City. Had Saddam succeeded in annexing Kuwait, as he had intended, Iraq would have become an oil superpower, equal in weight to Saudi Arabia. Then Saddam could have tried to whip OPEC into shape and dictate prices.

It was clear from the outset that this was a desperate gamble that put Iraq on a collision course with Washington. But Saddam believed he had no other choice. As one senior Iraqi minister summed it up in January 1991: "if death is definitely coming to this people and this revolution, let it come while we are standing." 

https://historynewsnetwork.org/article/176672
Who’s Our Roy Cohn?



Was Roy Cohn evil?  There seems to be a consensus that he was.   

That is at least what we learn from two recent films on New York City’s most notorious lawyer: Matt Tyrnauer’s 2019 documentary, “Where’s My Roy Cohn?,” and Ivy Meeropol’s “Bully. Coward. Victim.  The Story of Roy Cohn,” just released last month on HBO.


A fixture of the New York political and social scene until his death from AIDS in 1986, Cohn made himself indispensable to some of the biggest movers and shakers of the late 20th century, including Joseph McCarthy, Ronald Reagan, members of the Genovese crime family, and Donald Trump.  A friend to the rich and famous, he was the bête noire of liberals and the left.  


As an assistant prosecutor in the espionage case of Julius and Ethel Rosenberg, Cohn illicitly lobbied trial judge Irving Kaufman to impose the death sentence, carried out by electrocution in June 1953.  Later (as we learn in Meeropol’s film), Cohn admitted to fellow lawyer Alan Dershowitz that he had helped frame Ethel Rosenberg by coaching false testimony from her brother David Greenglass, a miscarriage of justice later confirmed by Greenglass in a 60 Minutes interview.  About using the courts to murder an innocent mother of two young children, Cohn had no more remorse than he had about not paying his income taxes.  “We framed guilty people,” he reportedly told Dershowitz.  To protégé Roger Stone he said, “…if I could have pulled the switch, I would have.” 


Cohn used the notoriety he achieved helping to execute the Rosenbergs and convict other accused Communists to land the coveted position of chief counsel to Wisconsin Senator Joseph McCarthy in his notorious Congressional crusade against alleged communist infiltration of the national government and the military.  Cohn’s constant presence at McCarthy’s side in the televised Army-McCarthy hearings of 1954 earned him international fame, while also dashing his political ambitions once McCarthy’s manic overreach brought censure from the public and the Senate. 


His history as a McCarthyite clung to Cohn like a cheap suit, but as he retreated home to New York City he made that suit a fashion statement, amping up his anticommunist rants and selling his legal services to anyone who needed a ruthless courtroom advocate with no compunction about twisting the truth.  His clients included mobsters, crooked politicians and Gotham’s worst powerbrokers, the city’s real estate titans, among them the young Trump, who at the time was turning his father’s outer-borough business into an international empire.  Cohn cherished his friendship with Trump, whom he defended in the 1970s against a federal suit for racial discrimination in housing.  Trump repaid that loyalty by pretending he didn’t know Cohn when the latter’s AIDS diagnosis hit the press.  Until, of course, he needed a truly ruthless advocate during the Russiagate crisis, when Trump’s long-dead former personal fixer once again became “my Roy Cohn.”


The banality of “Cohn’s evil”


Many interviewees in both documentaries testify to Cohn’s “evil”: his social pathology, his lack of scruples, conscience, or remorse, and his nearly innate criminality.   An unattributed voiceover sets the tone for Tyrnauer’s film from its opening scenes:  “Roy Cohn’s contempt for people, his contempt for the law was so evident on his face that if you were in his presence you knew you were in the presence of evil.”  According to former prosecutor and author James Zirin, on whom Tyrnauer relies heavily for background and color, “He was like a caged animal.  If you opened the door to the cage, he would come out and get you.”  


Meeropol’s film sets almost the same tone, though differently framed by her experience as the granddaughter of the Rosenbergs.  Interviewing very few of the same informants, Meeropol seems to come to the same conclusion.  John Klotz (a lawyer who investigated Cohn in the 1970s) declared, “Roy Cohn was one of the most evil presences in our society during most of my adult life.”  As Cohn’s cousin, journalist David Lloyd Marcus (who also appears in “Where’s My Roy Cohn?”), succinctly put it, Cohn was “the personification of evil.”


In cataloguing Cohn’s misdeeds, both films are precise and exhaustive.  They leave little doubt that this behavior resembles textbook examples of a personality disorder or social pathology.  A lawyer from Cohn’s firm could have been providing a profile to Psychology Today when he tells Tyrnauer that Cohn “knew no boundaries” and that “if you were on the right side of him, you were OK.  If you were on the wrong side of him, it was terrible.” According to Zirin, Cohn was “a personality in disarray.  A personality in anarchy, which had no rules, had no scruples.  It had no boundaries.”    


But as Cohn himself might declare, “what’s it to us?”  Why should we care about this psychologically crippled character? How was Cohn’s corruption and unscrupulousness so different from anyone else’s?  Isn’t the world Cohn inhabited, of socialites and their lawyers, of mobsters and real estate speculators, full of sociopaths and disordered personalities like Cohn’s?  It’s only if we think of Cohn as especially emblematic of the age that we spend so much time learning about his life.  


Tyrnauer only hints at a broader perspective that might have been useful in understanding Cohn as a historical figure.   “Roy was an evil produced by certain parts of the American culture,” writer Anne Roiphe (another Cohn relative) tells us.  But what parts?  “When you look at Cohn’s life,” the voiceover opening Tyrnauer’s film intones, “you are shining a light on demagoguery, hypocrisy, and the darkest parts of the American psyche.” Tyrnauer cuts to scenes of Cohn with a young Donald Trump, Trump speaking and, at the end of the montage, the scene of a violent attack on a black protester at a recent Trump rally. 


If it’s not one thing, it’s your mother


Unfortunately, we don’t learn much from such juxtapositions.  Did Cohn create Trump?  Was he Trump avant la lettre, as New York Magazine columnist Frank Rich recently proposed?  Or did they have some origin in common?  As they strain to explain Cohn’s significance, both films get hung up on the notion that Cohn was “evil.”  Tyrnauer’s suffers the most, and “Where’s My Roy Cohn?” takes off from that assumption on a psychoanalytic tangent that is all too familiar: It was all about his mother. 


David and Gary Marcus, Cohn’s cousins on his mother’s side, recount the story of a Passover hosted by Dora Marcus Cohn at the family apartment, when a maid died preparing the Seder dinner.  Dora, Roy’s mother, hid the body and kept the incident a secret, until one of the children asked the traditional question, “why is this night different from every other night?”  Dora blurted out “Because there’s a dead maid in the kitchen.”  


Cousin Gary thought Dora was more troubled by the fact that the maid’s death had interrupted the Seder, “not that a life had been lost.” For David, this incident revealed the origins of Roy’s pathology: “That’s totally Roy’s spirit.  His lack of ethics, his lack of empathy.  That came from Dora.  For Roy, life was transactional.  It was all about connections and accruing power.”  


In Tyrnauer’s extensive picture of her, Dora, the homely rich Jewish daughter who at best could achieve a desperately arranged marriage, doted on her only son, holding him so close that he could not break free until her death.  Dora here seems blamed both for his homosexuality and his need to hide it.  And thus for Cohn’s “transactionalism” of corrupt exchange and manipulation, in which people were simultaneously shields against public scrutiny (as was lifelong friend, Barbara Walters, who reluctantly bearded for him) and instruments of power. 


But, wait.  What about the father?  We learn some things about Al Cohn, but not really enough.  Father Al bought his judgeship with money from his wife’s family in exchange for marrying Dora.  He then served the New York political machine for the rest of his long career on the bench, and put his son in contact with some of the biggest hitters of Tammany Hall, including Bronx political boss Edward J. Flynn, a slick Roosevelt loyalist who nonetheless helped Tammany run the city like a cashbox through the middle of the twentieth century.  Later, Cohn the younger maintained that close relationship, serving Tammany stalwarts like party boss Carmine DeSapio, Brooklyn’s Meade Esposito, and Bronx party chair Stanley Friedman, who also joined Cohn’s law firm.  Cohn, like his father, was a creature of the machine. In a city run so “transactionally,” why do we even need to be talking about Dora?  


The father’s Tammany connections also help explain the son’s ardent anticommunism. The Bronx political machine, led not only by Flynn but also by its representatives in Albany, stood at the forefront of New York anticommunism during the 1930s, the “red decade.”  Bronx Democrats were responsible for some of the most repressive anticommunist legislation on the eve of World War II, long before the Cold War or any inkling of a Soviet threat to national security.  In March 1940, it was a Bronx Democrat, state Senator John Dunnigan, who launched the city’s Rapp-Coudert investigation of Communists in the public schools and municipal colleges, leading to the firing of several dozen and serving as a prelude to McCarthyism a decade later. 


Why did Tammany hate communists?  Throughout the 1930s, Communists in the teachers’ union and elsewhere effectively challenged Tammany control of city schools and agencies, a wrench in its patronage system rivalled only by reformers in Mayor Fiorello LaGuardia’s City Hall.  Clearly, not all Democrats were “liberal” like FDR, or like Roy Cohn claimed he once was, before he decided to support Republican Cold Warriors such as Reagan.    


Meeropol does a bit better in historically situating Cohn.  In contrast to Tyrnauer’s facile psychoanalysis, she traces Cohn’s evil back to his pivotal role in the Rosenberg case, though the history stops there.  Her film is not as slickly cut or scored as Tyrnauer’s, but Meeropol is more honest and insightful, as she continues the project of self-exploration through family history begun in her 2004 documentary on her grandparents’ case, “Heir to an Execution.”  This project, which includes extensive interviews with her father Michael, the elder of the two Rosenberg/Meeropol children (her uncle Robert is notably absent from this latest film), is valuable as history in its own right. 


Meeropol also provides a more nuanced picture of Cohn’s closeted homosexuality, which is at once public and repressed, as well as weirdly honest, dishonest and corrupting, all at the same time.  In this she follows the lead of Tony Kushner, whose play Angels in America figures prominently in her reconstruction of Cohn’s disturbed and disturbing life.  Like Kushner, Meeropol sees something convoluted and paradoxical about Cohn, even as he represents the worst of American culture.  “To call him ‘evil’ -- it’s true,” journalist Peter Manso tells her at one point.  “But it doesn’t explain a hundred other things about Roy Cohn.”   


That’s a good point.  But it would be nice to learn a few more of those hundred things.  More needs to be said about Cohn’s resemblance to and affinity for Trump, the historical roots of that “strain of evil,” as Rich puts it, in New York’s social register and political “favor bank.”  It’s not enough to justify our interest in Cohn merely by connecting him to Trump, as Tyrnauer does.  We need to know who and what enabled each of them to exist.  Many of those enablers are the same people.  Their story is worthy of yet another film.

https://historynewsnetwork.org/article/176675
From Historical Injustice to Contemporary Police Brutality, and Costs of Monuments to the Unworthy

Capt. Silas Soule





On June 22, demonstrators who attempted to topple a statue of Andrew Jackson located in Lafayette Square near the White House were forcibly removed by police in riot gear. Jackson rose to political prominence on the laurels of his exploits as an Army officer during the War of 1812 and the Battle of New Orleans, which he used as an effective platform for a political career culminating in a successful run in 1828 as the Democratic candidate for president.  In addition to the military campaigns against the British, Jackson played a prominent role in wars waged against Native nations of the Southeast, while becoming infamous as president for carrying out the policy of Indian Removal, resulting in the series of atrocities perpetrated against the so-called Five Civilized Tribes known as the Trail of Tears, in which at least 15,000 died.  The day after the attempt to fell Jackson’s statue, President Trump announced he was working on an executive order to “reinforce” current laws criminalizing the defacement of monuments honoring military leaders. On July 12, the president followed up with a threat of a 10-year prison sentence for anyone damaging federal statues or monuments.

Jackson is only one of many deeply flawed historical figures whose legacies have come under scrutiny and criticism by Americans demanding social change and political reform in the wake of the police killings of Breonna Taylor, Elijah McClain, and George Floyd. Many are rightly questioning why we continue to honor such leaders as Jackson, as well as figures related to Spanish colonial conquest and genocide such as Christopher Columbus and Junípero Serra, or Robert E. Lee and other military officers and politicians who swore their loyalty to the Confederacy and the cause of slavery during the Civil War period.  For many, the display of such monuments shows a gross disregard for the complexity of our shared historical past, while continuing to exclude and silence the experiences and memories of marginalized peoples. Within this impassioned social environment, it didn’t take long for demonstrators to turn their attentions to such monuments, defacing and even destroying those celebrating historical figures with violent histories, especially in their treatment of non-European peoples.


These include one of the numerous statues honoring Junípero Serra, founder of nine Spanish Missions in what is now California. On June 19, citizens brought down a statue honoring him in San Francisco’s Golden Gate Park. A Catholic bishop condemned the action, while California officials in other cities such as San Luis Obispo proactively removed other Serra monuments to prevent similar actions. 


The famed cowboy Kit Carson is another legendary figure glorified in the mythology of the Old West. In reality, Carson was a scout and soldier who took part in several massacres of Native peoples in California, Oregon, and Washington in the mid-19th century, and was a leading participant in the war with the Diné (Navajo) during the period of the Long Walk. In late June, following the toppling of a statue of Columbus in Denver’s Civic Center Park, city officials took action to remove a nearby statue of Carson.


As a national reckoning propels a movement to remove what many see as tributes to racism and oppression, perhaps we might also imagine what could replace such deeply fraught monuments. We needn’t ignore all of our history in the process: Two Civil War-era heroes who rebelled and refused to join a brutal attack against Native peoples represent the moral courage we would do well to honor.


In fact, we should’ve been memorializing these soldiers all along. The ethical stand they took illustrates America’s highest principles – a 156-year-old life lesson that puts to shame many figures who have been celebrated in American history as heroes. It is important to highlight the fact that among all ethnic minorities, Native Americans continue to suffer the highest rate of death at the hands of police.


It was November 1864 when Army Capt. Silas Soule and Lt. Joseph Cramer were at Fort Lyons, Colorado, near a peaceful encampment of Arapaho and Cheyenne people who had settled at Sand Creek. The military had promised protection to the Native people for the coming winter amid growing hostilities from settlers in the Colorado Territory.


A rush of immigrants had been drawn by the discovery of gold at Pikes Peak in 1858. For many, inflamed by anti-Indian sentiment and eager for undisturbed access to land, resources, and gold, warfare was the preferred option. 


Col. John Chivington, commander of the 3rd Colorado Cavalry, was only too willing to oblige. After failing to find a group of Cheyenne dog soldiers who had been engaging settlers and troops alike, he decided to take his revenge at Sand Creek. He revealed his intention to attack the peaceful encampment the night before the assault.  Soule and Cramer protested, to no avail. Later, seeking to expose the crime of the massacre and win justice for its victims, Soule wrote to the former commander at Fort Lyons, Major Edward Wynkoop. In this letter, which includes a graphic description of the massacre, Soule says he was “indignant” over the plan.


Like the demonstrators who’ve urged change over the past month, he took action. On the night before the attack, Soule confronted officers readying for the assault, declaring “that any man who would take part in the murders, knowing the circumstances as we did, was a low-lived, cowardly son of a bitch.” 


Cramer spoke up to Chivington himself, saying, “I thought it murder to jump them friendly Indians.” Chivington replied: “Damn any man or men who are in sympathy with them.”  As the vicious assault unfolded the next morning, and under threats of death, Soule refused to participate and ordered soldiers under his direct command to stand down. Cramer followed suit. Around 200 people, many of them women and children, died in the massacre directed by Chivington. No one knows the exact figure. Yet the losses would have been much higher had Soule and Cramer not resisted.


It’s a reality not lost on the descendants of Arapaho and Cheyenne who were present that fateful day. For their bravery and willingness to stand up to evil, Soule and Cramer are still honored by the descendants of the survivors of Sand Creek. Soule was also instrumental in exposing the brutality of the massacre and gave harrowing testimony before a military commission that investigated Chivington’s actions. His descriptions shocked the nation.


He would be murdered on the streets of Denver three months later.  Aside from their honored place in the memories and stories of Cheyenne and Arapaho people, Soule and Cramer have been largely overlooked by historians. Now more than ever, theirs is precisely the kind of example Americans could learn the most from. 


None, perhaps, could benefit more from Soule’s and Cramer’s acts of humanity and moral courage than members of our nation’s police forces, who continue to kill unarmed people at alarming rates while others all too often stand passively by as it happens.


We might wonder how incidents like the one that led to Floyd’s killing could have turned out differently if examples like Silas Soule were venerated in place of figures such as Christopher Columbus, Andrew Jackson, Kit Carson, and Robert E. Lee.  Where are the monuments to Soule and Cramer?  Where, too, are the monuments for Cheyenne and Arapaho leaders such as Black Kettle, a leader of the Southern Cheyenne who was wounded as he greeted the soldiers with a white flag of truce at the start of the attack? To the Arapaho Chief Little Raven, who survived the massacre and dedicated his life to peace? Or to another Cheyenne chief, White Antelope, who ran toward soldiers “holding up his hands and saying, ‘Stop! Stop!’”? Then, as the firing intensified, he sang his death song as he was killed: “Nothing lives long, except the Earth and the mountains.”  Black Kettle survived the Sand Creek massacre only to be killed four years later at the Washita River massacre of 1868, carried out by George Armstrong Custer.


To its credit, the Colorado legislature approved in 2017 a monument to all the people murdered at Sand Creek. But it’s been mired in red tape and disputes over the location of its placement ever since.


Where, in our public consciousness, are stories and events that speak to these brighter understandings of humanity, history, and the value of life? At a time when Washington is choosing a new mascot for its NFL team, something its owner had recently vowed he would never do, this could be the moment to forge a new path into the future. Let examples of peace, kindness and moral courage guide us.

https://historynewsnetwork.org/article/176676
What's in an Un-Naming? Berkeley's Kroeber Hall

Alfred Kroeber with Ishi, the last known living member of the Yahi indigenous people of California, 1911.




I welcome the news that the Berkeley campus has joined the un-naming movement. It provides us with an opportunity to learn about histories we’ve forgotten and to make the honoring of spaces and places into a democratic process rather than a done deal decided by elites in back rooms. 


John Boalt, the 19th century anti-Chinese crusader, is already banished from Berkeley’s law school walls. The University is likely to follow the example of a local elementary school and remove the name of John LeConte, an unreconstructed Southern racist, from the building that houses the physics department. 


Next on the list is anthropologist Alfred Kroeber (1876-1960), after whom Kroeber Hall is named. He didn’t campaign to restrict immigration to the United States on the basis of race, and he wasn’t a white supremacist. But he was the key academic in a department and museum that rose to fame – literally and scientifically – on the bodies of the Native dead.  


Kroeber’s reputation in anthropology rests upon his prodigious scholarship, his success in building Berkeley’s department of anthropology into a nationally ranked program, and his documentation of the cultural experiences of California Indians prior to Spanish colonialism and American genocide. Kroeber supported Native land claims, for which the Council of California Indians acknowledged the role he had played in the struggle “for long delayed justice.”


To most California Tribes and Native activists, especially those in the Bay Area, however, Kroeber’s legacy is more bitter than sweet. 


Kroeber failed in his responsibility to speak out publicly about the genocide that followed the Gold Rush. “What happened to the California Indians following 1849 – their disruption, losses, sufferings, and adjustments – fall into the purview of the historian,” he wrote in 1954, “rather than the anthropologist whose prime concern is the purely aboriginal, the uncontaminatedly native.” The transformation of everyday life after contact was traumatic, Kroeber conceded, but, he added, “it is not gone into here.” It wasn’t that he didn’t know. He just didn’t go into it.


One consequence of this moral cowardice was that until the 1960s a crudely racist imagery about California Indians dominated public discourse in the state, making it easier to frame their near extermination in the imagery of natural history, subject to inevitable processes of erosion and decline, rather than as the result of a planned human intervention. Many people hold Kroeber accountable because he had resources and authority to influence public opinion. Of course, one person, even Kroeber, did not wield such power, but he became the personification of meticulous amnesia. Unlike his widow Theodora Kroeber, who spoke out against the genocide, and his colleague Robert Heizer, who at the end of his career issued a mea culpa for his role in treating California’s Tribal peoples as “non-persons,” Kroeber kept his silence.


As a core faculty member of Berkeley’s department of anthropology (1901-1946) and as director of the anthropology museum (1925-1946), Kroeber oversaw the University’s collection of more than ten thousand Native human remains plundered from Native graveyards, and tens of thousands of Native artifacts that were excavated from graves or bought cheaply from the desperate survivors of genocide. The University backed up Kroeber’s collecting frenzy and in 1948 proudly showed off to Life magazine its “bone collection [that] has filled two museums and overflows into the Campanile.”


I’ve recently read hundreds of Berkeley’s archaeological reports. Not once have I come across an account that treats the excavated as other than specimens for research. No prayers are spoken, no rituals practiced, no indication that the living and the dead share a common humanity. 


Kroeber failed to document how Native peoples survived against all odds and lived to fight another day. Activists looking for inspirational accounts of struggle and resistance find little solace in Kroeber’s work, which has a tendency to be nostalgic for the good old days rather than forward-looking. 


Kroeber was not particularly interested in the cultures of local Tribes, reporting that they had made an “unfavorable impression” on early voyagers as “dark, dirty, squalid, and apathetic.” Moreover, he concluded in 1925 that the Bay Area Indians were “extinct as far as all practical purposes are concerned.”


Walking around the Berkeley campus, it is easy to get the impression that the Ohlone are extinct. A plaque at the university’s entrance acknowledges that a Spanish expeditionary force set up camp here in 1772. There are no plaques to mark the settlements of people who lived in this area a few thousand years earlier. The football stadium commemorates faculty, staff, and students who died during World War I. There is no memorial to the thousands of Ohlone who lived and died in this region. A graceful archway celebrates the life of Phoebe Hearst whose philanthropy funded the excavation of Native graves. There is no comparable recognition of the thousands of people who were dug up from their graves in the name of science and “salvage anthropology.” 


Today, the descendants of the Verona Band of Mission Indians and other Ohlone people in the Bay Area are asserting their right to federal tribal sovereignty and to reclaim their ancestral lands, cultural artifacts, and the remains of their dead that are among the nine thousand still held by the University.  


We should take advantage of this un-naming opportunity to honor the people who made Kroeber’s professional success possible and who are un-remembered in the university’s landscape.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176677 https://historynewsnetwork.org/article/176677 0
Constitutional Textualism, Slavery and Undocumented Immigrants





Writing on History News Network, Elliott Young, professor of history at Lewis & Clark College, examined the recent Supreme Court decision in Department of Homeland Security v. Thuraissigiam (2020). Young described the decision as a “fundamental threat to equal protection of the law for all undocumented immigrants” that defied long-established legal principles. I strongly support Young’s arguments and, in this article, I wish to extend them. Equally distressing is that it was a seven-to-two majority decision, with Ruth Bader Ginsburg and Stephen Breyer joining the rightwing court bloc. Sonia Sotomayor and Elena Kagan posted a powerful joint dissent. 


The 1996 Illegal Immigration Reform and Immigrant Responsibility Act “placed restrictions on the ability of asylum seekers to obtain review under the federal habeas statute.” In this case, Vijayakumar Thuraissigiam, an undocumented immigrant from Sri Lanka who applied for refugee status because, as a Tamil, he faced beatings, torture, and death, claimed that since he had already entered the territory of the United States, he was entitled to due process. Thuraissigiam was represented by the American Civil Liberties Union (ACLU). The Court upheld the constitutionality of the 1996 law and ruled that he was not.


 The majority decision for the rightwing bloc was written by Samuel Alito. Alito argued “Respondent’s Suspension Clause argument fails because it would extend the writ of habeas corpus far beyond its scope ‘when the Constitution was drafted and ratified’” and that the “respondent’s use of the writ would have been unrecognizable at that time.” Not once did Alito reference the 14th Amendment to the United States Constitution. Breyer and Ginsburg, in a concurring opinion written by Breyer, stated that they supported the court majority “in this particular case,” but not the broader assertions made by Alito.


In a dissent endorsed by Kagan, Sotomayor wrote that “The majority declares that the Executive Branch’s denial of asylum claims in expedited removal proceedings shall be functionally unreviewable through the writ of habeas corpus, no matter whether the denial is arbitrary or irrational or contrary to governing law. That determination flouts over a century of this Court’s practice.” She argued “Taken to its extreme, a rule conditioning due process rights on lawful entry would permit Congress to constitutionally eliminate all procedural protections for any noncitizen the Government deems unlawfully admitted and summarily deport them no matter how many decades they have lived here, how settled and integrated they are in their communities, or how many members of their family are U. S. citizens or residents.” If Sotomayor is correct, and I believe she is, the Thuraissigiam decision puts all DACA (Deferred Action for Childhood Arrivals) recipients at immediate risk.


I’m not a big fan of the national Common Core Standards and their high-stakes standardized reading tests, but as a historian and social studies teacher, I like the idea that they promote close reading of text. The late Supreme Court Associate Justice Antonin Scalia, the paragon of judicial conservatism and the patron saint of the Supreme Court’s dominant bloc, justified his rightwing jurisprudence by claiming to be a textualist. According to Scalia, “If you are a textualist, you don't care about the intent, and I don't care if the framers of the Constitution had some secret meaning in mind when they adopted its words. I take the words as they were promulgated to the people of the United States, and what is the fairly understood meaning of those words.” 


But, as Shakespeare reminded us in Hamlet’s famous “To be, or not to be” soliloquy, “There’s the rub.” There is always “the rub.” The problem with both Common Core and Constitutional textualism is that words have different meanings at different times and to different people, and sometimes words are chosen not to convey meaning but to obscure it. Understanding “words” requires historical context.


The word slavery did not appear in the United States Constitution until slavery was banned in 1865 by the Thirteenth Amendment, because the Constitution, as originally written, represented a series of compromises and contradictions that the authors left to be resolved in the future. It was a deferral that, three score and fourteen years later, led to the American Civil War.


The humanity of Africans was generally denied at the time the Constitution was written; they were chattel, property. But in Article I, Section 2 of the Constitution, which established the three-fifths formula for representation in Congress, enslaved Africans are referred to as “other Persons.” And in Article IV, Section 2, the Constitution mandates that “No Person held to Service or Labour in one State, under the Laws thereof, escaping into another, shall, in Consequence of any Law or Regulation therein, be discharged from such Service or Labour, but shall be delivered up on Claim of the Party to whom such Service or Labour may be due.” 


I read text pretty well. As persons, enslaved Africans should have been included in the people of the United States who wrote the Constitution “in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.” 


But of course, they weren’t. Just reading the Constitutional text, without context, does not help us understand what Scalia called “the fairly understood meaning of those words.”


Unfortunately for the nation, political bias blinded Scalia while he was on the Supreme Court, and it blinds the rightwing cabal that dominates the Court today, so badly that they neither read with any real understanding nor attend to the historical record. Because of this, one of the most pressing issues in the 2020 Presidential election is the appointment of future Supreme Court Justices who can read text with understanding, especially the 14th Amendment to the United States Constitution, and who are willing to search for supporting historical evidence.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176680 https://historynewsnetwork.org/article/176680 0
The History of the Boycott Shows a Real Cancel Culture

Charles Boycott caricatured in Vanity Fair, 1881.




J.K. Rowling, Margaret Atwood, Salman Rushdie, and Noam Chomsky are among dozens of writers, artists and academics who signed a July 7 letter in Harper’s Magazine that warned of growing “censoriousness” in our culture. They described this as “an intolerance of opposing views, a vogue for public shaming and ostracism, and the tendency to dissolve complex policy issues in a blinding moral certainty.” 


The writers didn’t use the term “cancel culture,” which Wikipedia describes as “a form of public shaming in which internet users are harassed, mocked, and bullied.” But that’s what they are talking about.


While cancel culture deploys modern technology, it is hardly a new tactic. It most famously dates to 1880 in the west of Ireland, when English land agent Charles Boycott's last name became a verb for the practice.


Agrarian activists targeted the County Mayo estate that Boycott managed in the early stages of the Irish Land War, as tenants agitated for more influence over their rents and lease terms. Seasonal workers were pressured to withhold their labor from Boycott at harvest time, and nearby shopkeepers were intimidated into refusing to do business with him.


The boycott was born. 


Irish parliamentarian Charles Stewart Parnell had recommended the tactic weeks earlier during a speech at Ennis, County Clare, about 80 miles south of Mayo. 


“When a man takes a farm from which another has been evicted, you must shun him on the roadside when you meet him – you must shun him in the streets of the town – you must shun him in the shop – you must shun him on the fair green and in the market place, and even in the place of worship, by leaving him alone, by putting him in moral Coventry, by isolating him from the rest of the country, as if he were the leper of old – you must show him your detestation of the crime he committed.” 


(Parnell faced his own “canceling” in 1890 when his longstanding extramarital affair with Katherine O’Shea was revealed and created a public scandal.) 


Agrarian activist Michael Davitt used the image of a leper to describe those who did not support the Irish Land League’s fight against the landlord system. Any such person was “a cowardly, slimy renegade, a man who should be looked upon as a social leper, contact with whom should be considered a stigma and a reproach,” Davitt said.


No matter how righteous the cause of landlord-tenant reform, the tactics were taken to brutal extremes, including murder. In 1888, boycotted farmer James Fitzmaurice was shot at point-blank range in front of his young adult daughter, Nora, as they steered a horse cart to an agricultural fair in County Kerry.


Later, at a special commission exploring the land unrest in Ireland, his daughter testified that after the attack five separate travelers passed on the road because they recognized her as belonging to a boycotted family. Only one stopped, coldly noted that her father was “not dead yet,” then proceeded without helping.


Two men were convicted of the murder and hanged. More typically, intimidated or obstinate witnesses refused to testify against the perpetrators of murders and assaults against their boycotted neighbors.


Threatening notices or placards -- the 19th century print equivalent of social media posts -- appeared in town squares and at rural crossroads, naming names and often including crude drawings of coffins or pistols. 


Social ostracism was hardly new to Ireland, though. Even before the land war, some Irish considered it better to starve to death than to consort with the other side. 


In his 1852 book Fortnight in Ireland, English aristocrat Sir Francis Head described the reaction to attempted religious conversions tied to accepting food in the wake of the Great Famine. “Any Roman Catholic who listens to a Protestant clergyman, or to a Scripture reader, is denounced as a marked man, and people are forbidden to have any dealings with him in trade or business, to sell him food or buy it from him,” he wrote.


In his seminal work Social Origins of the Irish Land War, historian Samuel Clark said of boycotting: “The practice was obviously not invented by Irish farmers in 1880. For centuries, in all parts of the world, it had been employed by active combinations [social groups] for a variety of purposes.”


Yet Clark continued: “What was novel … was … the spread and development of this type of collective action on a scale so enormous that the coining of a new term was necessary. Boycotting was becoming the most awesome feature of the [Irish land] agitation.”


Cyberbullying is unpleasant, to be sure. But it is hardly the same as being beaten or murdered. Authors, academics, musicians, and others bothered by their work being “cancelled” might consider the original boycott for some needed perspective. Or perhaps they should leave the rough and tumble of the marketplace of ideas.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176682 https://historynewsnetwork.org/article/176682 0
Yes, Even George Washington Can Be Redeemed




George Washington was a slaveholder. 

For some Americans, this is reason enough to exclude our first president from the national pantheon. 

According to one poll, 18 percent of respondents believe he should be removed from Mount Rushmore. Others expressed themselves by defacing or toppling Washington statues.

Are these critics right? 

On the surface, it might seem so. American slavery was inexpressibly gruesome. Accounts from the time reveal the horrors of enslaved African-Americans being separated from their families, violently beaten, routinely raped by their owners, subjected to monotonous, backbreaking labor, and forced to live in filthy dwellings with no hope for improvement.

This was reality for millions of American blacks.

Washington benefited from slavery his entire life. He bought and sold slaves and sought to reacquire runaways. These facts are undeniable.

Does this make Washington, as a New York Times columnist states, a “monster”?

This critique fails to account for the specifics of Washington’s personal journey. Within the tragic reality of his owning slaves lies a unique and unexpected story. 

Like his fellow southerners, Washington was born into a society that accepted slavery. It is true he expressed no qualms about the institution until the American Revolution, but once he did, an extraordinary transformation began. 

The earliest change perhaps can be detected in Washington’s correspondence with Phillis Wheatley, an African-American poet who had composed verses dedicated to him. Washington wrote to her in 1776 praising her “great poetical Talents” and expressing his desire for a meeting. The request broke strict etiquette between slaveholders and black people.

Their correspondence highlights something Washington understood about African-Americans that was lost on many of his contemporaries: their abilities and humanity. Compare Washington’s reference to Wheatley’s “genius” with Jefferson’s harsh assessment that her poems “are beneath the dignity of criticism.”

Many of Washington’s closest associates during the war, such as Alexander Hamilton and Lafayette, opposed slavery. These individuals inclined Washington against the institution. Perhaps the greatest influence, however, was the many black people who served courageously during the war. 

After the Revolution, Washington began to speak of slavery in moral terms. He pondered ways to provide slaves with “a destiny different from that in which they were born.” He hoped such actions, if consummated, would please “the justice of the Creator.” 

Washington freed his slaves at his death—but this raises two questions: first, why didn’t he do so in his lifetime, and second, why didn’t he speak against slavery publicly?

First, we must note that Washington detested breaking up slave families, making it a policy not to do so. He realized, however, that freeing his slaves might make family breakups inevitable. Most of the slaves at his estate, Mount Vernon, belonged to his wife Martha’s family, the Custises, which meant he couldn’t legally free them. At Mount Vernon, Custis and Washington family slaves often intermarried. The Custis heirs regularly sold slaves, breaking up their families. Washington knew that if he liberated his slaves, some in the slave families would be free while the others would remain enslaved in Custis hands, vulnerable to being sold (which eventually happened). 

Mary V. Thompson's excellent book The Only Unavoidable Subject of Regret recounts that, as president, Washington developed elaborate plans to emancipate his slaves. Secret letters to family friend David Stuart reveal Washington trying to convince the Custis heirs to join him in manumitting their slaves together, preserving the families, and hiring them out to tenant farmers. Unfortunately, talks with potential tenants fell through. Washington continued to agonize over a situation where emancipation meant separating black family members. 

Second, we must note that while many founders were antislavery, several, such as South Carolina’s John Rutledge, sought to protect the institution by threatening disunion. This left antislavery founders in a difficult situation. They believed the nation could win independence, initiate a risky experiment in self-government, and survive in a dangerous world (threatened by predatory British, Spanish, and, later, French empires) only by uniting the strength of every state into one union. This necessitated compromises with slave states during the founding, most notably in the Constitution. 

Pulitzer Prize-winning historian Joseph Ellis believes these concessions were necessary, writing “one could have a nation with slavery, or one could not have a nation.” African-American leader Frederick Douglass saw the utility of the union the founders crafted, compromises included, arguing that, if the states separated, northern antislavery forces could less effectively influence southern slavery. 

Washington believed slavery was so divisive that it threatened the nation’s existence, potentially ending any hope of liberty for all Americans. He had good reason to believe this—during his presidency, an antislavery petition signed by Benjamin Franklin provoked much southern outrage.

Washington couldn’t find a satisfactory solution to slavery in life, but he sought to do so upon death. In his will, he ordered that his slaves be freed, the young be taught to read and write and to learn certain trades, and the orphaned and elderly slaves be provided for permanently. He forbade selling any slave “under any pretense whatsoever.”

These were revolutionary acts—educating slaves threatened the entire system. It revealed Washington’s belief that black people could succeed if given the chance. Again, compare this to Thomas Jefferson who once said they were “inferior to whites in endowments both of body and mind.” Jefferson and other Americans, including Abraham Lincoln, believed the two races couldn’t coexist and that the answer was to recolonize African-Americans abroad. Washington never supported these ideas and his will reveals he envisioned black people thriving alongside whites in America.  

George Washington’s achievements are well known—winning independence, presiding over the Constitutional Convention, and serving as the first president. While we cannot ignore his participation in slavery, we shouldn’t discount his remarkable transformation into someone who wished for its abolition and took steps personally to make things right, becoming the only major founder to free his slaves.

We can acknowledge Washington’s monumental victories for liberty while recognizing his personal struggle with slavery. In this time of national angst, Washington’s story helps us understand how the same country that once held humans in bondage can also be the world’s greatest beacon of freedom.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176678 https://historynewsnetwork.org/article/176678 0
What Does it Mean to be Progressive in 2020?

After suffering mightily from conservative disdain, directed at us and at any political principle beyond sticking it to us, Progressives now sense vindication. On the points Progressives have advocated over the past few decades, events, which is to say reality, have shown us to be right. Everyone but Republicans and fossil fuel companies has gone beyond talking about global warming to planning their responses. The majority of Americans like Obamacare, and the Kaiser Family Foundation found that 56% favor Medicare for All. Racism and sexism are recognized more than ever as deeply embedded flaws in our society that require systemic change to eliminate. Policing must be made safer for Black lives and for other lives, given the racism and sexism within it and a culture of impunity that insulates police from the people they should serve.


It took a cartoon version of conservative ideas to wake up the 20% to 30% of Americans in the middle to the speciousness of Republican political ideology. Progressive causes are becoming American causes.


I worry now that the greatest danger to the political success of progressivism is self-destruction. As soon as Biden pulled ahead in the primaries, David Siders and Holly Otterbein wrote for Politico about a “Never Biden” movement among Bernie Sanders’ supporters. Disappointed revolutionaries are seeking to break off a chunk of progressive support and ensure the victory of Trump and forces of the right. Their motives are as fuzzy as their thinking.


Here’s what I mean. Ted Rall says “Progressives Should Boycott the Democratic Party”. David Swanson tells us “Why You Should Never Vote for Joe Biden”. Victoria Freire says “Joe Biden doesn’t deserve our vote”.


Joe Biden is far from the ideal candidate for Progressives. Biden has personified the corporate wing of the Democratic Party for decades. He has a long history of moderate, even conservative positions as a centrist Democrat, which these articles detail as one of their major arguments. On the burning issues of the day, it is easy to find Biden statements and votes which anger Progressives: opposition to Medicare for All, endorsement of President Obama’s anti-immigrant policies, silencing of Anita Hill during Clarence Thomas’ confirmation hearings, support for military aggression in the Middle East.


Next to these legitimate criticisms, however, anti-Biden voices sink to less honest arguments against him. The least honest is the claim that he is mentally unfit. Rall says, “He is clearly suffering from dementia” and is “senile”, citing as evidence only a poll that shows that many Republicans think he is not fully there. Jeremy Scahill says, “Biden’s cognitive health and mental acuity is, to say the least, questionable”. The senility argument is a Trump talking point, and is just as dishonest when employed by leftists.


As someone who has talked in front of audiences all my life, I can confidently say that Biden shows no signs of dementia. His critics ignore how difficult it is to talk publicly, especially in front of cameras, even for those who have done it a thousand times. I constantly hear college graduates, even college professors, fumble for words, interrupt their sentences, insert “like” and “you know” everywhere, and make those flubs for which Biden is criticized, often using videos from another century.


Somewhat less dishonest, but just as misleading, is the dredging up of every past Biden statement that puts him squarely in the moderate Democratic camp as proof about his policy ideas today. Biden’s centrism has moved leftwards during his career, just as the Democratic electorate has shifted. He is no Bernie Sanders and has not endorsed Medicare for All. But he openly advocates a version of the Green New Deal, a much more radical environmental policy than that of any presidential candidate before this year. He has argued against defunding the police, a purely negative idea which ought not be a progressive litmus test until it has been much more thoroughly discussed. But his current approach to the twin scourges of sexism and racism is far from his previous stands and squarely in the middle of progressive politics.


Anti-Biden leftists ignore the policies that Biden and the Democratic Party are promoting now. Waleed Shahid, of the leftist Justice Democrats, said that Biden’s proposals represent “the most progressive platform of any Democratic nominee in the modern history of the party”.


I believe that Progressives, especially now in the face of Republican anti-democratic politics, should always emphasize the necessity of listening to the voters. But a central part of the anti-Biden clamor is the delegitimization of the will of Democratic voters.


Krystal Ball, former MSNBC host, already in March told millions of viewers of “The Young Turks”, “if they always can say, 'Look, you've got to vote for us no matter what, you've got no other choice,' then they're always going to treat us like this.” Victoria Freire argues this way: “Start by asking why the DNC would choose such a weak candidate for Democrats to consolidate behind. The answer? Corporatist democratic leaders would rather have a fascist in the White House over a democratic socialist.”


A different form of condescension comes from David Swanson, who asserts that those who would pick Biden over Trump are “lesser-evil voters” who become evil-doers themselves: “People, with very few exceptions it seems, cannot do lesser-evil voting on a single day without having it take over their identity and influence their behavior.” He cites his own made-up facts: “the nearly universal practice of those who advocate less-evil voting of becoming cheerleaders for evil for periods of four years”.


A conspiratorial view of American politics is not limited to the right. Many disgruntled Bernie supporters in 2016 attributed his loss to the secret machinations of some Democratic elite. In this telling, Democratic voters were duped then and are being duped now by people nearly as bad as, or maybe worse than, the far right.


American political campaigns are certainly tarnished by deliberate deception, and Trump’s campaign thus far brings the worst form of public lying to the presidential campaign. Voter manipulation is a feature of American politics. But the assertion that a corporate Democratic cabal, a wealthy corporate war-mongering racist and sexist elite, has successfully manipulated Democratic voters to vote for “their” safe candidate is insulting to us voters. That much is obvious.


Less obvious are its racial assumptions. The “Black vote”, the convenient political label for how millions of Black Americans make their political choices, was a central media talking point during the primaries. The collective choices of those voters gave moderate Joe the victory over more progressive Bernie. Were they all duped? Did they throw away their votes out of ignorance or malice?


The political conspiracy theories of the right assume that Democratic voters actively support evil. The conspiracy theories of the “Never Biden” element of the left assume that we are just dumb.


I was frustrated by Bernie’s defeat in 2016 and 2020, and wished that certain Democratic politicians and media personalities had not nudged those elections toward the center. But there is no evidence that the nudging created Hillary’s victory over Bernie by 12% or Joe’s victory this year by 22%.


To assume that Black voters, or Democratic voters in general, have made poor choices, that they don’t understand what they should want and how to get there in today’s political climate, is not progressive. That kind of thinking has led left movements towards dictatorship. Letting Trump win by convincing Americans on the left to vote against Biden will be good for nobody, especially for anyone who supports positions further to the left.


Steve Hochstadt

Springbrook WI

July 28, 2020

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154381 https://historynewsnetwork.org/blog/154381 0
Prepare for Massive Turnover on the Supreme Court in the Next Four Years

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.



The Supreme Court of the United States has tremendous power and impact on all Americans. The future membership of the Court will likely be determined in the next term, and it could be a massive change.

The three youngest Justices, Elena Kagan (appointed by Barack Obama in 2010), Neil Gorsuch (appointed by Donald Trump in 2017), and Brett Kavanaugh (appointed by Trump in 2018), are 60, 53, and 55, respectively; they seem in good health and are likely to be on the Court for a long time.

Much attention is, of course, paid to the oldest member, Ruth Bader Ginsburg (age 87), who has served on the Court for 27 years since being appointed by Bill Clinton, and has had five bouts with cancer (to date recovering from all and continuing to be able to work).  Democrats have prayed for Ginsburg to stay healthy enough to remain on the Court in the hope that Joe Biden becomes President in 2021.  It is imagined that she will retire next year if Biden is President, but stay on, if she is able to, if Trump is reelected.

But then, there is also Stephen Breyer (age 82), appointed by Bill Clinton, who has been on the Court for 26 years. While he is in good health, it seems likely that he will leave in the next presidential term.  If both Ginsburg and Breyer leave the Court with President Biden in office, it would preserve a 4 Justice liberal bloc that has occasionally drawn an ally from the more conservative side, but if Trump replaces them, then the Court would become much more right wing, with a 7-2 conservative majority.

But this is not the end of the matter, as realistically up to four other Justices might depart by 2024. These would include Clarence Thomas (age 72), appointed by George H. W. Bush and on the Court for 29 years, and Samuel Alito (age 70), appointed by George W. Bush and on the Court for 14 years. There have been rumors that either or both of them might leave the Court now, so that Donald Trump can replace them, but as the summer moves on toward a regular October opening, that seems not to be happening. The point is that if either or both left the Court, Trump could replace them with younger, more ideological conservatives, while if Joe Biden were able to replace them, the Court would move substantially to the left.

But then, we also have Sonia Sotomayor (age 66), on the Court for 11 years after appointment by Barack Obama. It has been publicly reported that she has problems with diabetes, which might, in theory, cause her to resign from the Court in the next term.  Sotomayor has been a Type 1 diabetic since age 7, and  had a paramedic team come to her home in January 2018 to deal with an incident of low blood sugar.  If Trump were able in the next Presidential term to replace her, the conservative majority could be as strong as 8-1 by 2024.

And then, finally, we have Chief Justice John Roberts (age 65), who has led the Court for 15 years since appointment by George W. Bush. Roberts is as much of a “swing vote” as there is among the conservative Justices, surprising many with some of his decisions and utterances regarding Donald Trump.  The problem is that Roberts has had health issues involving seizures, in 1993, again in 2007, and most recently in 2020.  In 2007, after two years as Chief Justice, Roberts collapsed while fishing alone on a pier at his summer home in Maine, fortunately not falling into the water and drowning.  In June 2020, he fell and hit his forehead on the sidewalk, receiving sutures and an overnight hospital stay. In this case, a seizure was ruled out as the cause of the fall, but the possibility that Roberts might leave the Court has become a subject of speculation.

So while the future of these six Supreme Court Justices is for the moment just speculation, the odds are good that two or more might leave the Court, and potentially as many as six, which gives either Joe Biden or Donald Trump the ability to transform the ideology of the majority of the Court until mid-century.

So the Presidential Election of 2020 is not just about who might be in the Oval Office, or which party might control the US Senate, but also a potential revision of the Supreme Court’s role in American jurisprudence, and its impact on 330 million Americans.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154384 https://historynewsnetwork.org/blog/154384 0
Life during Wartime 516

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154382 https://historynewsnetwork.org/blog/154382 0
The Roundup Top Ten for July 31, 2020

A Brief History of Dangerous Others

by Richard Kreitner and Rick Perlstein

Wielding the outside agitator trope has always, at bottom, been a way of putting dissidents in their place. The allegation is not even necessarily meant to be believed. It is simply a cover story, intended to shield from responsibility not only the authorities implicated in crimes or abuses of power, but also society as a whole. 


Africa's Medieval Golden Age

by François-Xavier Fauvelle

During the Middle Ages, while Europe fought, traded, explored and evolved, Africa was a continent in darkness, 'without history' – or so the traditional western narrative runs. In fact, as François-Xavier Fauvelle reveals, it was a shining period in which great African cultures flourished.



The Border Patrol’s Brute Power in Portland is the Norm at the Border

by Karl Jacoby

What’s happening in Oregon reflects the long history of unprecedented police powers granted to federal border agents over what has become a far more expansive border zone than most Americans realize. 



Tom Cotton Wants To Save American History. But He Gets It All Wrong.

by Malinda Maynor Lowery

Senator Cotton’s remarks and his proposal to revise history obscure the violence, death and displacement that slavery caused in both Black and Indigenous communities.



Congresswomen Of Color Have Always Fought Back Against Sexism

by Dana Frank

When he called Alexandria Ocasio-Cortez “crazy” and “out of her mind” because he didn’t like her politics, Ted Yoho was harking back to Edgar Berman’s narrative that a political woman who dares to speak up is constitutionally insane.



The Death of Hannah Fizer

by Adam Rothman and Barbara J. Fields

Those seeking genuine democracy must fight like hell to convince white Americans that what is good for black people is also good for them: Reining in murderous police, investing in schools rather than prisons, and providing universal healthcare.



Why "White" Should be Capitalized, Too

by Nell Irvin Painter

Capitalizing "White" makes clear that whiteness is not simply the default American status, but a racial identity that has formed in relation to others. 



How Trump Politicized Schools Reopening, Regardless of Safety

by Diane Ravitch

Amid this uncertainty and anxiety, President Trump has decided that the reopening of schools is essential to his prospects for reelection.



Colonialism Made the Modern World. Let’s Remake It.

by Adom Getachew

What is “decolonization?” What the word means and what it requires have been contested for a century.



On Sex with Demons

by Eleanor Janega

"The idea of having sex with demons or the devil... has a long and proud history. A concern about sleep sex demons traces at least as far back as Mesopotamian myth where we see the hero Gilgamesh’s father recorded on the Sumerian King List as Lilu, a demon who targets sleeping women, in 2400 BC."


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176664
Let Us Now Remove Famous Men

The Ku Klux Klan protests the Charlottesville City Council's decision to remove a monument to Robert E. Lee, 2017.



I passed the statue of Robert E. Lee in Charlottesville, Virginia, literally hundreds of times, often admiring the handsome appearance of a general who was proud in defeat, leaving the battlefield with honor intact. The bronze Lee sits ramrod straight in the saddle of his warhorse Traveller, hat clutched in his right hand, atop a sturdy gray stone pillar. In all seasons, whether sprinkled with snow or glowing in a fall sunset, it seldom occurred to me—whose ancestors wore blue and gray—that this statue was a symbol of white supremacy. That’s not because the message was hidden. It was because I was unaware of my own white privilege, which permitted me to view it in terms other than as a potent symbol of white over Black.


But white supremacy is exactly the message that the Lee statue embodied. It was the reason it was built in the 1920s. It wasn’t for the general himself, who led tens of thousands of armed rebels against U.S. forces, wounding and killing American soldiers and re-enslaving Black refugees from bondage. Lee had died over five decades earlier, and he didn’t need another statue. By the early twentieth century, Confederate statuary was a growth industry, with Lee at the center. Charlottesville city boosters commissioned it among several such memorials, and it was unveiled in 1924 to the applause of the Sons of Confederate Veterans and other adherents of the Lost Cause. The president of the University of Virginia dedicated it in the presence of members of several Confederate organizations.


Few Black Virginians voted that year, either for Calvin Coolidge or his segregationist Democratic opponent, who won Virginia’s electoral votes. The commonwealth made sure of it, passing the 1924 Racial Integrity Act to harden the state’s color line. The University of Virginia would not admit Black students for another quarter century. State schools, hospitals, and cemeteries were segregated. African American southerners were fleeing to cities like Newark and Philadelphia, where at least there was a hope of upward mobility. But Lee’s likeness gave the violence of Jim Crow a veneer of respectability and a nostalgic atmosphere.


The Lee statue was a quiet sequel to an adjacent statue of Thomas “Stonewall” Jackson, dedicated in 1921 in Charlottesville to the applause of 5,000 pro-Confederate supporters, many uniformed, in sight of a massive Southern Cross, the Confederate battle flag. The Jackson monument was also a bronze equestrian statue depicting the general steeling himself for battle. He was killed in 1863 and never saw the defeat of the Confederacy for which he gave the last full measure of his devotion.


Had the Confederacy won, 4 million Black Americans would have remained enslaved.


Adding injury to insult, the memorial to Jackson and the Lost Cause was built on the grave of a Black neighborhood, McKee Row, which the city seized through eminent domain and demolished, making room for a symbol of white supremacism that was unambiguous to Black residents. The Lee statue stood near Vinegar Hill, a historically Black neighborhood, which the city demolished as part of urban renewal. Vinegar Hill fell so that white city residents could enjoy a downtown mall, pushing its Black residents to the margins, while relying on them to clean the buildings, tend the children, and cook and serve the food that made the living easier for many white residents.


The statues kept that racial order front and center, and it is worth remembering that statues are not history. They are historical interpretations reflecting the values, assumptions, and interpretations of their times. History books of the 1920s generally argued that slavery benefited Black people and that Reconstruction was a Yankee plot to punish the white South, fastening African American rule on a prostrate people who were gracious in defeat. That narrative was a rallying cry against Black equality.


The Lee statue attracted neo-Confederates, neo-Nazis, white nationalist militias, and other hate groups that converged to defend white supremacy in August 2017. It was in the protests over that same statue that Heather Heyer lost her life on the city’s downtown mall.


Should such statues come down? 


New Orleans mayor Mitch Landrieu perhaps said it best in a 2017 speech arguing for the removal of that city’s Confederate monuments. Landrieu recounted the road he’d traveled to the decision and his talk with an African American father, through whose young daughter’s eyes he framed the argument: “Can you look into the eyes of this young girl and convince her that Robert E. Lee is there to encourage her? Do you think that she feels inspired and hopeful by that story? Do these monuments help her see her future with limitless potential?”

Landrieu didn’t have to fill in the blanks. “When you look into this child’s eyes is the moment when the searing truth comes into focus,” he concluded. The statues came down.


The wave of iconoclasm in the United States in 2020 seems to look past the nuance of each statue, viewing any stone or bronze figure with a history of racism as a fair target. Statues of Christopher Columbus, Junípero Serra, and Juan de Oñate came down based on a history of enslaving, torturing, killing, and expropriating land and sacred spaces of Indigenous Americans. Philip Schuyler—who the heck was Philip Schuyler (Alexander Hamilton’s father-in-law and one of New York’s biggest enslavers)? Civil disobedience is jarring. Disorderly property destruction can be downright frightening.


After a mob hauled down a bronze equestrian statue of King George III in New York City in July 1776, General George Washington wrote that those who cut off the king’s head and melted the metal for bullets acted with “Zeal in the public cause; yet it has so much the appearance of riot and want of order, in the Army, that [Washington] disapproves the manner, and directs that in future these things shall be avoided by the Soldiery, and left to be executed by proper authority.”


Washington condemned it as the wrong execution of the right idea. And in an irony of history, George Washington himself has become, like King George, a target for protest and removal. The Washington who led American forces to victory against the British in defense of “all men are created equal” was also the owner of over 100 enslaved people of African descent, the same leader who signed the 1793 Fugitive Slave Act authorizing deputized agents to cross state lines and kidnap Black people who had no right to defend themselves in court. The same Washington pursued his own fugitive bondswoman, Ona Judge, who spent decades evading the Washingtons’ property claims to her body.


But it’s worth noting that over 90 percent of the recent removals were directed by mayors, governors, and other elected officials and assemblies responding to citizens’ calls, and the statues were often replaced with other memorials.


And the tone of the protests or the way in which some statues are defaced brings out another aspect of white privilege. That is, some initially sympathetic observers are uncomfortable with who gets to oversee and control the process. This is not to excuse wanton destruction. The Atlantic’s Adam Serwer tweeted that those who sought to bring down a statue of Ulysses S. Grant “probably just want to break things.” There is that, but where on the social balance sheet do we register the insult of a century and a half of white supremacy set up in America’s town squares and green spaces?


The protests over the murders of Ahmaud Arbery, George Floyd, Breonna Taylor, and so many others are infused with a damning critique of structural racism—which is not so much a set of persistent attitudes as institutional practices. A century and a half after Lee’s defeat on the battlefield, the typical Black family owns one-tenth the wealth of the typical white family, and that wealth gap is widening into a chasm. Black workers’ earnings are diminishing relative to whites’ fifty years after the civil rights movement, and taken as a whole, African Americans face the same income gap versus whites as in 1950, when Harry S. Truman was president, before Brown v. Board of Education, and before the University of Virginia admitted a single Black student. Was that incidental, or an intentional part of Robert E. Lee’s legacy, if not Washington’s? The Covid-19 crisis has put Black and Latinx workers on the front lines as essential workers, delivering healthcare and meals, yet most are underpaid, have little job security, and risk bringing the virus home to children and seniors. Black people account for nearly a third of coronavirus cases and 40 percent of the deaths. How is that sacrifice to be memorialized?


Should the statues remain up, doing the quiet work of reinforcing white supremacy while we get to work dismantling the interlocking components of structural racism? Or are the statues part of a 400-year history of violence against African-descended people that needs urgent attention and rectification? In what direction do the statues and monuments point us?

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176563
What's in a Name?: Decolonizing Sports Mascots

Protesters against Washington NFL Team name, Minneapolis, 2014. Photo Fibonacci Blue, CC BY 2.0




Growing up in Swarthmore (PA) during the mid-1970s, I played on my high school football team, the Swarthmore Tomahawks. The image of a tomahawk on our helmets was supposed to inspire fear in our opponents, I guess, though as a 150-lb. Quaker kid nicknamed “stone hands” I doubt I did. We were one of thousands of schools and colleges to use Native American names and mascots in the 1970s; the list included “Indians,” “Warriors,” “Braves,” “Redmen,” “Fighting Sioux,” “Squaws,” and “Savages,” almost all of them denoting violence, almost all of them justified in the name of “honoring” Native people. All these teams expropriated Native symbols for entertainment purposes, effectively covering up a violent history of settler colonialism; the use of Tomahawks was pretty bad, especially in Pennsylvania, where 20 Christian Indians were massacred by the so-called Paxton Boys in 1763.


As a teenager I didn’t think about the issue of mascots, in part because I never learned about Native people in high school, except perhaps for a section on Thanksgiving and a shout-out to Pocahontas and Sitting Bull; Native Americans never made it to the 20th century in my high school books, which is still the case in many textbooks and classrooms today. I came to understand how problematic the use of tomahawks and other Native symbols and names in American sports was while writing a chapter on Native American mascots for a 2003 book called Native American Issues. By then, many high schools, colleges, and universities had dropped their Native names and mascots in response to protests by Native groups, but hundreds remained, including at the professional level. In light of the Black Lives Matter movement and a national reckoning with systemic racism, some of these teams are rethinking their use of Native symbols. It’s 2020. It’s about time.


The Redskins name is finally being axed, in favor of Red Tails or Red Wolves, largely because of pressure applied by the NFL and major team sponsors such as FedEx, Nike, and PepsiCo. Although I applaud these companies for speaking out, let’s not forget that for years FedEx shipped Nike’s Redskins jerseys and shirts without batting an eye, enabling the team to sustain its use of a name offensive to Native people, who had campaigned against the name since the 1970s, and to any American who objects to racist stereotypes. The use of “Redskins” depoliticized Native people, dehumanized them into a stereotype, and, according to numerous studies, debilitated their self-image. Washington, D.C. is the center of American political power, the place where diplomats from around the globe come to conduct business and where federal legislation, court decisions, and presidential orders are generated. What was the impact of the Redskins name, imagery, and associated fan performances, for decades, on politicians’, judges’, and other officials’ perception of Native people and their place in American society as they debated public policy and legal cases?


The Atlanta Braves baseball team in particular demonstrates the dishonest and damaging use of Indian imagery to cover up Native people’s traumatic history. The team has announced that it will retain the name “Braves” but will review the use of the “tomahawk chop.” In employing the tomahawk on uniforms and merchandise and allowing fans to perform their ritualistic “tomahawk chop,” the team, and Major League Baseball, perpetuate a lie about one of the most sordid periods of American history: the forced march in the late 1830s of the Cherokee from their homeland in Georgia along what they called The Trail Where They Cried to Indian Territory in present-day Oklahoma, which led to roughly 4,000 Cherokee dying in federal stockades and on the trail. It wasn’t the Cherokee who wielded tomahawks. Rather, they tried to wield legal arguments tied to treaty rights to retain their sovereignty, arguments accepted by the U.S. Supreme Court in 1832 but ignored by President Andrew Jackson and a host of local and state officials eyeing the Cherokee’s gold resources and rich cotton-growing lands.


Changing the names, mascots, and symbols at all levels of sports is a starting point. But decolonizing sports history requires a deeper analysis of how false historical narratives that ‘blamed the victim’ became embedded in the public venues of everyday life, shaping generations of Americans’ perceptions of Native people. Native people, meanwhile, have served in their country’s armed forces for over a century in the hope that it will honor the promises written in hundreds of treaties, which represent the legal and moral legacy of American colonialism.


Editor's Note: As this article was prepared for publishing, it was reported that owner Dan Snyder will name his team "The Washington Football Team" for the 2020 NFL season. Whether this is a retaliatory measure against the sports merchandise companies that pressured Snyder to change names, forcing them to try to sell a boring product, is anyone's guess.  

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176569
Lincoln, Cass, and Daniel Chester French: Homely Politicians Divided by Politics, United through Art

Lewis Cass by Daniel Chester French, United States Capitol Statuary Hall. 



At first glance, the immortal Abraham Lincoln and the largely forgotten Lewis Cass had almost nothing in common save for the fact that they were probably the two homeliest men in 19th-century American politics. Not that their forbidding looks discouraged artists from portraying them, or dissuaded admirers from commissioning visual tributes. Now, the Illinoisan and the Michigander are each at the center of separate efforts in Washington to remove iconic statuary dedicated in their honor.

As to their obvious differences: Lincoln was a Southern-born, anti-slavery Whig-turned-Republican; Cass, a New England-born, pro-slavery Democrat. Lincoln was self-educated; Cass studied at the tony Phillips Exeter Academy (where Lincoln later sent his eldest son). Cass commanded regiments during the War of 1812; Lincoln’s sole military experience came as a militia volunteer in the Black Hawk War in Illinois, where, he admitted, he battled only “mesquitoes” [sic]. Cass went on to serve as territorial governor of his adopted Michigan, and later as one of its U.S. Senators. Lincoln turned down his one and only chance to serve as a territorial governor—in remote Oregon—and later lost two Senate bids in Illinois. Cass served as an American diplomat in France; Lincoln never set foot overseas. And Cass helped introduce “Popular Sovereignty,” the controversial doctrine giving white settlers the power to welcome or ban slavery from America’s western territories; Lincoln not only opposed the scheme, its 1854 passage “aroused” him back into politics after a five-year hiatus. Of course, Cass failed in his only run for the presidency; Lincoln won twice.

No two Northerners could have been more different—politically. But there was no disguising the fact that neither man was easy on the eyes. That burden they shared.

Seeing Lincoln’s likeness for the first time, the Charleston Mercury branded him “a lank-sided Yankee of the unloveliest and of the dirtiest complexion…a horrid looking wretch.”  Encountering him in the flesh, a British journalist recalled that Lincoln’s skin was so “indented” it looked as if it had been “scarred by vitriol.” Lincoln had no choice but to make light of his appearance. Accused in 1858 of being two-faced, he famously replied: “If I had another face, do you think I would wear this one?” Once he claimed that a hideous-looking woman had aimed a shotgun at him, declaring: “I vowed that if I ever beheld a person uglier than myself I would shoot him.” Lincoln claimed that he replied, “Madam, if I am uglier than you, fire away!” 

As for Cass, who came of age politically a generation earlier, his initial biographers politely evaded the subject of his personal appearance—though he possessed a bloodhound face dotted with warts and moles, framed by a meandering reddish-brown wig atop a once-husky frame tending to fat. A tactful 1891 writer conceded only that “where a man of less significant appearance would escape attention…the physical poise and stateliness of Cass would arrest the attention of the heedless.” One mid-century newspaper proved blunter, suggesting of Cass that “it is hard to tell whether he swallowed his meals or his meals him.” 

In their time, not surprisingly, both Cass and Lincoln emerged as irresistible targets for caricature. But it is their other, little-remembered artistic connection that comes as a shock. In a significant irony of American art, Lincoln and Cass were each portrayed heroically in marble by the same great sculptor: Daniel Chester French. The colossal, enthroned 1922 statue in the Lincoln Memorial is not only French’s masterpiece but arguably the most iconic statue in America. Reproduced on coin and currency and featured in movies like Mr. Smith Goes to Washington, it is perhaps most cherished as the backdrop for history-altering events like the “I Have a Dream” speech that culminated the 1963 March on Washington.

French’s full-figure Cass likeness, carved 30 years earlier for Statuary Hall, has never competed for similar attention or affection, though it did win praise when first unveiled in 1889. In the last few days, however, some Michigan lawmakers have spearheaded a drive to remove and replace it. “His history is not reflective of our values here,” declared State Senate Democratic leader Jim Ananich on July 10. “I hope that people look at this as a real opportunity to recognize some important people.” (Michigan’s other contribution to Statuary Hall is a figure of Gerald R. Ford.) The Cass controversy has erupted amidst a clamor to reappraise Lost Cause monuments throughout the South, and just a few weeks after protestors demanded the de-installation of the long-standing Freedman’s Memorial near Capitol Hill. That 1876 Thomas Ball sculptural group shows the 16th president symbolically lifting a half-naked, enslaved African American from his knees. A copy has already been scheduled for removal in Boston.

In a sense, the Michigan initiative to change its Statuary Hall complement is anything but unique. A few years ago, California swapped out a statue of its long-forgotten statehood advocate Thomas Starr King for one depicting Ronald Reagan. In the wake of the new scrutiny of Lost Cause monuments, Florida recently announced it would replace its longstanding statue of Confederate General Edmund Kirby Smith with one of the African-American educator and civil rights activist Mary McLeod Bethune. The fact that Michigan, too, is beginning to reckon with its conflicted past—Cass (1782-1866) was a slaveholder who also supported Native American removal—demonstrates how thoroughly we have begun conducting a nationwide reappraisal of American memory and memorials.

There is yet one more irony attached to the proposed departure of the Lewis Cass statue: the early history of its distinctive location. Statuary Hall, a favorite of modern visitors to the U.S. Capitol, originally served as the hall of the House of Representatives. In this very chamber, Abraham Lincoln served from 1847 to 1849 in his one and only term as a Congressman from Illinois.

Although freshman Lincoln only occasionally addressed full sessions of the House there, he did rise on July 27, 1848, a few weeks after the Democratic Party nominated the same Lewis Cass to run for president. Whig Lincoln stood firmly behind Zachary Taylor that year. Now he decided it was time not only to offer support for the old general but also to deliver a scathing rebuke of Cass.

Lincoln had been known earlier in his career for his caustic wit (one of his sarcastic newspaper articles had once provoked its intended victim to challenge him to a duel). But he had learned over the years to temper the attack-dog side of his oratorical skill set. Cass managed to re-arouse Lincoln’s dormant instincts for venom. Apparently the fact that the Democratic candidate had been re-branded by supporters as a military hero—like Taylor—inspired the 39-year-old Congressman back into the scathing style of his political apprenticeship.

Hastening to point out that he claimed no military glory of his own, Lincoln quickly went after Cass’s claims otherwise. Yes, Lincoln said in his stem-winder, Cass had indeed served decades earlier in the War of 1812, but his principal contribution had been enriching himself. “Gen: Cass is a General of splendidly successful charges,” Lincoln taunted, “—charges to be sure, not upon the public enemy, but upon the public Treasury.”

“I have introduced Gen: Cass’ accounts here chiefly to show the wonderful physical capacities of the man,” Lincoln ridiculed the stout candidate.  “They show that he not only did the labor of several men at the same time; but that he often did it at several places, many of hundreds of miles apart, at the same time… . By all means, make him President, gentlemen. He will feed you bounteously—if—if there is any left after he shall have helped himself.”  

Lincoln was just getting started, and one can only imagine his colleagues rolling in the aisles as his onslaught gained steam. No, Lincoln charged, Cass never manfully broke his sword rather than surrender it to the British, as his campaign now boasted. “Perhaps it would be a fair historical compromise to say, if he did not break it, he didn’t do any thing else with it.”  

As for Cass’s reported military triumphs north of the border, Lincoln declared tartly: “He invaded Canada without resistance, and he outvaded it without pursuit.” Attaching a military record to Cass, he concluded with a vivid frontier allusion, was “like so many mischievous boys tying a dog to a bladder of beans.”

Emboldened by the reception his assault received on the House floor, Lincoln took his campaign on the road, stumping for Taylor in Pennsylvania, Maryland, Delaware, and Massachusetts. In the end, Taylor won the presidential election handily, but Lincoln apparently lacked Cass’s aptitude for exacting political payback. Ambitious for appointment to the federal Land Office (since his party did not re-nominate him for the House), he ended up disappointed. Offered the Oregon posting as a consolation, he turned it down (in part because his wife refused to move west), returned to his law practice, and for a time disappeared into political hibernation. Only when Stephen A. Douglas re-introduced Popular Sovereignty did Lincoln stage a comeback.  

Winning the presidency sixteen years later, Lincoln arrived back in Washington in February 1861, and almost immediately headed to the White House to pay a courtesy call on outgoing president James Buchanan. There he met with Buchanan’s Cabinet. Alas, history did not allow for a face-to-face meeting between the two ugliest men in politics. The elderly Cass had served nearly four years as Buchanan’s Secretary of State, but had resigned back in December. In the ultimate irony, Cass had quit in protest over Buchanan’s reluctance to resupply federal forts in the South at the onset of secession—showing the kind of pro-Union resolve that might have prevented the crisis that Buchanan left Lincoln to handle. Say this much for Cass: he remained loyal to the Union.

That final burst of patriotism may not be enough to save his Statuary Hall effigy. Daniel Chester French’s almost hideously lifelike Cass statue may depart as soon as a replacement can be identified and sculpted. But the Cass masterpiece should definitely be preserved elsewhere—anywhere—for the simple reason that works by the greatest American monumental sculptor deserve to be appreciated on their own terms: as art. At its dedication, critics lauded the French marble as both “an excellent model of the great statesman” and “a statue full of character and expression as well.” 

No one has yet mentioned it, but the uncompromisingly realistic Cass statue also testifies usefully to an era in which matinee-idol good looks were not required for political success. In the age before the glare of television and instantaneous photography were relentlessly aimed at our leaders, politicians could succeed even if they looked like Lewis Cass. Or Abraham Lincoln.

There is no evidence as to whether Lewis Cass ever believed his repellent appearance held him back from the White House, or disqualified him from romanticized portraiture. As for Lincoln, regardless of whether or not his statue survives in Lincoln Park in Washington, at least he maintained a sense of humor about his portrayals, while fully understanding how they burnished his image.  And that is why he posed for a succession of sculptors long before Daniel Chester French ever undertook to create the Lincoln Memorial. Shown the very first sculpture made of him from life, he was heard to observe with self-deprecating frankness: “There is the animal himself!”

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176570
One of the Chicago 7 Reflects on Dissident Politics Then and Now



The defiantly unmasked, gun-toting crowds demanding an end to social distancing from state legislatures and governors in the early spring were loud and obnoxious. A little later, the chants one of my daughters joined in shouting out in New York City against the police murder of George Floyd were loud and necessary. The eruptive, massive, and extraordinarily widespread demonstrations in cities and towns all across the country against systemic racism were louder still. And it’s possible that the convergence this year of the pandemic with a death toll approaching 150,000, the related economic breakdown, and the raging anger and social unrest on the streets might change America. Because it’s happened before, and not so long ago.

The years 1968, 1969, and 1970 were especially loud and did change America. In the midst of a horrendous war that killed and maimed Americans and Vietnamese seemingly without end, people were shouting and marching and sometimes fighting the police in the streets; there were burnings and looting, and music and long hair and all sorts of drugs seemed to be everywhere and threaten everything. People of color, uppity women, gays, lesbians, university students, the poor: all sorts of people said and did and demanded all sorts of things. And the government was trying its best to stop it all, to demonize and punish people - political and cultural dissidents - who claimed they only wanted to change the country into something better. 

Those times accelerated a splitting of the country into different tribes. Perhaps the cultural split of the sixties was not as severe as the one between early white colonists and Native Americans that ultimately led to genocidal slaughter, or the divisions between the North and South that culminated in over 600,000 Civil War dead. But it was bad enough, and many of the same issues from the sixties feed today’s polarization – race, sexual identity and gender roles, economic inequality, war, a longing for a different and better country. If you want a quick and simple reminder of the sixties’ continuing impact, you can just look at the bumper stickers and emblems on people’s cars and realize you can easily figure out which side of that long-ago political and cultural split they are on, and what tribe they belong to now.

Another, if less direct, way of noticing how the sixties are still helping shape today’s political struggles is to note the birth years of the president, members of the Supreme Court, and the senior leadership of both parties in the House and Senate. More than 40 percent of them were born between 1939 and 1955, which would have put them between the ages of fifteen and thirty in those especially loud years of 1968, 1969, and 1970. And I think that would have made them particularly vulnerable to the cascade of exceptional images and events and noise of those years. Things like that are hard for people to forget, and I don’t think it’s unreasonable to suspect that their experiences then have had some impact on the tone and direction of the political decisions they are making now. Everyone used to be certain the perspectives of the generations that experienced the Depression were shaped by those experiences. Why would 1968, 1969, and 1970 be any different? 

Today, once again many people on the left side of a polarized America feel compelled to become more directly involved in political work in opposition to the dominant economic and cultural arrangements of our country. Some of the initial prompts—just like in the sixties—come from being assaulted by vivid media imagery and stories. Then it was newspaper and TV coverage of the black students sitting at segregated lunch counters acting bravely against racism and injustice, or blacks and whites together riding buses into dangerous southern cities. Now there are digital visions of neo-Nazis marching on the streets, chanting against immigrants, blacks, and Jews, video of police violence and murder, and pictures of crying, angry parents mourning the death of their child killed by a police officer for no understandable reason. 

Often far more quickly than I and many of my friends learned so long ago, people have understood that engaging in politics is what is required to stop some of the ugliness and suffering. In part, that’s because seeing that “politics” directly impacts people’s lives is easier now. Most media coverage of political news in the fifties and early sixties was almost benign, and certainly never cruel or savage. But today’s often ferocious Republican and Democratic tribal splitting can make it impossible not to see “politics” as an all too visible, divisive force in America. And because of all that noise, and because the images people see of hurt and pain and cruelty offend their sense of what America should be, many people feel compelled to choose a side and act.

The “Resistance” in today’s political tribal divisions most directly and broadly reflects the values, hopes, and commitments that helped define the dissident politics of the sixties: a belief in the essential worth of other people, and that personally engaging in political activity is required to meet your obligations to others and to make America a fairer, more just, and better place. The newer, broad dissident movement even echoes some of the internal conflicts and overlapping strategies of those older times, with arguments about the legitimacy, priority, and importance of local issue organizing, electoral politics, peaceful mass protest, and righteous violence.

When the viral plague finally begins to truly fade away, restaurants and city street life as they used to be will take a while to come back, if they ever do. But the brutal tribal politics of “us” against “them” looks almost certain to continue. And the renewed assault by leftist political dissidents on a long-established dominant racist and sexist culture and inequitable wealth distribution will need to be strengthened. I believe and hope that will happen. But for it to work, there will need to be a better balancing of the particularistic aspirations, hopes, and goals of all the different groups of dissident activists, far better than we ever achieved in the sixties.  

In my old world, the unbridged differences between otherwise allies – people of different colors, cultures, genders, classes, sexual identities, and competing claims on priorities for action, all swirling underneath crushing waves of unrelenting government repression, first destroyed individuals among us, and finally, us. Far better than we ever did, people now are, as they must, working together, acting and moving forward together, while constructing ways to live daily lives and still be fully engaged politically. Otherwise the conflicts between tribes we see now could strengthen into something even more horrible and threatening to our country’s core values of tolerance, freedom and democracy.   

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176496
Mankato’s Hanging Monument Excluded Indigenous Perspectives when it was Erected and when it was Removed



In the wake of the murder of George Floyd and the protests in Minneapolis, Minnesota, members of the American Indian Movement (AIM) toppled a controversial monument to Christopher Columbus in St. Paul, Minnesota, signaling a new way to engage with inaccurate representations of the American past. While many believe that monuments serve as a form of public history and that their removal "erases history," the removal of statues demonstrates how communities of color and their allies no longer tolerate narratives of American exceptionalism that suppress questions of racism, slavery, and conquest.


In fact, Minnesota's history reflects similar debates over the state's founding and the methods by which white settlement grew to dominate the state's landscape. The U.S.-Dakota War, one of Minnesota's most defining events, connects debates between the memory of the Civil War and colonialism. As public histories of these two divergent events often suppress racism or enact the erasure of Indigenous peoples, they represent the power cemented in monuments: not only to celebrate a figure or event, but also to dominate the historical narrative future generations may learn from.


By 1912, many white citizens of Mankato, Minnesota were eager to commemorate the largest mass execution in U.S. history. Fifty years earlier, thirty-eight Dakota men were hanged from a large scaffold in downtown Mankato for their participation in the short-lived U.S.-Dakota War. President Abraham Lincoln ultimately lowered the initial number of the condemned from 303 to thirty-eight, both to appease white settlers demanding justice and to avoid sparking a continuation of hostilities between Indigenous communities and white settlers in Minnesota and throughout the Northern Great Plains. The “Hanging Monument” represented an “end” to the Dakota War through the recognition of federal power in controlling not only Indigenous peoples but the histories inclusive of their culture and experiences in resisting America’s settler colonial ambitions. 


The Hanging Monument, an 8,500-pound memorial made of local granite, commemorated the mass execution near the original site. The austere face of the memorial read: "Here Were Hanged 38 Sioux Indians: Dec. 26th, 1862." Judge Lorin Cray, one of the monument's supporters and financial benefactors, insisted to one newspaper that the memorial told the story of a historical event in Mankato and did not seek to demean the Indigenous peoples whom it supposedly represented. The celebration of white law that ended the bloody U.S.-Dakota War, as well as the lack of context given on the monument’s face, enacted a continuous silencing and erasure of Dakota people from that history. The monument thus reassured the white citizens of Mankato that they were safe again, while at the same time promoting the reach of local and federal power and law. For many years, the Hanging Monument stood as an object for outside visitors to see, a celebration that their town had served as the platform to end the Dakota War. The city would forever be known as the place that hosted the hanging of the Dakota 38.


Between the 1950s and 1970s, many activists demanded the removal of the monument. In some cases, the memorial had red paint poured over its face or was doused with gasoline with the hopes that it would burn. Yet, despite the City of Mankato moving the monument away from the hanging site and further out of the public's eye, the Hanging Monument remained deeply ingrained as part of the town. By 1971, Mankato State College (now Minnesota State University, Mankato) hosted an annual Social Studies Conference. That year AIM leaders spoke to a full audience of educators, activists, and the Bureau of Indian Affairs about the plight of the American Indian. One, Edward Benton, acknowledged that the Hanging Monument continued to cause emotional pain to not only the Dakota community, but all Native Americans. He demanded changing the inscription to "Here Were Hung 38 Freedom Fighters," or he could not promise it would still be standing after the conference concluded. Months after the meeting, the monument was removed and hidden from the public's eye to rid the town of its inconvenient past. Yet, the removal was not only because of the activism sparked by AIM and their allies.


As the American Revolution Bicentennial celebration commenced, many white Mankatoans pushed the city to be recognized by the American Revolution Bicentennial Commission as an "all American town." They needed the dark shadow of the mass hanging to disappear forever, hoping to replace it with other celebrated figures or episodes from Mankato's past. One Mankato citizen urged the city to recognize Maud Hart Lovelace and forget the negativity that had long brewed over the mass hanging of the Dakota 38.


The Bicentennial, then, represented, in an Indigenous inflection, what historian Edmund Morgan has called the “American Paradox”: a celebration of America's founding while at the same time ignoring the horrors that many enslaved peoples faced under bondage. In Mankato, that paradox celebrated America's great founding and the campaign of manifest destiny that both displaced Dakotas and all Indigenous peoples and erased them from that historical narrative. When many Mankatoans urged hiding the monument, this represented a second erasure of the historical narrative to conceal the town's participation in larger settler-colonial ambitions. After removing and replacing Indigenous peoples with a settler society, concealing a monument to that removal denied an entry point to the public debate for those communities’ narratives, stories, and experiences.


The Hanging Monument disappeared in the 1990s when city officials moved it from a city parking lot to an undisclosed location. Publicly, no one knows where the monument presently resides. Its story, though, offers a clear avenue to better understand present contestation over historical monuments. This monument connects the Civil War era with America's broader ambitions in colonizing Indigenous lands. And, both while it stood and when its presence became inconvenient, the Hanging Monument shows how memorials control historical narratives and elevate particular interpretations of the past.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176566
What the Faithless Electors Decision Says about SCOTUS and Originalism  



The outcome of Chiafalo v. Washington (a unanimous decision that states may compel presidential electors to cast votes for the candidate to whom they are pledged - ed.) was a foregone conclusion. In this troubling time, SCOTUS was not about to upend our system of selecting a president. To achieve the desired result, however, the justices were forced to turn the clear intent of the Framers on its head. This does not mean they made the wrong call; it simply shows us, point blank, that originalism is no more than a pragmatic tool, to be used or ignored at will. 


The opinion of the Court, delivered by Justice Kagan, states correctly that the system of presidential electors “emerged from an eleventh-hour compromise.” But neither the Court, nor the concurring opinion penned by Justice Thomas, tells us why compromise was required. While Kagan and Thomas quarreled over the Framers’ use of the term “manner,” neither discussed the competing views or warring factions—nor why the Framers, in the end, opted for this convoluted, untested resolution to the heated debate over presidential selection. Any originalist would eagerly explore such promising territory; no justice did, not even the self-proclaimed originalists.


Here’s the backstory:


The Virginia Plan, the Convention’s opening foray, called for “a national Executive ... to be chosen by the national Legislature.” That made sense. Under the Articles of Confederation, all the nation’s business was conducted by committees of Congress or boards it appointed, but that had proved highly inefficient. In the new plan, Congress would select a distinct executive to implement its laws. (Not until mid-summer did the Framers dub this executive “President.”)


A few delegates, however, preferred greater separation between the legislative and executive branches of government; they didn’t want the executive to be totally beholden to Congress. But when James Wilson proposed that the executive be selected by the people, several of his colleagues balked. “It would be as unnatural to refer the choice of a proper chief Magistrate to the people,” George Mason pronounced, “as it would, to refer a trial of colours to a blind man.” Most agreed with Elbridge Gerry, who held that “the evils we experience flow from the excess of democracy,” and with Roger Sherman, who proclaimed, “The people immediately should have as little to do as may be about the government. They want information and are constantly liable to be misled.” Rebuffed, Wilson then suggested that the people choose special electors, and this elite crew would select the “Executive Magistracy.” This option also fared poorly. On June 2, by a vote of eight states to two, the Convention affirmed that the national Executive would be chosen by the national Legislature.


Twice more, popular election was proposed and turned down. In mid-July, Gouverneur Morris convinced the Convention to opt for special electors, but four days later five states that had favored electors reversed their votes; Congress would choose the Executive, as originally suggested. That is how matters stood until the waning days of August. Then, by a devious maneuver, Gouverneur Morris managed to refer the matter to a committee charged with taking up unsettled issues—even though the manner of selection had been discussed several times and settled. There, in committee, the system we now call the “Electoral College” was written into the Constitution. 


The Committee reported out on September 4, less than two weeks before the Convention would adjourn. Morris, a member, presented “the reasons of the Committee and his own.” “Immediate choice by the people” was not acceptable, while “appointment by the Legislature” would lead to “intrigue and faction.” The committee’s ingenious elector system, on the other hand, depoliticized the process. “As the Electors would vote at the same time throughout the U. S. and at so great a distance from each other, the great evil of cabal was avoided,” he explained. Under such conditions, it would be “impossible” for any cabal to “corrupt” the electors. 


Hamilton, in Federalist 68, sold this notion to the public: “Nothing was more to be desired than that every practicable obstacle should be opposed to cabal, intrigue, and corruption. ... The convention have guarded against all danger of this sort with the most provident and judicious attention.” Voting separately and independently, “under circumstances favorable to deliberation,” electors would “enter upon the task free from any sinister bias.” Further, to guard against political interference, the Constitution stated that “no Senator or Representative, or Person holding an Office of Trust or Profit under the United States, shall be appointed an Elector.” This argument addressed the concerns of those who had opposed congressional selection. 


Those who had opposed popular election of the Executive were also pacified. Madison, for example, believed the people should elect their representatives to the lower house of Congress, but selection of senators, judges, and the president should be “refined” by “successive filtrations,” keeping the people at some remove from their government. The elector system did exactly that: people choose their state legislatures, these bodies determine how to choose electors, and the electors choose the president—a most thorough “filtration.” During the debates over ratification, Anti-Federalists complained about this. A New York writer calling himself Cato wrote, “It is a maxim in republics, that the representative of the people should be of their immediate choice; but by the manner in which the president is chosen he arrives to this office at the fourth or fifth hand.” Republicus, from Kentucky, commented wryly, “An extraordinary refinement this, on the plain simple business of election; and of which the grand convention have certainly the honour of being the first inventors.” 


Both arguments presented by the Framers were based on the premise that electors, chosen for their greater wisdom and free to act independently, were best positioned to choose the president. It didn’t work out that way. Political parties quickly gamed the system, leading to the fiascos of 1796 and 1800. The Twelfth Amendment addressed one flaw in the scheme by requiring separate votes for the president and vice president, but it did not change the fundamental structure. To this day, we remain saddled with a system devised to shield selection of the president from politics and the people. 


That was then, and this is now—an unpleasant truth for originalists. A Court composed of faithful originalists would have decided Chiafalo v. Washington unanimously in favor of the rebel electors, who insisted on maintaining their Constitutionally-guaranteed independence. Fortunately, there are no true originalists on the Court. Yet that does raise troubling questions:


Did the Court’s professed originalists consciously ignore the historical context for presidential electors, too embarrassed by the Framers’ distrust of democracy and their inability to foresee the system’s basic flaws? Or did they not understand the textual record, which makes that context clear? Neither alternative is acceptable. Any standard of jurisprudence must be applied evenly and knowledgeably, or it is no standard at all. 


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176562
Two Contagions, One Opportunity to Reboot our Approach

Stearman biplane dropping borate retardant on the 1956 Inaja fire.  After a brief lag, successful air tankers were one of the byproducts of FireStop.

Photo Courtesy U.S. Forest Service




The American West is now experiencing two contagions.  The power of both resides in their capacity to propagate.  One is ancient, the other recent.  For the old one, fire, there are lots of treatments available but no vaccine possible.  For the new one, COVID-19, treatments are still inchoate; while no vaccine yet exists, producing one is possible. 

Unhappily, the two contagions are meeting on the fireline.  Most fires are caught early; only 2-3% escape containment, but these are becoming larger, and that means a massive buildup in response.  The basic pattern crystallized during the Great Fires of 1910 when some 9,500 men worked firelines and most of the standing army in the Pacific Northwest was called out to assist.  What’s curious is that we are dispatching similar numbers of people today.

Why?  The usual expectation is for machinery and knowledge to replace labor.  In wildland fire, however, equipment and know-how just get added to the amalgam, such that today’s fire camps are logistical marvels (or nightmares) and ideal breeding grounds for a pandemic.  This year the established strategy – massing counterforces against big fires – requires rethinking.  

One approach follows from the letter of April 3 by the chief of the U.S. Forest Service to double down on initial attack and prevent big fires from breaking out. It treats fires as an ecological riot that must be quickly suppressed.  The evidence is clear that this strategy fails over time, and even considered as a one-year exception (actually the second one-year exception in a decade), it will fail to contain all fires.  

Another approach sees the crisis as an opportunity to work remote fires with more knowledge, sharper tools, and fewer people.  The managed wildfire has become a treatment of choice in much of the West removed from towns.  The summer’s crisis is a chance to experiment with novel tactics that do not rely on massed fire crews.  Better, seize the moment to propose a major reconsideration in how people and tools interact.  

History holds precedents for resolving many of the big issues confronting landscape fire today.  A problem with powerlines starting lethal fires?  Consider the case of railroads in the 19th and early 20th century that kindled wildfires with abandon, and presently hardly ever do.  Communities burning?  America’s cities and rural settlements burned much like their surrounding countrysides until a century ago; now it takes earthquakes, wars, or riots to kindle most urban conflagrations (what has delayed dealing with contemporary exurbs is that they got misdefined as wildlands with houses, a novel problem, instead of urban enclaves with peculiar landscaping, a familiar one.)  Too many personnel on firelines and in camps, so that managing the people requires as much attention as dealing with the fire?  Consider Operation FireStop.

Camp Pendleton, California, site for methodical experiments in firefighting technology.

Photo courtesy U.S. Forest Service


In 1954 the U.S. Forest Service and California Department of Forestry met at Camp Pendleton to conduct a year-long suite of trials aimed at converting the machines and organized science spurred by World War II and the Korean War into materiel and methods suitable for fire control.  Operation FireStop experimented with aircraft, retardants, radios, trucks and tracked vehicles, now war surplus for which the Forest Service had priority access.  FireStop helped announce a cold war on fire.

Out of FireStop came air tankers, borate retardant, helicopters for laying hose, helijumping, models for adapting vehicles – from jeeps to halftracks – for fire pumps and plows.  The newly available hardware catalyzed equipment development centers.  The organization of fire science helped push toward fire research labs.  Some of these activities might have happened anyway.  But FireStop quickened and focused the trends.

We should reboot FireStop, this time with contemporary purposes and gear, to align the revolution in digital technology and the reformation in policy that crystallized 40 years ago.  This time we don’t need to militarize fire management: we need to modernize it in ways that reduce the need for mass call-outs and logistical carnivals, that allow us to use today’s cornucopia of technology to do fire more nimbly, precisely, and with less cost and environmental damage.  

Call it Operation Reburn.  Schedule it for two years, one fire season for widespread testing, a second for refining the best outcomes.  It isn’t just new hardware that matters: it’s how that technology interacts with people and tactics.  Presently, for example, drones are being used for reconnaissance and burnouts, but a fully integrated system could transform emergency backfires into something like prescribed fires done under urgent conditions.  The burnouts could proceed at night, do less damage, and demand fewer people.  New tools can encourage new ways of imagining the task to which they are applied.

The two contagions make an odd coupling.  Wearing masks to protect against aerosols is akin to hardening houses against ember storms.  Social distancing resembles defensible space.  Herd immunity looks a lot like boosting the proportion of good fires to help contain the bad ones.   

It’s common to liken a plague to a wild fire.  But it’s equally plausible to model fire spread as an epidemic, a contagion of combustion.  Outbursts of megafires resemble emerging diseases because they are typically the outcome of broken biotas – a ruinous interaction between people and nature that unhinges the old checks and balances.  They kindle from the friction of how we inhabit our land.  For now we have to live with COVID-19.  We’ll have to live with fire forever.

An American way of fire came together during the Great Fires of 1910.  Half of America’s fire history since then has sought to remove fire as fully as possible.  The other half has sought to restore the good fires the early era deleted.  Operation FireStop adapted military hardware and a military mindset to support fire’s suppression.  Operation Reburn could enlist more appropriate technology to enhance the policies wildland fire needs today.  Not least, it could scrap the unhelpful war metaphor for more apt analogues drawn from the natural world that sustains fire – and ourselves.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176572
"You Sold Me to Your Mother-in-Law...": An Ongoing Quest to Reconnect a Family Yes, black lives matter, even a century and a half later. Let us remember them with dignity and post proofs that they are not forgotten, as Edward E. Baptist advised in Stony the Road they Trod: Forced Migration of African Americans in the Slave South, 1790-1865 (2002):


The idea that lost loved ones were out there somewhere continued to haunt and to inspire black families for years to come. . . 

. . . historians, who can do their work only because those who have long gone have left them messages and words buried in documents they study, have an ethical obligation to give voice to the dead.


Sparked by those convictions almost 50 years ago, Jan Hillegas and I collaborated with George P. Rawick, a comrade and friend from my teens, to gather, edit, and publish narratives of former slaves in ten supplementary volumes to Rawick’s encyclopedic compilation The American Slave: A Composite Autobiography.


Members of the Federal Writers Project had conducted the interviews in the 1930s, but that Depression Era program ended abruptly when the United States was thrust into World War II. Unfinished transcripts were boxed and stored, many of them unlabeled, without a record of their locations. Surviving files had been neglected for more than 30 years when we mounted a search for them in archives and institutional depositories throughout the Southern states.


That was the largest project undertaken by our Deep South People’s History Project, and probably the most indelible. The essential value of the Mississippi section, which comprises five volumes, is that it increased the number of published first-hand accounts by survivors of slavery in that state from a couple dozen to about 450.*


Those were gratifying achievements, performed without institutional or government patronage. I have continued to write about history ever since, mostly as a contributor to philatelic publications. But an important letter that I bought from a stamp dealer about a decade ago has resisted my attempts to flesh out, follow, and narrate the story of its author. Perhaps History News Network’s community of scholars can suggest a fruitful approach.


One hundred fifty years ago, David Jackson of Montgomery, Alabama, sent this letter to a former Confederate army officer at Richmond, Virginia — a man who had inherited him, possessed him, and sold him before the Civil War:




Montgomery, Ala June 14 1870

            Capt John F C Potts


Dear Sir


Yours of March 29th is at hand and contents noted. To refresh your memory I will give you a little of my history. I was raised by your father, Thos. L. Potts, at Sussex Court House, at the division of your father’s estate I was drawn by you. You sold me to your mother-in-law Mrs. Graves, and at her death, I among others was sold at auction — a speculator bought me and brought me to this place and sold me to Col. C. T. Pollard Pres. M.&W.P.R.R. My wife, whose name was Mary, belonged to Mrs. Graves. We had four children whose names were Martha Ann, Alice, Henrietta & Mariah. My mother’s name was Nanny and belonged to your father. What I have written will probably bring me to your recollection, although it has been 22 years since I left. If you know anything about my wife or children please write me in regard to them. I had three brothers named Henry, Cyrus & Wash. If you know anything about them let me know. By answering this you will greatly oblige. Accept my kind regards and believe me 


Yours truly

                                                                                                David Jackson


Sold at auction to a speculator in 1848, transported from Virginia to Alabama, and sold by the speculator to a Montgomery & West Point Railroad tycoon, David Jackson had been forcibly separated from members of his family with no word of their subsequent fates for more than two decades, yet he had not given up hope of finding them. 


Can today’s historians shed light on his quest?


Census records show that Jackson was born about 1825. He was about age 23 when the estate of his Virginia owner separated him from his wife and children. He was about age 45 when he wrote to John F. C. Potts hoping for word about them. 


During the Civil War, Potts had commanded Company D of the 25th Battalion, Virginia Infantry (the Richmond City Battalion) until a few weeks before Appomattox. After the war he was a banker and an underwriter. He was 58 years old when he died June 21, 1876, at Richmond.


That is all I have. David Jackson, members of his family, and his descendants deserve better. If readers can help bring more to light, please write to me via the HNN editor.


* Our publisher, Greenwood Press, has hidden my 1976 interpretive introduction to the Mississippi narratives, without so much as a byline, in The American Slave: A Composite Autobiography, Supplement, Series I, Mississippi Narratives, Part 1, pages lxix-cx.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176497
Will the Crisis Year of 2020 Turn Out Like 1968?




The year 1968 was one of the darkest in the nation’s history. With the public deeply divided over the toll of the Vietnam War (35,000 combat deaths by year’s end), the country was plunged into mourning after the murders of Dr. Martin Luther King and Robert F. Kennedy.  When violent protests erupted in dozens of cities after Dr. King’s death, the Republican presidential candidate, Richard Nixon, vowed to use “law and order” to restore “traditional American” values.

Nixon’s appeal to a conservative white electorate worked. After winning a narrow victory over Hubert Humphrey (George Wallace got 13% of the vote), Nixon began a rollback of many of the Kennedy-Johnson civil rights and anti-poverty programs. 

In 2020, the nation is once again lost in fear and mourning, this time over 140,000 deaths from an epidemic and the shocking death of George Floyd. As widespread protests (generally peaceful) continued in many cities over racial injustice, the Republican president has drawn from Richard Nixon’s strategy. Vowing to impose “law and order,” he has attacked protestors as criminals and called for a return to “traditional” values.

Will Trump be able to repeat Nixon’s success? 

The short answer is no. 

Here are four reasons why the nation’s political and social climate in 2020 is much different than in 1968.

1. Demographics. Both the overall population and the electorate have changed dramatically from 1968. Fifty years ago, the nation was 85% white and the governing class almost entirely white. In 1968, Congress had only six Black officials (five congressmen, one senator), none from the South. 

In 2020, the Congressional Black Caucus has 55 members. Many of the nation’s largest cities including Chicago, Detroit, Baltimore and San Francisco have Black mayors. 

Today the “Non-Hispanic White” group comprises barely 60% of the country's population. People of color (nonwhites and Hispanics) now comprise a majority of those under age 16. The nation is, in effect, “browning” from the bottom up.

2. Education. The electorate is much better educated today. In 1968, only 14% of men and 7% of women had four+ years of college. In 2020, the numbers have increased to 35% of men and 36% of women. 

Numerous studies have found a strong correlation between higher education and liberal political views. One can debate the causes (e.g. liberal professors) but the outcome is clear: better-educated voters support issues such as affirmative action, abortion rights, gun control and increased funding for social programs.     

3. Awareness of racial injustice. In 1968, whites had a very limited understanding of how Black people lived and the pervasive discrimination they endured. The nation’s schools, housing and workplaces were largely segregated. Black people, who were still referred to as “Negroes” in many newspapers, were invisible in popular white culture. For example, the first network TV show to feature a Black family, Sanford and Son (about a junk dealer in Watts), did not appear until 1972. 

Ignorance breeds intolerance. In 1966, when Dr. King led a protest march through an all-white suburb in western Chicago, he was met with a hail of rocks and bottles. 

I was a senior at an all-white high school in California in 1968. In our U.S. History class, the civil rights movement was ignored. The institution of slavery was dismissed in a few paragraphs in our textbook; it was simply “abolished” by Lincoln at the end of the Civil War. No mention was made of Jefferson and Washington owning slaves or the slaveholding states’ role in shaping the Constitution.  The next year, when I began college and attended classes with Black students for the first time, I experienced culture shock. They wanted to talk about discrimination in jobs, housing and education. I literally did not understand what they were talking about. 

4. The Vietnam War. Today, some 45 years after the war ended, it is difficult to comprehend how completely the conflict dominated public discourse. In 1968, we had 540,000 troops fighting in Vietnam and any man over the age of 18 and not in college was likely to be drafted. Today, the entire (all-volunteer) U.S. Army strength is less than 480,000. 

During each week of 1968, some 250 American soldiers were killed. Images of besieged Army bases and wounded G.I.s filled the network news every night. 

The Vietnam War was the number one issue in the 1968 election. Nixon’s position was one big lie. He promised an “honorable” end to the war, but refused to say how he would achieve it. It took Kissinger and Nixon five more years to negotiate a peace, which was basically a surrender. The cost: 15,000 more dead Americans. 

Today we are still engaged in several foreign wars, but the Pentagon has learned how to maintain a low profile by restricting information and limiting access to the news media. Foreign wars are simply not an issue in this election.

An All-Star’s Optimism

In a recent editorial in the Los Angeles Times, Kareem Abdul-Jabbar (who was in his third year at UCLA in 1968) wrote: 

“The moral universe doesn’t bend toward justice unless pressure is applied. In my seventh decade of hope, I am once again optimistic that we may be able to collectively apply that pressure, not just to fulfill the revolutionary promises of the U.S. Constitution, but because we want to live and thrive.”

Trump’s attempt to use Nixon’s outdated playbook will fail. Our nation is younger, more diverse and better educated now. 

We know better. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176573
Who Opened the Door to Trumpism? David Frum's "Trumpocalypse" Reviewed




David Frum is among the best-known conservative Never Trumpers.  His columns in the Atlantic and appearances on television and podcasts are filled with insightful and cutting criticism of Donald Trump’s policies, personality, and character (or lack thereof).  In his new book, Trumpocalypse: Restoring American Democracy, he offers more of the same, oftentimes presented in witty, pithy prose.  He asserts that Trump, despite all of his obvious flaws, was able to gain control of the Republican Party and win the presidency by walking through an “unlocked door” that was in many ways left open by American conservatives:  


I came of age inside the conservative movement of the twentieth century.  In the twenty-first, that movement has delivered much more harm than good, from the Iraq War to the financial crisis to the Trump presidency.  


As a former speechwriter for George W. Bush with considerable conservative bona fides, Frum is in a unique position to give a full accounting of just how conservatism fell short before the 2016 election, and to offer a compelling path forward for conservatives eager to reclaim their movement.  Unfortunately, he does neither.  


Frum is at his best when he is attacking Trump’s personal corruption, highlighting his failures as a businessman before becoming president, and his use of the office for his own financial and political gain.  He characterizes Trump as a low-grade grifter as opposed to someone trying to score, dare I say, “bigly”: 


Net-net, how much can Trump have pocketed from Vice President Pence’s two-day stay at Trump’s Irish golf course? How much from the Secret Service renting golf carts at Trump golf courses? How much from Air Force officers being billeted at the Turnberry in Scotland? How valuable were Ivanka Trump’s Chinese trademarks, really? 


Never one to miss twisting the knife, he then scornfully notes that Trump has probably dishonestly extracted roughly $4 million “less… than Michelle Obama earned from her book and speaking fees.” 


Frum also catalogues and assails Trump’s long history of trafficking in racist discourse, from his promotion of birtherism to his insinuation that Barack Obama was an ISIS supporter to his inexplicable equivocation after the 2017 neo-Nazi march in Charlottesville.  In a curious twist that reflects Frum’s frustration over Trump’s weathering of the Russian scandal, he devotes more attention to criticizing Robert Mueller’s failure to dig deeper into Trump’s ties to Russia than he does to the scandal itself.  


Trumpocalypse also takes aim at the administration’s foreign policy.  Frum shares some cringe-inducing quotes revealing Trump’s long-held affinity for authoritarian strongmen.  For example, in a 1990 Playboy interview Trump expressed admiration for Deng Xiaoping’s brutal suppression of the pro-democracy movement: 


When the students poured into Tiananmen Square, the Chinese government almost blew it.  Then they were vicious, they were horrible, but they put it down with strength.  That shows you the power of strength.  Our country is right now perceived as weak…as being spit on by the rest of the world.  


Frum explores the corrosive impact of Trump’s strange and twisted “bromances” with brutal dictators like Kim Jong Un (“We fell in love”), Vladimir Putin (“I think in terms of leadership he’s getting an A”), and Mohammed Bin Salman (“He’s a strong person.  He has very good control…I mean that in a positive way”).  Such praise has emboldened wannabe dictators from the Philippines to Hungary to Brazil, and dismayed the democratic allies of the United States.  


Through his long analysis of Trump’s follies, Frum never develops his contention that twenty-first-century conservatism helped open the door for Trump.  He notes the Iraq War as a factor, but as the man who wrote the “Axis of Evil” speech that became the justification for the war, he could have provided a much more detailed accounting of how it helped Trump’s America First slogan resonate with so many people.  He does not.  He also asserts that the 2008 financial collapse contributed to Trump’s economic populism, but does not explain how.  How did conservative economic policies contribute to the financial meltdown?  Was it a failure of oversight?  Not cutting enough regulations?  Too much deregulation?  Without a full accounting, his political mea culpa is hollow and fails to offer guidance on how to avoid mistakes in the future.  


Drawing a distinction between twentieth and twenty-first-century American conservatism is also disingenuous.  Frum is rightly horrified by the racially-charged tactics that Trump routinely employs, but without looking deeper into the history of the conservative movement, he does a disservice to his readers. For instance, he does not address National Review’s opposition to the desegregation efforts of the Civil Rights movement, Nixon’s Southern Strategy, or George H.W. Bush’s notorious Willie Horton ad, and how each of them has greatly damaged the Republican Party in the eyes of African Americans. Frum also could have delved into the fiscal recklessness of the GOP, which only seems to care about budget deficits when a Democrat is in the White House.  Ronald Reagan went beyond Lyndon Johnson’s promise that the country could have “guns and butter” by essentially saying that the country could have guns, butter, and low taxes.  This brand of free-lunch conservatism led to a near-tripling of the national debt from 1981 to 1989.  


In the second half of the book, Frum offers a list of reforms to help the country restore its democratic structures and ensure their survival, including mandating that presidential candidates release their tax returns prior to elections; eliminating the filibuster to prevent legislative minorities from impeding Senate proceedings; granting statehood to the District of Columbia, an area more populous than Vermont and Wyoming; passing a new Voting Rights Act to ensure that all Americans have access to polling stations without lengthy wait times or cumbersome procedures; and creating non-partisan commissions to create electoral districts that are not simply designed to perpetuate the majority party’s hold on seats.  He also makes sensible policy recommendations on major issues like health care, immigration, and climate change.  This was the most disappointing part of the book, not because of the proposals themselves, but because of his failure to ground them in a conservative historical and philosophical framework.  


In the introduction, Frum dedicates the book to “…those of you who share my background in conservative and Republican politics.  We have both a special duty—and a special perspective.  We owe more; and we also, I believe are positioned to do more.”  Yet he does little to make his recommended reforms palatable—let alone desirable—to conservatives outside the Never Trump camp.  Consider the environment.  Many of today’s self-proclaimed conservatives recoil from measures to protect the environment, rejecting such initiatives as attempts by “the Left” (which seems to include everyone on the political spectrum between Joseph Stalin and David Brooks) to seize control over more of the American economy and society.  In fact, environmentalism and conservatism share a great deal philosophically.  One of the cornerstones of conservative thought comes from Edmund Burke’s Reflections on the Revolution in France, first published in 1790.  In the book, Burke—an Irish Whig who served in the British House of Commons from 1766 to 1794—proposed a social contract in sharp contrast to Rousseau’s, which was predicated on the notion of the “general will” of the people.  For Burke, “Society is a contract between the generations:  a partnership between those who are living, those who have lived before us, and those who have yet to be born.”  Placed in this context, climate change is a vital issue that we need to address, lest we break the covenant and bequeath a world less hospitable to life than the one we inherited from our forebears. 
As Barry Goldwater starkly put it in The Conscience of a Majority (1970), “It is our job to prevent that lush orb known as Earth…from turning into a bleak and barren, dirty brown planet.”  Reminding Republicans of their party’s numerous contributions to environmental protection—such as Goldwater’s support for the Clean Air Act, Richard Nixon’s establishment of the Environmental Protection Agency, and Ronald Reagan’s signing of the Montreal Protocol, which placed limits on ozone-depleting chemicals—could help Frum convince them that supporting future environmental measures is not an abandonment of conservative principles.  


Trumpocalypse will delight many liberals who revel in watching the internal divisions in the GOP.  It will also supply them with considerable ammunition that they can use in debates with their Trump-supporting friends and family members.  The book provides some sharp digs (describing the One America Network as “Fox-on-meth” was my favorite) and is clearly and concisely argued.  I have my doubts, however, that the book will do much to win over many of Frum’s fellow conservatives who make up the book’s intended audience.  Sadly, due to the carefully constructed information bubbles that Americans have created for themselves, it’s possible that very few self-proclaimed conservatives will even know the book was published.  

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176564
Weighing the Evidence when a President is Accused of Antisemitism



Mary Trump, the niece of President Donald Trump, says she has heard him utter “antisemitic slurs” in private. Michael D. Cohen, the president’s former attorney, reportedly will assert in his forthcoming book that Trump has made “antisemitic remarks against prominent Jewish people.” The president, however, denies doing so. How are we to weigh the veracity of the allegations?

Historians demand a high level of evidence when an accusation of bigotry is made against anybody, all the more so when the accusation is made against the leader of the Free World. In weighing the evidence that has so far been produced concerning Trump, one must consider the standards that historians have applied with regard to the other three presidents who have been accused of antisemitism—Richard Nixon, Harry Truman, and Franklin Roosevelt.

The first question to ask is whether there is documentation that corroborates the claim, such as a tape recording or a diary.

We know that President Nixon made antisemitic remarks because he taped his Oval Office conversations. Until those tapes were made public, the accusation gained no traction. Both the New York Times and CBS-TV reported in May 1974—three months before his resignation—that Nixon had referred to some of his critics as "Jew boys," and had complained about "those Jews" in the U.S. Attorney's Office who were causing him difficulties. So long as the public could not see the evidence, Nixon and his defenders could deny it. 

In the years to follow, the tapes came out, confirming the earlier reports and revealing many more antisemitic slurs, including Nixon’s use of the word “kike.” Hearing such language in the president’s own voice made it impossible to deny his antisemitism.

We know of President Harry Truman’s antisemitism primarily from his diary. Discovered by accident in the Truman presidential library in Missouri in 2003, the previously unknown diary included acerbic comments about Jews that Truman wrote after Treasury Secretary Henry Morgenthau, Jr. telephoned him concerning the British decision to prevent the refugee ship Exodus from reaching Palestine.

The president wrote: "He'd no business, whatever to call me. The Jews have no sense of proportion nor do they have any judgement on world affairs….The Jews, I find are very, very selfish. They care not how many Estonians, Latvians, Finns, Poles, Yugoslavs or Greeks get murdered or mistreated as D[isplaced] P[ersons] as long as the Jews get special treatment. Yet when they have power, physical, financial or political neither Hitler nor Stalin has anything on them for cruelty or mistreatment to the under dog." 

Until the diary surfaced, few historians acknowledged Truman’s antisemitism. His 1918 letter referring to New York City as a "kike" town was chalked up to his immaturity. His 1935 letter referring to a poker player who "screamed like a Jewish merchant” was dismissed as an isolated incident. Truman’s 1946 remark about Jewish lobbyists, "Well, you can't satisfy these people….The Jews aren't going to write the history of the United States or my history” was excused as a momentary outburst in response to tension with Jewish lobbyists. But seeing explicit anti-Jewish language in President Truman’s own handwriting, in the diary, made it impossible to deny his antisemitism any longer.

Franklin D. Roosevelt did not keep a diary or tape-record his Oval Office conversations. What we know about his private sentiments concerning Jews derives from other types of documentation, including diaries kept by his cabinet members and transcripts of official conversations by note-takers who were not FDR’s political enemies.

Captain John McCrea, the president’s Naval Aide, was the note-taker at the 1943 Casablanca conference. He reported that FDR said the number of Jews allowed to enter various professions in Allied-liberated North Africa “should be definitely limited,” in order to avoid a repetition of the “understandable complaints which the Germans bore towards the Jews in Germany, namely, that while they represented a small part of the population, over fifty percent of the lawyers, doctors, school teachers, college professors, etc, in Germany, were Jews."

Harvard professor Samuel H. Cross, one of the foremost experts on Russian and other Slavic languages, was the translator and note-taker at the 1942 White House meeting between President Roosevelt, adviser Harry Hopkins, and Soviet Foreign Minister Vyacheslav Molotov. According to Cross’s record, Hopkins complained that the American Communist Party contained many “largely disgruntled, frustrated, ineffectual, and vociferous people--including a comparatively high proportion of distinctly unsympathetic Jews.” FDR replied that he himself was “far from anti-Semitic, as everyone knew, but there was a good deal in this point of view.” Molotov, Roosevelt, and Hopkins then apparently agreed that “there were Communists and Communists,” which they compared to what they called “the distinction between ‘Jews’ and ‘Kikes’,” all of which was “something that created inevitable difficulties.” 

In assessing President Roosevelt’s private views, historians naturally assign much greater weight to diaries and private memoranda that were authored by the president’s friends or political allies, than to accusations made by enemies who had an axe to grind or a rival agenda to pursue.

FDR’s allies had plenty to say on this subject. Secretary of the Treasury Henry Morgenthau, Jr. wrote privately that President Roosevelt boasted about his role in imposing a quota on the admission of Jewish students to Harvard. Vice President Henry Wallace wrote in his diary that FDR spoke (in 1943) of the need to “spread the Jews thin” and not allow more than “four or five Jewish families” to settle in some regions, so they would fully assimilate. U.S. Senator Burton Wheeler, whom Roosevelt considered for vice president for his third term, wrote in a private memo that FDR boasted (in 1939) of having “no Jewish blood” in his veins. Rabbi Stephen S. Wise, an ardent supporter of the president, privately noted that Roosevelt told him (in 1938) that Polish Jews were to blame for antisemitism because they dominated the Polish economy.

Specificity is important. That’s why a particularly disturbing remark attributed to President Roosevelt has been widely ignored by historians, even though it came from a reliable and friendly source. Samuel Rosenman, FDR’s closest Jewish confidant and chief speechwriter, told a Jewish leader in October 1943 that, in response to a rally by rabbis outside the White House, the president “used language that morning while breakfasting which would have pleased Hitler himself.” But Rosenman never revealed precisely what it was that he heard Roosevelt say.

The lack of specifics—so far—in the allegations by Ms. Trump and Mr. Cohen will be cited by the president’s defenders as reason to doubt their veracity. Others will point to the accusers’ personal conflicts with the president as evidence to question their motives in raising the issue of antisemitism.

Certainly, accounts by embittered relatives need to be scrutinized with extra care. By contrast, a friendly relative presumably has no motive to smear the president. Curtis Roosevelt, a grandson of the president (and not known to be unfriendly toward his late grandfather) told FDR biographer Geoffrey Ward that he “recalled hearing the President tell mildly anti-Semitic stories in the White House…The protagonists were always Lower East Side Jews with heavy accents…" 

Opportunity is also important. Mary Trump’s innumerable interactions with Donald Trump, over the course of decades, certainly gave her ample opportunity to hear him express his private opinions. Likewise Michael Cohen, who served as Trump’s attorney and confidant from 2006 to 2018. The fact that not one, but two highly placed, unconnected individuals are making similar accusations adds credibility to their charge.

There is also the matter of Trump’s track record on the subject. How should historians judge remarks invoking a stereotype that seems to be complimentary? In a 1991 book, Trump was reported to have said, “The only kind of people I want counting my money are short guys that wear yarmulkes every day.” In 2015, he told a group of Jewish supporters, “I’m a negotiator like you folks.”

Perhaps not everyone would take offense at those kinds of comments. The danger of brushing off such remarks, however, is that it could be a short leap from perceiving Jews as “good with money” and “good at negotiating” to suspecting that rich Jews are using their wily negotiating skills to manipulate economies or governments. Still, the key word is “could.” Until he says it, he hasn’t said it.

When it comes to writing history, patience yields dividends. With the passage of time, more evidence emerges or existing evidence is discredited. Further down the road, archival collections open and previously classified documents shed new light on the topic. 

Pundits, of course, are not always inclined to wait patiently for incontrovertible evidence to accumulate. Their job is to express their opinions on the issues of the day, based on what they know on any given day. They may be comfortable making a case based on unnamed sources, perceived dog whistles, or other bits and pieces.

Assessing the accusations by Mary Trump and Michael Cohen of presidential antisemitism ultimately may depend on which standard of evidence is applied. Newspaper columnists whose goal is to influence public opinion sometimes proceed even with evidence that some might consider doubtful. A court of law, in a criminal case, requires an allegation to be proven beyond a reasonable doubt. A careful historian, weighing a charge this serious, will insist on evidence sufficient to prove the accusation beyond a shadow of a doubt.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176571
Can Martin Luther King’s Spiritual Vision Kindle a New Progressivism?

Poor People's Campaign March, Lafayette Square, Washington D.C. 1968




Conservative columnist Ross Douthat’s recent “The Religious Roots of a New Progressive Era” suggests that a new progressive age is possible. Although he does not say much about the Progressive Era of 1890 to 1914, he does mention the religious Social Gospel movement of that period. As historian Jill Lepore has noted, “much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives, too.” Like then, so now: Douthat sees a “palpable spiritual dimension” to much of the “social justice activism, before and especially after the George Floyd killing.” 


The columnist’s article does not mention Martin Luther King (MLK), but the social justice movement he led in the 1950s and 1960s demonstrated the greatest spiritually based progressivism between the end of the Progressive Era and today. And rekindling and updating King’s ideas offers us the best hope of creating a new progressive era in the 2020s.


Most of the 1890-1914 progressives did not attempt to overthrow or replace capitalism, but to constrain and supplement it in order to insure that it served the public good. Their efforts reduced corruption in city governments, limited trusts and monopolies, expanded public services, and passed laws improving sanitation, education, housing, and workers’ rights and conditions, especially for women and children. Progressive efforts also helped pass pure food and drug laws and create the National Park Service. 


After three consecutive post-World-War-I Republican presidents from 1920 to 1932, Franklin Roosevelt renewed the progressive spirit in the 1930s, but it was the Baptist minister MLK, leading the Southern Christian Leadership Conference (SCLC) beginning in the 1950s, who restored religious fervor to progressive causes.


It was his religious vision, together with the injustice suffered by Black and other oppressed people, that propelled King. This was true from December 1955 at his Dexter Avenue Baptist Church, when he helped start the Montgomery Bus Boycott in Alabama because of Rosa Parks’ arrest for defying segregated bus seating, until April 1968, when a sniper’s bullet ended his life on a motel balcony in Memphis.


In mid-1955 King had received his doctorate in systematic theology from Boston University. But even before then, while a student at Pennsylvania’s Crozer Theological Seminary, he had been strongly influenced by Gandhi’s ideas. As historian Stephen Oates writes in his biography, King considered Gandhi “probably the first person in history to lift the love ethic of Jesus above mere interaction between individuals to a powerful effective social force on a large scale.”

In 1957 MLK delivered a powerful sermon on “Loving Your Enemies.” In it he said, “Yes, it is love that will save our world and our civilization, love even for enemies.” He also spoke of the need “to organize mass non-violent resistance based on the principle of love.” Moreover, he analyzed and amplified in great detail various meanings of love, especially agape (“understanding, creative, redemptive goodwill for all”). In February 1968, just two months before his assassination, he told a congregation at Ebenezer Baptist Church in Atlanta that he wanted to be remembered as someone who “tried to love and serve humanity.”

If we look at many of King’s speeches and activities their religious underpinnings jump out at us. As a 2013 article in the Jesuit magazine America pointed out, “His famous speech, ‘I Have A Dream’ [1963], was actually a sermon rooted in the words of the prophet . . . Isaiah who too had a dream of a world made new with God’s loving justice.” That same article quotes King as saying, “In the quiet recesses of my heart, I am fundamentally a clergyman, a Baptist preacher.”

The first progressive idea of MLK’s for today is that of racial justice. More than a half-century after his death, it is not just the continuing protests following the police killing of George Floyd that force us to confront this continuing injustice. It is also the Black-Lives-Matter movement, the disproportionate number of black and Hispanic deaths from COVID-19, the economic inequality facing these minorities, and their higher unemployment and incarceration rates. Moreover, the political polarization and racism being stoked by President Trump are further indications that we are still far from the promised land that King dreamt of in his “I Have a Dream” speech--“that day when all of God’s children, black men and white men, Jews and Gentiles, Protestants and Catholics, will be able to join hands.”

The second progressive idea of King’s is his stress on peace and non-violence both at home and abroad. This emphasis owed much to the Gandhian influence on him and to the belief, as he expressed it in a 1957 sermon, that mass non-violent resistance tactics were to be “based on the principle of love.” Although reflecting mainly his religious principles, his non-violent approach also had--and has--political implications. In February 1968, he warned that rioting could lead to a “right-wing takeover,” and indicated that riots just helped segregationist presidential candidate George Wallace. Today, in the face of some (though less) lawlessness following the killing of George Floyd, some observers sound a similar warning: any lawless activities would aid Donald Trump. 

Abroad, the main target of King’s protests was the Vietnam War. In April 1967, in a New York Riverside Church speech, he displayed the type of empathy that deeply religious people should when he said the following about the Vietnamese people: “So they go, primarily women and children and the aged. They watch as we poison their water, as we kill a million acres of their crops. They must weep as the bulldozers roar through their areas preparing to destroy the precious trees. They wander into the hospitals with at least twenty casualties from American firepower for one Vietcong-inflicted injury. So far we may have killed a million of them, mostly children.” And he warned that “a nation that continues year after year to spend more money on military defense than on programs of social uplift is approaching spiritual death.”

Today progressives like Bernie Sanders are arguing that “we need to cut our [swollen Trumpian] military budget by 10 percent and invest that money in human needs.” 

In his Riverside Church speech, MLK also expressed a third important progressive idea--that our economic system needs major reform. Like the first Progressives, he believed our economy should serve the public good, not special interests. “We must,” he said, “rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered.”

In the year remaining of his short life, King worked most diligently on the Poor People’s Campaign, which aimed at pressuring the government and society to reform our economy, especially by reducing economic inequality. In his King biography, Oates writes that the “campaign was King’s ‘last, greatest dream’ because it sought ultimately to make capitalism reform itself, presumably with the power of redemptive love to win over economic oppressors, too, and heal antagonisms America must recognize.” Oates also mentions that MLK’s aims reflected the influence of theologian Walter “Rauschenbusch’s Social Gospel,” which, as Lepore noted, greatly influenced progressives of the 1890-1914 era.


King’s three important progressive ideas--racial justice, non-violence, and economic reform--are all as relevant today as they were in King’s day. But a fourth progressive idea, addressing climate change, was not perceived as an important problem in the 1960s. Yet from all of MLK’s activities it is clear that if he were living today he would be in the forefront of those insisting upon actions to confront human-caused climate change.  


Today, more than a half-century after King’s death and in the midst of a terrible pandemic, the chances of enacting much of his progressive agenda seem better than ever. Just as progressive New-Deal reforms arose as a response to the Great Depression, so too today new progressive actions can emerge from our pandemic and national disgust with Trumpism. And these actions, as were King’s, will be strengthened if they proceed from a strong spiritual base.


In 2016, presidential candidate Bernie Sanders gave a talk in Rome entitled “The Urgency of a Moral Economy: Reflections on the 25th Anniversary of Centesimus Annus.” The anniversary he referred to was that of the release of a Pope John Paul II encyclical, and Sanders spoke before a conference of The Pontifical Academy of Social Sciences. He noted that the Catholic Church’s “social teachings, stretching back to the first modern encyclical about the industrial economy, Rerum Novarum in 1891, to Centesimus Annus, to Pope Francis’s inspiring [environmental] encyclical . . . have grappled with the challenges of the market economy. There are few places in modern thought that rival the depth and insight of the Church’s moral teachings on the market economy.”

Although Sanders was not successful in either his 2016 or 2020 run for the presidency, his progressive ideas, often based on spiritual values, have continued to animate the Democratic Party. In early July, a Biden-Sanders task force released proposals indicating Sanders’ continuing influence on the person now favored to become our next president. 


Although Biden is not considered as progressive as Sanders, his background also suggests that he could further advance some of MLK’s progressive ideas. Biden’s popularity with black voters was a major reason for his securing the Democratic nomination, and he has said that his “two political heroes were MLK and Bobby Kennedy,” both assassinated when he was a senior in college.  


As with King and Sanders, Biden’s views are strongly influenced by spiritual values. In a 2015 interview, he praised Pope Francis, saying “he’s the embodiment of Catholic social doctrine that I was raised with. The idea that everyone’s entitled to dignity, that the poor should be given special preference, that you have an obligation to reach out and be inclusive.” On Francis’ encyclical on the environment and climate change, which many conservatives criticized, Biden said in the same interview, “The way I read it—and I read it—it was an invitation, almost a demand, that a dialogue begin internationally to deal with what is the single most consequential problem and issue facing humanity right now.”

After Franklin Roosevelt had been elected president in 1932, but before he had taken office in March 1933, Arthur Schlesinger, Jr. tells us, a couple of “old Wilsonians . . . became so fearful of Roosevelt’s apparent conservatism” that they urged an FDR adviser to persuade “the President-elect to be more progressive.” Perhaps if Biden is elected, he will follow FDR’s example--not only succeeding an increasingly unpopular Republican president, but also pleasantly surprising progressives who thought him too conservative. 

He has already surprised Bernie Sanders, who has been pleased with the results of the Biden-Sanders task force. In early July, on MSNBC, Sanders stated, “I think the compromise that they came up with, if implemented, will make Biden the most progressive president since FDR.”

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176561 https://historynewsnetwork.org/article/176561 0
Don’t Tear Down the Wrong Monuments; Don’t Attack Every Holiday

The United States defeated the Confederacy's Army of Northern Virginia at Gettysburg on July 3, 1863. That same evening, General John Pemberton agreed to surrender the Confederate army holding Vicksburg to Ulysses Grant. The next day, when the news of both Union victories began to spread through the nation, was surely the most memorable Independence Day in American history after the first one, four score and seven years earlier. Pemberton’s men stacked their arms and went home. Lee’s men withdrew from Gettysburg and made their way south. 


Historians have argued ever since over which victory was more important.


Both battlefields are now under the care of the National Park Service. This past Fourth of July, few people visited either, but when travel becomes easier, if you're within range, I suggest you visit whichever park is closer. Both parks are beautiful, especially in mid-summer. However, do not ask the question several NPS rangers submitted to me as their nomination for the dumbest query ever received from a visitor: "How come they fought so many Civil War battles in parks?" Instead, if you're at Vicksburg, suggest to the ranger that Gettysburg was the more important victory; if at Gettysburg, suggest Vicksburg. Probably you and your fellow tourists will be informed as well as entertained by the response. 


Perhaps the Mississippi victory was more telling, for several reasons. Vicksburg had been called "the Gibraltar of the Confederacy." After Richmond, the Confederate capital, it was surely the most strategic single place in the South, because it simultaneously blocked United States shipping down the Mississippi River and provided the Confederacy with its only secure link to the trans-Mississippi West. Vicksburg's capture led to the capitulation of the last Confederate stronghold on the Mississippi, Port Hudson, Louisiana, 130 miles south, five days later. This reopened the Mississippi River, an important benefit to farmers in its vast watershed, stretching from central Pennsylvania to northwestern Montana. Abraham Lincoln announced the victory with the famous phrase, "The Father of Waters again goes unvexed to the sea." In the wake of the victory, thousands of African Americans made their way to Vicksburg to be free, get legally married, help out the Union cause, make a buck, do the laundry and gather the firewood, and enlist in the United States Army. No longer was slavery secure in Mississippi, Arkansas, or Louisiana. Many whites from these states and west Tennessee also now joined the Union cause. 


But perhaps the Pennsylvania victory was more important. It taught the Army of the Potomac that Robert E. Lee and his forces were vincible. Freeman Cleaves, biographer of General George Gordon Meade, victor at Gettysburg, quotes a former Union corps commander, "I did not believe the enemy could be whipped." Lee's losses forced his army to a defensive posture for the rest of the war. The impact of the victory on Northern morale was profound. And of course it led to the immortal words of the Gettysburg Address. 


If you go to Vicksburg on the Fourth, be sure to visit the Illinois monument, a small marble pantheon that somehow stays cool even on the hottest July day. In Gettysburg, don't fail to take in the South Carolina monument. It claims, "Abiding faith in the sacredness of states rights provided their creed here" — a statement true about 1965, when it went up, but false about 1863. After all, in 1860, South Carolinians were perfectly clear about why they were seceding, and "states rights" had nothing to do with it. South Carolina was against states’ rights. South Carolina found no fault with the federal government when it said why it seceded, on Christmas Eve, 1860. On the contrary, its leaders found fault with Northern states and the rights they were trying to assert. These amounted to, according to South Carolina, “an increasing hostility on the part of the non-slaveholding States to the institution of slavery.” At both parks, come to your own conclusion about how the National Park Service is meeting its 1999 Congressional mandate "to recognize and include ... the unique role that the institution of slavery played in causing the Civil War."


The twin victories have also influenced how Americans have celebrated the Fourth of July since 1863. Living in Mississippi a century later taught me about the muted racial politics of the Fourth of July. African Americans celebrated this holiday with big family barbecues, speeches, and public gatherings in segregated black parks. Even white supremacists could hardly deny blacks the occasion to hold forth in segregated settings, since African Americans were only showing their patriotism, not holding some kind of fearsome “Black Power” rally. Both sides knew these gatherings had an edge, however. Black speakers did not fail to identify the Union victories with the anti-slavery cause and the still-unfinished removal of the vestiges of slavery from American life. This coded identification of the Fourth with freedom was the sweeter because in the 1960s, die-hard white Mississippians did not want to celebrate the Fourth at all, because they were still mourning the surrender at Vicksburg. We in the BLM movement can take a cue from the past. We can be patriotic on July 4 without being nationalistic. As Frederick Douglass put it, by my memory, “I call him a true patriot who rebukes his country for its sins, and does not excuse them.” And true patriots can also take pleasure from their country’s victories against a proslavery insurrection.


Muted racial politics also underlie the continuing changes on the landscape at both locations. In 1998 Gettysburg finally dedicated a new statue of James Longstreet, Lee's second in command. For more than a century, neo-Confederates had vilified Longstreet as responsible for the defeat. He did try to talk Lee out of the attack, deeming the U.S. position too strong, and his forces did take a long time getting into place.


James Longstreet had to wait to appear on the Gettysburg landscape until the United States became less racist.

Hopefully BLM protesters are informed enough to know not to tag or topple this Confederate monument. 




But the criticisms of Longstreet really stemmed from his actions after the Civil War. During Reconstruction he agreed that African Americans should have full civil rights and commanded black troops against an attempted white supremacist overthrow of the interracial Republican government of Louisiana. Ironically, ideological currents set into motion by the Civil Rights movement help explain why Gettysburg can now honor Longstreet. No longer do we consider it wrong to be in favor of equal rights for all, as we did during the Nadir. 


When I lived in Mississippi in the 1960s and '70s, bad history plagued how Grant’s campaign was remembered on the landscape. For example, a state historical marker stood a few miles south of Vicksburg at Rocky Springs:


Union Army Passes Rocky Springs

Upon the occupation of Willow Springs on May 3, 1863, Union Gen. J. A. McClernand sent patrols up the Jackson road.

These groups rode through Rocky Springs, where they encountered no resistance beyond the icy stares of the people who gathered at the side of the road to watch.


Actually, the area was then and remains today overwhelmingly black. "The people," mostly African Americans, supplied the patrols with food, showed them the best roads to Jackson, and told them exactly where the Confederates were. Indeed, support from the African American infrastructure made Grant's Vicksburg campaign possible. 


In about 1998, Mississippi took down this counterfactual marker. Or maybe a vigilante stole it — no one claims to know. Either way, the landscape benefits from its removal. Six years later, with funding from the state and from Vicksburg, a monument to the roles African Americans played in support of Grant’s campaign went up at Vicksburg. It shows a wounded U.S.C.T. (United States Colored Troops) soldier being helped to safety by another member of the U.S.C.T. and by a black civilian. 


Now, if we can just fix that pesky South Carolina monument... 


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154378 https://historynewsnetwork.org/blog/154378 0
Life during Wartime 515

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154380 https://historynewsnetwork.org/blog/154380 0
Roundup Top Ten for July 24, 2020

Trump Has Brought America’s Dirty Wars Home

by Stuart Schrader

The history of the Office of Public Safety, created to support counterinsurgency around the globe during the Cold War, demonstrates that Trump’s ardor for authoritarian force has long-standing, homegrown roots.


Reimagining America’s Memorial Landscape

by David W. Blight

As we are witnessing, the problem of the 21st century in this country is some agonizingly enduring combination of legacies bleeding forward from conquest, slavery and color lines. Freedom in its infinite meanings remains humanity’s most universal aspiration. How America reimagines its memorial landscape may matter to the whole world.



Historic Levels, but Not the Good Kind

by Heather Cox Richardson

Warren G. Harding created an atmosphere in which the point of government was not to help ordinary Americans, but to see how much leaders could get out of it.



How To Interpret Historical Analogies

by Moshik Temkin

Historical analogies, done in good faith, can make crucial points about the present and help to clarify where we stand on moral and political issues. The problem begins when we begin to substitute historical analogies for historical analysis – or, even more problematically, when we come to believe that history ‘repeats itself’.



John Lewis’ Fight for Equality Was Never Limited to Just the United States

by Keisha N. Blain

By linking national concerns to global ones, John Lewis compelled others to see that the problems of racism and white supremacy were not contained within U.S. borders.



Trump’s Push To Skew The Census Builds On A Long History Of Politicizing The Count

by Paul Schor

The Trump administration’s effort not to count undocumented immigrants is nothing less than an effort to redistribute political power, one that calls to mind a particularly fierce battle over the 1920 census that highlights the role of these broader fights.



J.F.K.’s “Profiles in Courage” Has a Racism Problem. What Should We Do About It?

by Nicholas Lemann

The Senators chosen by John F. Kennedy as "Profiles in Courage" would not fare well if their actions were evaluated today. 



History Shows That We Can Solve The Child-Care Crisis — If We Want To

by Lisa Levenstein

Today, in nearly two-thirds of households with children, the parents are employed. In 3 out of 5 states, the cost of day care for one infant is more than tuition and fees at four-year public universities.



The Strange Defeat of the United States

by Robert Zaretsky

Eighty years later, Bloch’s investigation casts useful light for those historians who, gripped by the white heat of their own moment, may seek to understand the once unthinkable defeat of the United States in its “war” against the new coronavirus.



Tearing Down Black America

by Brent Cebul

Ensuring that Black Lives Matter doesn't just require police reform. The history of urban renewal shows that governments have worked to dismantle and destabilize Black communities in the name of progress.


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176553 https://historynewsnetwork.org/article/176553 0
The Hate-Mongers: Characterizing Racism in Comics

Detail from cover of Fantastic Four #21, 1963.


In 1963, as the Civil Rights Movement was making its presence increasingly felt, writer Stan Lee and penciler Jack Kirby introduced the Hate-Monger in the pages of Fantastic Four #21. Wearing purple KKK-style robes, inciting xenophobic outbreaks with his hate speech, and wielding an H-ray that could transform even peaceful heroes into unthinkingly enraged combatants, the Hate-Monger stokes American animosities towards nonwhites that civil rights activists struggled against. Ultimately defeated by the Fantastic Four, the Hate-Monger is unmasked as… a clone of Adolf Hitler! The big reveal at story’s end affords Mr. Fantastic—and, through him, Lee—the opportunity to deliver a brief but clear sermon on the importance of building a world like the one activists at the time were fighting for, one in which “men truly love each other, regardless of race, creed, or color…” 

However, Lee’s broader critique—that Americans had racial issues even before the arrival of the Hate-Monger—was ultimately obscured by the conventions of superhero narratives: a villain has tried to divide Americans along racial and ethnic lines, but the heroes thwart his nefarious plans. Readers don’t, in other words, depart worried (or even thinking) about fixing deeper ills in American society; they leave instead with a warm sense of triumph that reinforces their faith in their country. This takeaway allows readers to feel good about American ideals without spending much time pondering the more troubling truths about systemic problems in the American populace and its institutions that both Lee and the Civil Rights Movement signaled. 

In the early 1990s, within the context of an ascending multiculturalism, a second version of the Hate-Monger arrived in Avengers courtesy of writer Fabian Nicieza and artists Steve Epting and Tom Palmer. This version, a pale-skinned energy being clad in black leather, is essentially a living version of the original’s H-ray, both fueling and feeding off people’s hatred. As before, racism and xenophobia external to the story but endemic to US society manifest, as the Rodney King beating serves as the story’s backdrop. New York City finds itself divided after videotaped evidence surfaces of three cops beating a defenseless 15-year-old Latino, a situation that likewise resonates today. The Avengers are drawn into the unrest, battling white supremacists in league with this new Hate-Monger. Though this story ends less concretely than the earlier one—the Hate-Monger simply dematerializes, to return sometime in the future—it remains steeped in the Manichean conventions of the superhero. Good, to the extent it triumphs over evil, does so because of individual rather than systemic change. To that point, the black Avenger Rage is convinced by Captain America not to succumb to his own anger in combatting the Hate-Monger’s, a climax that unfortunately puts an onus for change on nonwhites as well as a damper on more direct action in the name of fostering such change. It’s as if Captain America tells Rage, again echoing King, that it is his—and through him, all black people’s—responsibility for “all” to “get along.” The message is muddied further by an epilogue depicting the Hate-Monger incarnated as a black agitator, suggesting, to use the parlance of today, that there are bad people on “both sides.”  

These Hate-Mongers tell us quite a bit more about ourselves than Americans want to acknowledge to this day. An obvious analogy presents itself: President Trump is another in a long line of real-life Hate-Mongers, playing to underlying American racism to build his power. But to leave the analysis there—tempting as that may be—paints entirely too simplistic and rosy a picture. It might be comforting to tell ourselves that Trump is the root cause of our current troubles. He will, eventually, leave office. But our racial problems won’t just exit with him. The Hate-Mongers’ repeated (and perhaps repetitive) returns teach us at least this much. As protestors take to the streets today, asserting the basic truth that Black Lives Matter and demanding, for instance, that we “defund” the police (by reinvesting some of that money into social service programs), they are in fact demanding that we not only have a real conversation about the system but that we then act to change its inequities. 

Americans, however, still largely want to think about the problem in individual terms: it’s the president, or a few “bad apples” among the police, or the occasional racist person that pops up in American society. Conveniently, then, they don’t have to think about larger, systemic issues. Nor do they have to think about how that very system fosters this kind of thinking in the way it limits their choices. To wit: the upcoming Presidential election. It would be comforting to believe that electing Joe Biden will fix our problems, even if such thinking requires Americans to ignore his past record on criminal justice reform. Putting a kinder, gentler face on neoliberal policies not all that far removed from the current president’s, in reality, amounts to little more than a shuffling of deck chairs. Protestors are demanding that we reframe the issue as a systemic one, and, in so doing, present us with an authentic option that the November election seems unlikely to provide. They insist that Americans recognize that the “choice” offered by the system (both in the fall and more generally) doesn’t allow for significant changes in that very system. And without such changes, meaningful reform can’t happen.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176452 https://historynewsnetwork.org/article/176452 0
"No Longer Just Lincoln and a Slave": Consider Mary McLeod Bethune's Lincoln Park Statue

Mary McLeod Bethune Memorial, sculpted by Robert Berks, 1974



In recent weeks a debate has erupted about the future of the Emancipation Memorial – or Freedmen’s Memorial – in Lincoln Park in Washington D.C. Critics say the statue, which shows Lincoln standing over the figure of a crouching Black man, is racist and should be removed. Its defenders point out that the monument was requested and paid for by newly freed Black Americans. In all the discussions about the statue’s fate, the presence of another memorial, significant for a number of firsts, is often forgotten: the Bethune Memorial, which honors the life and legacy of civil rights activist, presidential advisor and educator Mary McLeod Bethune. This statue has the potential to deepen our understanding of Lincoln Park as a commemorative space.

The figure of Frederick Douglass is frequently invoked in the current debates about the Emancipation Memorial. He spoke at the dedication in 1876 and has long been understood to have had misgivings about Thomas Ball’s design. A newly resurfaced letter shows that Douglass suggested another statue be erected alongside it. “There is room in Lincoln park for another monument,” he wrote. The historians who made this discovery, Professors Scott A. Sandage and Jonathan W. White, rightly point out that Lincoln Park got another statue – the Bethune Memorial. They suggest, however, that the later statue doesn’t fulfill Douglass’s suggestion because the memorials aren’t in dialogue. But research suggests that these two statues do ‘speak’ to one another and that this was the very intention of the creators of the Bethune Memorial. 

When the statue of Mary McLeod Bethune was dedicated in the summer of 1974, it was the first monument to an African American or to a woman of any race on federal land in the capital. It marked the culmination of almost two decades of lobbying and fundraising by the National Council of Negro Women, the organization she founded. Designed by Robert Berks, it shows Bethune holding out a scroll – her legacy – to two African American children at her side. Given that statues tend to commemorate the powerful at the behest of the powerful – in other words, white men – the achievements of the NCNW are remarkable. This was a memorial to a Black woman, created by Black women. Unlike Charlotte Scott, who gave her first five dollars earned in freedom to the Emancipation Memorial, the women of the NCNW contributed to the design and meaning of their memorial. With the help of figures such as Shirley Chisholm, they secured Congressional approval and they raised $400,000, a not insignificant sum, given the context of the times and the other demands on people’s checkbooks. It took fourteen years to raise the funds through donors, fundraising events, and sponsors. 

The NCNW claimed Lincoln Park was chosen because it was already an important site of Black commemoration. Its new monument was “inspired by the memorial that is already here.” As part of the redesign of the park in anticipation of Bethune’s arrival, Lincoln was turned to face her. NCNW President Dorothy Height joked it was so “his back will not be to our gracious lady.” The NCNW explained he was repositioned “to convey the message that the children of slaves had progressed from servitude.”  It made much of ‘Mother’ Bethune’s story as the “daughter of slaves.” Her life was evidence of the “progress” of the race after freedom. But interestingly her statue echoes Lincoln, not the freedman. She has a similar stance, with her arms outstretched. This is deliberate. The NCNW made frequent comparisons between the two figures. At the unveiling ceremony, on July 10, 1974, the actor Roscoe Lee Brown read extracts from the speech Frederick Douglass made at the Emancipation Memorial almost 100 years earlier. In Brown’s reading, ‘Lincoln’ was changed to ‘Bethune.’ 

There are a number of reasons the NCNW wanted its monument to be presented in dialogue with the Emancipation Memorial. Some of these were pragmatic – it elevated Bethune’s legacy and helped portray her as a universal, unifying figure. Others were symbolic, representing an inclusion of Black women’s stories into the nation’s history. Height and her organization believed that the appearance of their memorial transformed the story being told in Lincoln Park. Height reflected, “This park has a different context now. It’s no longer just Lincoln and a slave.” 

As part of the memorial project, Lincoln Park was redesigned and landscaped. It had been falling into disrepair and was transformed back into a neighborhood space to be used by the local community. The NCNW wanted residents to engage with the statue and saw it as a place to “remember the struggle of Black Americans and the leadership and contribution of Black Women in that struggle.” Every year the National Park Service, now responsible for the monument, organizes a ceremony on Bethune’s birthday and the anniversary of the memorial’s dedication. But a 2009 report by the NPS found that the reconstruction of the park had altered the relationship between the memorial site and the recreational park, finding a “separation” between the areas. Ironically, the current situation may have transformed Lincoln Park back into the space for dialogue desired by the NCNW. Protestors, locals and others have been exchanging opinions about the future of the space. 

Discussions about the fate of the Emancipation Memorial need to recognize that it does not stand alone. Since the 1970s, it has not just been Lincoln and a slave. A Black woman, with a remarkable and inspiring story, faces them. Her inclusive gesture reaches out towards them. 

The Bethune Memorial was probably not what Douglass had in mind when he called for a statue which represented the African American “erect on his feet like a man.” Perhaps he could not conceive of a representation of freedom which wasn’t masculine. The Black women of the National Council of Negro Women could and did. The statue they created offers, in part, a corrective to the lack of Black agency in the earlier monument. Does it mean that the Freedmen’s Memorial should not be read as a racist portrayal of white domination? No. Does it mean that protestors are wrong in calling for its removal? No. This is something that should be decided by the Black residents of D.C. who must live alongside this statue. But the Bethune Memorial might suggest ways in which commemorative spaces can operate as places of dialogue. And the story of the NCNW’s creation of its memorial is every bit as important and remarkable as that of the formerly enslaved who funded the Emancipation memorial. It too should be told, and ‘Mother’ Bethune must not be forgotten. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176409 https://historynewsnetwork.org/article/176409 0
SCOTUS's Thuraissigiam Decision is a Threat to all Undocumented Immigrants

Chicago, July 13, 2019. Photo by Charles Edward Miller. CC BY-SA 2.0




The Supreme Court just made it a lot harder for asylum seekers, but its June 25 decision in Department of Homeland Security v. Thuraissigiam  also poses a fundamental threat to equal protection of the law for all undocumented immigrants. The complex legal arguments about habeas corpus and the plenary power doctrine all boil down to the opinion’s basic point: immigrants are not protected by the Constitution unless they have been legally admitted to the country.  This stark declaration threatens to overturn decades of court decisions and the Constitution itself. 


The Vijayakumar Thuraissigiam case revolves around the issue of whether a Sri Lankan man who crossed the border without authorization and was detained 25 feet inside the United States should be considered to be within the territory of the United States or still on the threshold of the country. In a 7-2 decision, the Court decided that Thuraissigiam should be imagined to be outside of the country and therefore not have 5th and 14th Amendment due process protections that would allow him to challenge his detention in court.  However, in her dissent Justice Sotomayor points to a long list of cases stretching back to the late nineteenth century that affirm the right to judicial review in deportation cases when constitutional rights are at issue.  


Even though the Thuraissigiam case revolved around the constitutionality of the expedited removal process written into the 1996 Illegal Immigration Reform and Immigrant Responsibility Act (IIRIRA), the “entry fiction,” as it is known in the legal world, has actually been part of jurisprudence since the early twentieth century when the Court made a distinction in Yamataya v. Fisher between noncitizens who were arriving to the country versus those already inside the United States, affirming that those within the country had due process protections. 


The entry fiction was reaffirmed in 1953 in another case when a permanent resident of the United States, Ignatz Mezei, returned from Hungary and was excluded at the border and held at Ellis Island pending deportation. In this case, the Court held that even though Mezei was being held at Ellis Island, he was not entitled to due process because he had technically not entered the country. As a result, Mezei could be held indefinitely pending deportation without any recourse to the courts to challenge his detention.

The entry fiction and the indefinite detention of asylum seekers surfaced again in the mid-1980s when thousands of Cuban refugees found themselves in indefinite detention because they had committed offenses that made them ineligible for asylum and yet could not be deported to Cuba. Circuit Courts ruled that they had no right to a judicial review of their detention since they had still not been legally admitted to the country.  Even though Mariel refugees had been living inside the United States for several years, some had married American citizens and had US citizen children, the court fancifully imagined them as being still outside US territory.  At a congressional hearing in 1988, an exasperated Representative Robert Kastenmeier declared, “We cannot stick our heads in the sand, hiding behind the legal fiction that the Mariel Cubans are not really here and therefore not due any legal rights.” 

For more than one hundred years, the entry fiction has enabled the US government to deny immigrants due process protections that the 14th Amendment clearly indicates apply “to any person within its jurisdiction.” Although Justice Alito seems to restrict the ruling to people who entered the country within the previous 24 hours and within 25 yards of the border, the logic of the decision poses a more ominous threat to all immigrants who were not lawfully admitted. 


As Justice Sotomayor writes in her dissent, “Taken to its extreme, a rule conditioning due process rights on lawful entry would permit Congress to constitutionally eliminate all procedural protections for any noncitizen the Government deems unlawfully admitted and summarily deport them no matter how many decades they have lived here, how settled and integrated they are in their communities, or how many members of their family are U. S. citizens or residents.” 


It is this threat to more than 10 million immigrants living in the United States without authorization that makes the Thuraissigiam decision such a blow to the basic principles of freedom and justice. It would be odd for a country that imagines itself to be a beacon of hope for people around the world to deny basic constitutional protections to asylum seekers when they finally cross our threshold.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176454 https://historynewsnetwork.org/article/176454 0
Monumental Folly




To many curators and historians, President Trump’s National Garden of American Heroes might sound familiar, an echo of Catherine Reynolds’ $38 million offer to the National Museum of American History (NMAH) in 2001 to fund a hall of great achievers.  She mentioned Michael Jordan, Jonas Salk, Steven Spielberg, Oprah Winfrey, and Martha Stewart, among others.  Kenneth Behring, who had funded a wildlife hall at the National Museum of Natural History, had pledged $80 million to the history museum. He was keen on an exhibit featuring great Americans and would eventually have a major say on “The Presidency,” obviously featuring great white men, and on “The Price of Freedom,” a paean to military triumphs.  When NMAH curators complained that they needed time for research on presidents, Behring said they could find all they needed in encyclopedias.  


During the 1990s, Smithsonian exhibits had provoked controversy, in particular "The West as America" at the National Museum of American Art in 1991, the aborted strategic bombing exhibit featuring the B-29 Enola Gay at the National Air and Space Museum in 1994, and "Science in American Life" at NMAH in 1994.  There followed a concerted effort to strip conflict from history exhibits, a trend that unfortunately continues.  Smithsonian Secretary I. Michael Heyman decreed that the Institution would no longer mount controversial exhibits, green-lighting corporations to exert more influence on exhibit content.  "Controversial," in Heyman's formulation, often equated to cutting-edge scholarship.


National Museum of American History curators wrestled with the implications of the Reynolds gift.  At a Congress of Scholars meeting, it was pointed out that the proposed achievers exhibit did not resonate with any scholarship of the past century and, more importantly, that it was not the role of curators to do the bidding of donors but rather to propose exhibit ideas informed by scholarship.  If donors dictated encyclopedia-informed exhibits, then museum curators were superfluous; anyone could curate exhibits.


Meddling in exhibit generation was nothing new, but as scholarship became more central to exhibits and donors more demanding, the Society for History in the Federal Government (SHFG) created Museum Exhibit Standards in 1997. Most scholarly organizations quickly signed on. Victoria A. Harden, historian at the Office of NIH History and its Stetten Museum, headed a SHFG committee that drafted standards that staunchly defended curatorial control.  Exhibits should be “founded on scholarship, marked by intellectual integrity, and subjected to rigorous peer review,” it stated.  Museums should identify stakeholders, be aware of diversity, and when dealing with controversial subjects “acknowledge the existence of competing points of view.”  Lonnie Bunch, then head of curatorial affairs at the history museum, and curator Paula Johnson joined the SHFG committee that authored the standards.


The creation of standards was prescient, for when the implications of Reynolds' achievers exhibit reached academia, there was a groundswell of support for museum curators, beginning when the Organization of American Historians demanded that the Smithsonian adhere to the exhibit standards, followed by an American Historical Association resolution that called upon the regents to revise the agreement with Catherine Reynolds.  Eventually, Smithsonian Secretary Lawrence Small's support of Reynolds in particular and of donor intrusion in general drew protests from eighty scholarly organizations.  Incredibly, or perhaps predictably, such heavyweight fire from the country's leading scholars bounced off Secretary Small and the Smithsonian Board of Regents, revealing contempt for academicians, public historians, and especially museum curators. 


Catherine Reynolds, like Kenneth Behring, leveraged a fortune to enhance social ambitions.  Hers is an intriguing stairway to wealth and prominence, but it does not include a step featuring a sophisticated understanding of history. When someone at a June 2001 meeting suggested that the exhibit would need to be grounded in history, her reply was, "Oh, you mean in chronological order."  Her contract stipulated that she would appoint ten of the fifteen members of the selection committee, giving her control of exhibit content. 


There was an intriguing backstory to the Catherine Reynolds hall of achievers idea, for it closely resembled the American Academy of Achievement run by her husband, Wayne Reynolds, who had earlier attempted to interest Kenneth Behring in a museum of achievement.  Each year the academy chose thirty prominent achievers from across the country to receive a Golden Plate Award and sponsored an annual four-day retreat.  The tie between Catherine Reynolds' gift to American History and her husband's Golden Plate Awards largely escaped public scrutiny, although the people she mentioned as potential great achievers for the exhibit were all members of the American Academy of Achievement.  Significantly, the Reynolds gift announcement came shortly after Secretary Lawrence Small received his own Golden Plate and billed the Smithsonian $14,600 to charter an executive jet to attend the San Antonio award ceremony.  This tangle of backslapping, self-praise, donations, and Smithsonian exhibit space did not displease the ever-complicit Board of Regents. 


The Reynolds gift challenged curatorial integrity, yet in a money-hungry museum starved of exhibit funds the hall of achievers proposal found purchase, and several curators joined an effort to weave the project into a respectable historical narrative while others criticized the project.  


In July 2001, a NMAH Congress of Scholars’ memo to the board of regents provoked a harsh response from museum Director Spencer Crew, not in support of his curators but rather of the Reynolds proposal.  He was “extremely disappointed,” he replied, “with the timing and the tone” of the memo.  The donor agreement review, he wrote, went through the Office of the General Counsel, the Under Secretary, the Secretary, and the Board of Regents, and he condemned our memo for demonstrating “a lack of respect for this exhaustive review process.”  In retrospect, of course, that system of checks failed miserably, not only with such donors as Reynolds and Behring but also, it would turn out, in monitoring Secretary Small and his minions. 


Crew ignored the fact that the great achievers exhibit came not from museum staff but from a donor, and it drew upon the donor’s hubris and achievement obsession, rather than scholarship.  His reply to our memo, alas, had Smithsonian Castle fingerprints all over it.  What would museum visitors learn from a series of success stories that they couldn’t get from People Magazine? 


The attempt by some curators to placate Reynolds failed, and on February 4, 2002, Catherine Reynolds pouted and took back her $38 million.  "Never in our wildest dreams," she wrote to Secretary Small, "did we anticipate that the notion of inspiring young people by telling the stories of prominent Americans from all disciplines would be so controversial."  Her idea for the exhibit, she explained, focused on the power of an individual to make a difference and was "the antithesis of that espoused by many within the Smithsonian bureaucracy, which is 'only movements and institutions make a difference, not individuals.'"  Several curators were aghast that Reynolds so poorly understood and so gravely misarticulated their arguments.  Neither Reynolds nor Behring was interested in history; both were intent on celebrating a conglomeration of callous capitalists, jocks, and TV personalities, and their donor desire floated without a historical anchor.  There was a wax-museum sophistication to it.  Catherine and Wayne Reynolds never understood that what grated on the staff was not only the simplistic achievers idea they foisted on the museum but also the threatened fracture of the curatorial prerogative to originate, shape, and mount exhibits.  Even all the resources of the Castle could not batter down the will of American History curators, who, while attempting cooperation with Reynolds, held to history standards.


With this cautionary event in mind, one can only imagine a national garden monument filled with statues vetted by the White House staff and lacking the guidance of museum standards, not to mention history.  President Trump's garden of heroes seeks to revive a discredited celebratory tradition that cloaked white supremacy in Confederate iconography.  Trump, Behring, and Reynolds share the presumption that wealth makes them historical experts, a flaw common to a large number of less well-off Americans as well.  


Are statues outdated relics of a time before Covid-19 and police brutality tore at the idea of American innocence and invincibility?  Is there a moment of opportunity before those intent on coopting and taming demonstrations succeed?  Can historians drive a wedge into the illusion of American perfection during a moment of demonstrations and Black Lives Matter support?  Change is on the front foot, and this is no time to allow wealth and ignorance to gain ground.  Achiever exhibits and sculpture gardens seem pathetic sideshows to the powerful history of the country.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176459
A Historian's Reflections on American Dissent

Tony McDade, George Floyd, Breonna Taylor, Minneapolis. Mural by Leslie Barlow as part of the Creatives After Curfew program organized by Leslie Barlow, Studio 400, and Public Functionary. Photo Leslie Barlow. CC BY-SA 4.0




Historically, nothing is new about the current protests and the police brutality that initiated them. Think of Watts in 1965 or Detroit and Newark in 1967, or think of the New York Draft Riots of 1863, the Great Railroad Strike of 1877, the Bonus Army protests of 1932, and the violent suppression of labor strikes during the Great Depression. There have been thousands of protests in American history, and some have verged, like the present one, on outright rebellion. Some have engendered progressive change, but even then it's mostly been modest change. 


Despite all the success of the nonviolent civil rights movement in the 1960s, Minneapolis still happens. Ferguson still happens. We know names like Rayshard Brooks, George Floyd, Breonna Taylor, Ahmaud Arbery, Tony McDade, Michael Brown, Riah Milton, Eric Garner, Tamir Rice, Dominique "Rem'mie" Fells, Trayvon Martin....


The list goes on and on. But even the recent killings are nothing new. Think of the thousands of lynchings from the 1870s right through to the 1970s. Think of Jackson State, Augusta, and the LA Riots of 1992. What's brought the inhumaneness of four hundred years of American racism sharply into focus in the present moment is social media—iPhones, YouTube, Instagram, etc.—so we see viral videos of police violence perpetrated on African Americans, violence that before the emergence of the smartphone was only heard about and rarely recorded. And on top of that, if you are a thinking, politically aware American and you see the rise of white supremacist groups, neo-Nazis, and anti-Semitism, and a commander-in-chief who relishes division and discord, who is throwing gasoline on the fire, you have to think that racial strife is only going to intensify.


One of the frequent criticisms thrown at protestors is that their demonstration has led to violence or encourages violence. Such critics feel that people who are protesting against police violence are hypocritical if violence or looting occurs during the protest. Back during the anti-Vietnam War protests, this was a constant reproach the hawks hurled at the doves (aka peaceniks). Historically most protestors, when they’ve taken to the streets, were not looking to trash businesses or police cars or create mayhem. They were (and today are) trying to disrupt the quotidian activity of society by using civil disobedience to force the public to focus on the object of their dissent—war, police brutality, racism, homophobia, sexism. Most demonstrations remain nonviolent from beginning to end. But there have been many instances when a demonstration devolves into violence, or attracts criminal actors who exploit the demonstration to steal and loot and vandalize.


What critics of protests and rebellions fail to realize is that the violence that occasionally occurs during a demonstration is most often property damage, and that it pales into insignificance compared to the violence against human beings that Americans are protesting against. 


When the Weather Underground set off bombs in public places, most notably the US Capitol Building, the destruction was nothing compared to the number of lives lost in a single day in Vietnam. When protests in Minneapolis denouncing the death of George Floyd and the ongoing racism of the police, not only in Minnesota but throughout the country, got out of hand, and looting occurred and a police precinct building was torched, we have to put those events into broader historical and social perspective. One hundred fifty years after the abolition of slavery and fifty years after the end of Jim Crow, African Americans are still treated as second-class citizens and are routinely subjected to heavy-handed police brutality. 


We can only wonder “what are a few torched police stations or looted businesses compared to centuries of whites looting Black lives?”


The rioting and looting by opportunists who are taking advantage of the protests only serve to reinforce the prejudices of white supremacists and that, of course, exacerbates the central problem of the racial divide—it becomes a downward spiral. But these protests are essential. They focus attention on the "American Dream" by underscoring the fact that for many (far too many) people, it's not a dream, it's a nightmare. Clearly, a reckoning is coming where the United States must look honestly into its own soul. 


At a time like this one longs for a Gandhi or a King to come along and show us the way. Or a Lincoln or a Roosevelt who took up the challenge of leading the United States through existential crises. But I don't see that happening. Not in 2020. Not on the federal level. What is heartening is to see so many whites participating in the protests sweeping the nation. When all white Americans become just as outraged as African Americans at the systemic racism and the storm-trooper tactics of the police, then, and only then, will the United States start to live up to the ideals it embraced at its creation.


America, clearly, is still a work in progress.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176460
The Fiftieth Anniversary of the Black Action Movement and the Way Forward



Activists in today’s struggles against institutionalized racism and for black lives can benefit from studying a local victory of fifty years ago. In the spring of 1970, the Black Action Movement (BAM) at the University of Michigan led a thirteen-day strike that won a commitment by the university administration to increasing black admissions to 10 percent of the student body, funding for ten recruiters for African American students and one for Chicano/a students, and increased funding for the Afro-American Studies Program and for other supportive services.

African-American students were 3 percent of the student body at the time of the strike. They prevailed because of unity among African-American organizations and between women and men and because of support from progressive and left-wing non-black groups united in the Coalition to Support BAM. Also crucial was support from radical and liberal faculty and from the university’s service and maintenance workers in Local 1583 of the American Federation of State, County, and Municipal Employees (AFSCME).


As in today’s struggles against police violence, progressive and radical whites recognized the importance of following African-American leadership. As is true today, the Coalition to Support BAM concentrated on educating the predominately white student body on the nature of institutionalized racism and the need for affirmative action.  


BAM won endorsement of its demands from many faculty members, faculty organizations, and even from official university divisions such as the Residential College and the School of Social Work.  Some faculty held classes off campus and others cancelled classes. BAM and its supporters picketed university buildings, sometimes locking arms to prevent entry by those opposed to the strike, sometimes instructing pickets to keep their distance from one another and let people cross. Activists also marched through some buildings, such as the business school, making noise to disrupt classes still being conducted. The turning point in the strike came when undergraduate students arose at 5 a.m. on Friday, March 27, 1970, to picket their dormitories and other university buildings. Members of AFSCME Local 1583, which had endorsed the strike demands the previous night, refused to cross the picket lines. The dramatic reduction in university services due to AFSCME's support led the university administration to make significant concessions in the negotiations with BAM. 


University of Michigan students had been learning about the connections between war, racism, and sexism since the previous fall. On October 15, 1969, students shut down the campus as part of the nationwide Moratorium Day to protest the Vietnam War. The local chapter of the New Mobilization Committee Against the Vietnam War (New Mobe) publicized the moratorium among students and the community by numerous means, including calling every member of the student body on the telephone, going door to door in the Ann Arbor community, and sending caravans in the early morning for distributions at Detroit-area auto factories.  Many of the workers were black, and both black and white workers expressed strong support for the students' efforts and gave them money to show that support.  


Students learned a lot from their interaction with workers and community residents and from New Mobe’s emphasis on the connections between the war, racism, and poverty. Among the speakers on October 15 in addition to peace activists were Jack O’Dell, editor of Freedomways magazine, State Senator Coleman Young, who three years later was elected the first African-American mayor of Detroit, and labor leaders Myra Wolfgang of the Hotel and Restaurant Workers Union and Doug Fraser, vice president of the United Auto Workers. Over thirty thousand people gathered at the University of Michigan football stadium for the culminating rally that evening.


For the actions in Washington, D.C. to protest the Vietnam War on November 13-15, New Mobe organized buses and sent 10,000 people to the demonstration, probably the largest concentration of any city or university in the country.  New Mobe had successfully tapped into an immense well of anti-war sentiment among the students and faculty on the University of Michigan campus. At the same time, it educated students on the economic and social impact of the war, encouraged them to view trade unions positively, and worked on developing anti-racist consciousness.  


Although New Mobe speakers and leaders were mostly male, that weakness was partly corrected by the large role assumed by the women’s liberation movement in strike preparations and in the strike itself.  In the fall of 1967 women students had begun a successful campaign to end discriminatory curfews and dress codes. In the fall of 1968, the first consciousness-raising group, the Thursday Night Group, was established by women involved in supporting draft resistance. The next year, the Group protested the Miss Ann Arbor contest. Some of the participants in the Thursday Night Group became key members of the Women’s Liberation Collectives that played a vital role in the Coalition to Support BAM.


Women activists made their influence felt through their positions in the leadership of the strike, the activities of the women's liberation collectives, a women's liberation class, a day care center, and through women's demonstrations.  Nearly three hundred black and white women picketed early in the morning at parking structures on Monday, March 23, 1970, and moved on, chanting, to other places on campus when the police came.  That afternoon, a similar number of women gathered on the university's central square, the Diag, and then marched to a hall where President Robben Fleming was "being entertained for tea."  They sang, "I've got that feeling sister that BAM's gonna shut that mother down."  They questioned Fleming on budget priorities.  The strike newspaper Rainbow commented:  "Fleming's meek defense in this confrontation was sharply contrasted by the strength of the women." 


There was opposition to BAM and to the strike from a variety of places. During a march against the war and in favor of the BAM demands that preceded the strike call on March 19, police picked off and arrested four African-American students from the mostly white group. During the picketing of campus buildings, African-American student picketers experienced far greater hostility from conservative opponents of the strike than did white students. Most well-known was the attack on the university administration by Vice President Spiro Agnew for capitulating to the student demands. 


The BAM strike took place when the labor movement in Michigan was still quite strong. Five months after the BAM strike, the United Auto Workers launched a 67-day strike of 350,000 workers against General Motors that achieved a substantial wage increase, a 30-and-out retirement program, increased pension benefits, and a Christmas week holiday for all. Moreover, 1970 was a moment when the country was shifting leftward, with the first massive Earth Day demonstration on April 22, 1970, the shutdown of campuses over the invasion of Cambodia, the passage of such progressive legislation as the Occupational Safety and Health Act, the Clean Air Act, and amendments extending and strengthening the Voting Rights Act, and Democratic victories in the House of Representatives and governors' races in the midterm elections. 


In the BAM strike, the essential workers who were members of AFSCME were crucial. Today, as we face a continuing pandemic, we all rely on essential workers, many of whom, as in 1970, are people of color. The interconnection of issues today is as vital as it was in 1970. Undermining institutionalized racism through measures such as defunding the police, ending mass incarceration, and enacting legislation for reparations is on the agenda. The movement is also about promoting women's leadership and ending violence and discrimination against women and against members of the LGBTQ community. The mobilization is continuing also because of the need to protect voting rights, for environmental and climate justice, for jobs for all at a living wage, for science-based public health measures to address the pandemic, and for Medicare for All. Support for unionization of essential workers and the enactment into law of the Protecting the Right to Organize Act can happen with a Democratic Congress and President.  


The BAM strike showed local victories are possible. Although the university was less than consistent in its effort to reach the ten percent black student enrollment goal, later cohorts of African-American students and their allies again and again renewed protests to secure attention to the initial goals and program commitments.  Eventually, the university administration itself became a committed exponent of its affirmative action program, becoming a prime defender of university affirmative action before the Supreme Court in 2003.  Nevertheless, the ten percent goal was never reached. Moreover, a ballot initiative led to the banning of affirmative action programs in Michigan in 2006. In recent years African-American enrollment has hovered between four and five percent. 


Achieving permanent change in the country as a whole will require uprooting the structures of institutional racism and establishing new egalitarian structures. It means remaining united and organized, continuing a program of education and self-education, and countering the forces that are still wedded to racist values. The crises we are living through are difficult but there is a way to a better tomorrow. As the leaders of BAM said on the conclusion of the 1970 strike on April 2: “We have shown that, from a well-planned and co-ordinated strategy, coupled with firm internal discipline and undergirded by a strong moral purpose, profound changes can be wrought in the established order; without violence.”

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176453
Trump Made it Manifestly Clear: The Discussion of National Destiny is Ongoing

Frederick Jackson Turner, 1902



Before there was a #BLM, before there was a Donald Trump, before there was an American Century, there was a frontier. Not a metaphorical frontier, like space or the edge of human knowledge, but an actual frontier. It marked the Western edge of civilization in America, and beyond it lay seemingly endless plains and mountains – empty of everything except the treasure and adventure which waited for the man (and it was, of course, always a man) who was brave enough to go out and take it with nothing but his own two hands. The taking of that theretofore unseen land and its treasures, from the Appalachians to the Pacific coast, was more than an opportunity for a young nation – it was an almost sacred duty for every young man worthy of the name. It was a fearful dream, a terrible hope, and a beautiful promise all rolled into one package; it was a birthright, an obligation, and an inevitability – it was a destiny. 

But, more than any of that, it was a blood-soaked lie. 

The frontier existed, and so did the men and women who lived on its edge and ventured beyond it. The treasures were real too, but the land was not empty. Peoples like the Sioux, the Apache, the Ute, and many others were already there, and they had been living in that ‘empty’ land for longer than the ancestors of the European-American settlers had lived outside of caves. Over more than two centuries, the American frontier changed in size and location, but, by the time Frederick Jackson Turner picked up his pen in 1893, the myth of a nation built upon an empty land, and its own manifest destiny, had crystallized. If that myth were still held as gospel truth, it would be no surprise to read the celebration of manifest destiny that the White House released on July 6. One and a quarter centuries ago, that exact message would have seemed no more morally repugnant, socially out-of-step, or historically ignorant than Turner’s Frontier Thesis. The only problem is, one and a quarter centuries have passed since that was true. 

In the opening years of the 20th century, America appeared poised to follow the examples of expansionist European powers into a new imperialist age. Theodore Roosevelt's presidential tenure saw, among other things: the consolidation of American control of several former Spanish colonial possessions, the engineering of Panamanian independence to facilitate the construction of a U.S.-operated canal, the declaration of American naval superiority in the form of the world tour of the Great White Fleet, and the continued violent suppression of Native Americans (which would last until the end of the 'Indian Wars' in 1924) and of African-Americans in both the urban North and the rural South. While the Turnerian view of the frontier did not go entirely unchallenged within the American historical establishment, it would hold sway for much of the first half of the century. 

In a book or a movie, this might be the point when a single heroic soul would be called upon to make things right. The problem is that climbing out of the nadir of American race relations and persuading the country to acknowledge the ocean of blood and tears that allowed the United States to grow into the powerhouse which dominated Henry Luce's American century was too great a task for any one person. The glory of our past is that there were many such heroes, who, together, were up to the task. Out of an abundance of heroes, America was lifted out of the nadir and toward a dream of full freedom. A long line of people like Ida B. Wells, John Hope Franklin, Anna Julia Cooper, Eleanor Roosevelt, Bayard Rustin, Lyndon Johnson, Kenneth Stampp, and Francis Jennings changed the way Americans thought of each other, and how we continue to think about our common history. That is why it is so shocking and so offensive to even imagine an American president who, after those many lifetimes of desperate struggle to overcome that narrative, could be capable of glorifying the idea of manifest destiny. But, despite all of the work of those who came before us, that is exactly what the Trump administration has done this week. 

As we enter the third decade of the 21st century, we appear to be facing many of the same problems our nation was forced to confront in the first decade of the 20th century. The murders of Breonna Taylor, George Floyd, and many other Americans at the hands of trusted authorities who are too corrupt, too callous, or simply too racist to care would have been all too familiar to Ida B. Wells. That’s why America desperately needs a 21st century Ida, an Anna, a Lyndon, and a Bayard to see our situation for what it is and to choose to do the hard work of building a better country for everyone – in fact, we need thousands of them! As a historian, I am used to telling stories that do not have happy endings, and, for now at least, this is no exception. We can take heart that our country and our discipline have come a long way from the nadir and Frederick Jackson Turner. Somewhere between Teddy Roosevelt and Colin Kaepernick, we have managed to pick up a few yards as Americans and as American Historians. We are, perhaps, justified in taking a small measure of pride in the work that has already been done and the distance that has already been run, but we have miles, perhaps even marathons, that are still left ahead of us. If we still needed proof of that, the president just made it manifestly clear. Let’s get to work.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176458
Barry Zorthian's War: The Pentagon and the Press in Vietnam



When I arrived in Saigon in early 1966 as bureau chief for NBC News, I walked headlong into the terrible relationship between the government and the press during the Vietnam War. Before I landed in Vietnam, relations had already fallen apart between the press, the military, the State Department, and the White House. The contentious divisions that at one time were only skirmishes between the press in Vietnam and government and military officials had become open warfare. In my many years covering the war the problem would only get worse. In the years after Vietnam the mutual mistrust between the government and the press would further intensify as the United States engaged in a series of late-20th-century wars. Today's distrust of the media started in Vietnam, and it made possible today's climate of "fake news" accusations and open governmental hostility toward the press.

In Vietnam in the 1960s most officials were unhappy with what we presented in our reporting. Though we were free to roam wherever we wanted and to photograph and interview anyone we chose, including those in uniform on the battlefield, we did not always have the trust of the men in command. In time things only went downhill. Though it is difficult for me to speak for everyone, when it came to getting accurate information about an event, especially when death was part of the story, we in the press found ourselves dealing with a toxic mess in a war that increasingly seemed endless.

When I took the desk in Saigon, David Halberstam, Neil Sheehan, and Peter Arnett, among other enterprising reporters, had already started to open the public's eyes with their effective, detailed, tough-minded, and riveting reporting. The official military and civilian briefers were starting to recoil with fear when confronted with the truth--call it the real war--and thus they attempted to limit our knowledge of the details of everyday action. I learned quickly that State and Defense did not always agree. Though one person in Saigon was in charge of all information, there was often a sharp disconnect between those two branches of government. 

We knew there was a man behind the curtain orchestrating the dissemination and flow of information. He was Barry Zorthian, who headed JUSPAO (the Joint United States Public Affairs Office). Zorthian's role as chief of information from 1966 through 1970 was basically to advise the ambassador on how information from both the civilian side and the military reached the press, but his main role was to control the flow of information to the press. During my coverage of the war for more than six years, I had no idea what motivated Zorthian or what rules he lived by.

Now I know what he thought because of a 1970 speech called "Effective Press Relations," recently discovered by Donald M. Bishop of the Marine Corps University. Zorthian gave the speech to the Command and Staff College the year he ended his more than four-year tour in Vietnam, and it was subsequently published in the Marine Corps Gazette. 

Barry Zorthian had a carefully thought-out philosophy of how to disseminate information. He was intellectually high-minded. The truth was important to him. Accuracy was important. But he was in and of a government for which the malleability of the truth was sometimes more important than its reality. He was first and foremost a high government official with a mandate to make America look good. He had a role to play and he played it well. I know personally and now from his speech that he believed in a free press. This does not mean he always followed that dictate. Considering the pressure he had to be under from every branch of government, I understand how his beliefs were probably at times impossible to follow. Because of an ever-changing, fluid war, Zorthian never fully achieved what he wanted.

His ideas are particularly relevant today because we live in an era of vicious slurs of fake news fostered by official attempts to harness the press in Washington, especially at the White House. This piece is more about attitudes than specific events. It is about how people who controlled information in Saigon went about their jobs particularly as the war continued and there were always more questions than answers. Zorthian believed that the government's approach to information had been disingenuous, if not purely deceptive and misleading. He believed that "you simply cannot get away with a gap between reality and your (meaning a briefing officer’s) articulation of it."

I was in Zorthian's presence at least once a week for more than three years during my first tour. I knew what I was getting from him every moment. I knew that he knew when he was dissembling, modifying, stretching or shrinking the truth to fit an official need. In some ways we were playing a game in which each of us believed we had the upper hand. Because I rarely gave in to his needs and used the information he gave me for my own (meaning NBC's) needs, I never felt used. Truth is, I did not need Zorthian's help in covering combat. It was important to deal with him because of his official capacity, but I had no need for what he had to offer in my daily work.

We sometimes played a game. I learned from him what he wanted me to know, mainly about Vietnamese politics. He learned from me that I took most of that information and usually buried it because I had no outlet for it. We were both cynical in our relationship, but we made no pretense it was otherwise. I was not naive, nor were most of the press who covered the war, and neither was he. Controlling the flow and amount of information is what Zorthian tried to do. Our job in the press was to take a long hard look behind the curtain in pursuit of the truth--not always an easy task, and rarely fulfilled without flaws.

Zorthian was rarely completely open. He gave you just enough information but not too much. He maintained as close control over information as possible, divulging only what benefited America's position on Vietnam. I gleaned inside information about Vietnamese politics. My only regret is that, because of the demands of the producers in New York, we did not cover more local political stories about Vietnam and the Vietnamese. We mostly covered combat with American troops in the field, and we did that very well. Because we neglected everyday life in Vietnam, however, our storytelling was never complete.

Barry Zorthian died in 2010 and did not live to see what life has become for a journalist in the era of Donald Trump. He would be appalled. Zorthian thought that the confrontation between the press and the government in those years was the worst in history. That was then; today it is even more severe. It can fairly be said that where we are today in our poisonous relations between the press and the government is a result of the Vietnam War, where we openly went at each other with every weapon at our disposal. When the Pentagon or State was unhappy with what we did, which was often, there were complaints to our home offices. There were occasional investigations. Nothing went anywhere because those complaints were mostly ignored, and rightly so.

As laid out in his speech, and from what I saw over the years in my dealings with him, he was against lying to the press. He wanted to keep security restrictions to a minimum. He believed in setting clear ground rules with the press so there would be no misunderstandings. He wanted his briefing officers to be firm with those ground rules and to take the initiative when presenting a point of view. According to the speech, he thought a public affairs office should work to blunt the harm that might ensue from a damaging story. Make it less bad than it might otherwise have been was his mantra. But do not lie.

His biggest problem was controlling military public affairs officers who doted heavily on security and wanted every action they described to stay in the shadows--impossible because of the freedom of movement we had. He thought that too often the military did not understand the distinction between information and publicity. Military briefers lacked training. From my dealings with the military, I knew they resented being told what to do, even by their bosses. They felt a need to protect their own, and to do so they would err on the side of divulging almost no information. After the Vietnam War, this led to the Pentagon-fostered concept of embedding in the wars that followed: a way to control reporting, as the term implies, by inserting reporters inside units, thus controlling and limiting their movement. By any other name it is censorship, because it limits free access to a story. Recall that in Vietnam we had freedom of movement. That did nothing to end the war, but it allowed us to cover combat in an open way. It is sad to note that every news organization I know gave in to embedding and, as far as I am concerned, gave up something of their birthright to a third party, the Pentagon.

I am certain that others who were in South Vietnam when I was, or who came after, may not agree with me in whole or in part. Each experience and memory is different. I accept differences. I welcome differences. I know this: if Barry Zorthian were alive, he would be revolted by what is going on between the press and the government amid the increasingly difficult partisan divide.

Our way around the new and frequent restrictions in Vietnam, restrictions that did not make sense, and around the frequent untruths we believed briefing officials created, which made even less sense, was our belief in the freedom of the press. That entitlement we believed we enjoyed extended to everything we did to get the facts we reported day to day. It may still be possible today. I still believe in it.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176408
Who Was Our Worst President? Think About It When a Grim 75th Anniversary Arrives

Hiroshima, Two Months after the Atomic Bomb, October 1945.





Princeton Professor Sean Wilentz, who teaches the history of the Revolutionary War and early American history, considers Donald Trump “without question the worst president in American history.”


In Rolling Stone, he cites “Trump’s astounding incompetence in recent months—worsening the effects of the lethal COVID-19 pandemic, mishandling the ensuing economic disaster, and maliciously inflaming racial tensions.”


Runners-up are James Buchanan and George W. Bush, neither matching Trump’s “record of bringing on or aggravating three devastating crises at the same time....”


I see Donald and George W. tied for second place. Bush let 9/11 occur, then used it to excuse aggression in Afghanistan and (oil-producing) Iraq. He also instituted the torture of prisoners.


Professor Wilentz’s argument against Trump could be stronger, adding foreign offenses. My points:


  • Although promising peace in his first campaign, President Trump expanded and intensified warfare in Asia and Africa. This included the bombing of Syria, which citizen Trump had repeatedly warned against. Congress authorized none of those actions.
  • He assassinated an Iranian general and threatens Iran and Venezuela (both oil producers).
  • Violating statutes against exports that promote conflict and exports to foes of human rights, he has aided Saudi Arabia’s bombing of Yemeni civilians. He also furthers its nuclear ambitions.
  • Without constitutional authority, he has presumed to make law by repealing treaties on his own. These include the Intermediate-Range Nuclear Forces (INF) Treaty, meant to reduce nuclear stockpiles, and Open Skies, meant to lessen the risk of conflict.
  • He has imprisoned lawful refugees, splitting children from parents.
  • His policies hinge on how they can affect his political and business interests. Withholding aid to Ukraine unless it investigated Joe Biden got him impeached. He allegedly asked China to buy more U.S. soybeans to win votes from American farmers.

More surprises may pop up before November 3. In 2011 and 2012, Trump repeatedly predicted that President Obama would attack Iran to help his reelection. So if dwindling poll numbers make Trump desperate, he may try what he projected onto Obama: a new war.


Right now, however, here is my candidate for the title of “America’s Worst President.”


He gave them hell


“Give ‘em hell, Harry!” partisans of Harry S. Truman often shouted at campaign rallies. And he rained hell on two Far Eastern peoples.


Truman was a Missouri haberdasher who became a judge, a senator, and briefly vice-president before succeeding President Franklin D. Roosevelt on April 12, 1945. 


A local talk-show host recently called Truman “one of our greatest presidents.” I place Truman at rock bottom. 


He is praised for leading a Senate committee that investigated waste and overcharges in military production, for banning military racial segregation, and for helping post-war European economies through the Marshall Plan. But the enormity and lasting effects of his misdeeds hugely outweigh any good deeds. 


First of all, Truman introduced the world to nuclear terror.


On August 6, 1945, he condemned as many as 180,000 children, women, and men of the city of Hiroshima to agonizing deaths. In so doing, he ignored The Hague Convention’s bans on sneak attacks, bombardment of defenseless communities, and arms causing unnecessary suffering.


Imagine. A bomb releasing many times the sun’s heat, ejecting people and buildings miles high into a mushroom cloud. A woman with a baby—converted to charcoal. Skin falling from people’s bodies. Viewers of the explosion—their eyes melting. Winds of 500 miles an hour sucking people from buildings. Glass shards flying at 100 miles an hour—serving as guillotines (From Dr. Helen Caldicott).


Truman announced a military and scientific triumph: “A short time ago, an American airplane dropped one bomb on Hiroshima and destroyed its usefulness to the enemy.... We have now added a new and revolutionary increase in destructiveness.... We have spent $2 billion on the greatest scientific gamble in history and won.” He mentioned the Japanese people just once—implying their “utter destruction” if leaders rejected U.S. terms.


Truman had no qualms about playing “Almighty God,” which he often invoked. He slept well. Pleased with his handiwork, on August 9 he similarly attacked Nagasaki. There he smote dead as many as 100,000 more civilians.


Why did he do it?


The popular belief—which Truman and staff helped to spread—was that the bombings saved myriad American lives by ending the war quickly. This implies that Japanese lives did not matter. They mattered not at all to the man from Missouri, who harbored racist attitudes toward those he called “Japs,” “Chinamen,” and “niggers.”


Actually, the Japanese government tried to surrender before the bombing. Truman knew it, proved by his diary entry for July 18, 1945, relating a talk with Prime Minister Churchill: “Stalin had told P.M. of telegram from Jap Emperor asking for peace.”


In any case, did the new devices have to be demonstrated on people—twice? To one who regarded Japanese as subhuman, the answer seemed to be yes. Differences in the “Little Boy” and “Fat Man” bombs suggest that humans served as guinea pigs in testing the effects of new weapons.


The advance of the Soviet Red Army, which had belatedly joined the fight against Japan, doubtless concerned Truman. The bombs would show the Russians and the world who was boss.


Remember too, the Truman Committee’s chairman had exposed waste in military production. His attitude toward the costly atomic bombs might be expressed in this question (attributed to Donald Trump): ”If we have them, why don’t we use them?” 


Today nine nations possess about 13,400 nuclear bombs, 6,372 of them Russian and 5,800 American. The strongest, a Russian hydrogen bomb, equals 3,800 Hiroshima bombs.


War annihilating all human life has repeatedly threatened. Yet after 75 years, our supposed democracy still entrusts doomsday weapons to one man, no matter how stupid, impulsive, ignorant, or hateful he may be.


Should any president have that much power?


Korea and more


Truman made the president the initiator of war. 


Until he ordered forces to Korea without authorization by Congress, nobody in American government had ever argued that a president could lawfully start a war. Writings of the nation’s founders show their intention to reserve that authority to Congress.


Truman, expecting the incursion into the South by North Koreans on June 25, 1950, was eager to fight them. With the Soviet delegation absent (protesting Red China’s non-seating), on the 27th he got four permanent members of the UN Security Council to rubber-stamp his decision.


The United Nations Charter had been signed on June 26, 1945, in San Francisco, pledging to end “the scourge of war.”  It required all five permanent members to agree on any action.


Truman’s war—masquerading as a United Nations “police action”—began on June 30, 1950. It ended in stalemate and armistice under President Eisenhower on July 27, 1953, six months after Truman left office.


The war took nearly five million lives. More than half were civilians, mostly northern victims of saturation bombings. Destruction of cities and towns with napalm flouted international law. “This is moral degeneracy,” journalist I. F. Stone wrote. Even General MacArthur regretted the cruelty. 


War crimes included official massacres of civilians. A North Korean accusation of biological warfare was roundly denied by Washington but likely based on fact. The war almost went nuclear.


U.S. deaths numbered over 54,000 (original toll) or some 37,000 (revised toll). Legions of Chinese “volunteers” succumbed too. 


Every president thereafter has imitated Truman by waging war, overtly or covertly, without the prior congressional approval that the Constitution requires. Millions more have died in those illegal wars in Vietnam, Cambodia, Laos, Panama, Iraq, Yugoslavia, Afghanistan, Pakistan, Libya, Somalia, Syria, Yemen, and elsewhere.


As if nuclear holocaust and presidential war-making were not contributions enough to historical infamy—


  • Truman had Russia and Germany swap roles as ally and enemy, launched a cold war with Russia, and hired Nazis to design weapons against America’s World War II ally. 
  • In 1947 came Truman’s doctrine: “Help free people to maintain their free institutions.” It meant permanent preparation for war, the U.S. as world policeman, and bolstering of anti-Red regimes, however oppressive.  
  • Despite his talk of freedom abroad, he set off a domestic reign of terror, an era of witch hunts and blacklists. He even threatened violence against a music critic who had exercised press freedom by panning the singing of Margaret Truman, Harry’s daughter.
Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176456
Make Digital History a National Common Ground




When I was in eighth grade, I traveled with my classmates to Washington, D.C. for the first time. I marveled at the amount of American history that happened in the iconic buildings of our nation’s capital. Perhaps the most notable of these buildings is the White House, an enduring symbol of democracy. 


From the North Lawn, I saw the same driveway that foreign leaders use when arriving for state dinners. I saw the lush green South Lawn where Marine One touches down. The Rose Garden, where President Kennedy greeted astronauts back from America’s first human spaceflight, is secluded in the trees, next to the Oval Office. Lafayette Square, a public space where generations of activists and ordinary citizens alike have demonstrated for myriad causes, sits in front as a gathering space for onlookers. As I arrive at the Decatur House for work now, I often wonder if I noticed this brick building on the Square those many years ago.


For many students who are fortunate enough to make the trip to Washington, this can be the first step in a lifelong journey of learning. It certainly was for me. So it’s no wonder that for generations, teachers have brought their students to see the White House for themselves. But for all those who come down from Pennsylvania Avenue, there are many more who can’t make the trip because of distance or cost and, this year, because of COVID-19.


No one should be at a disadvantage because they can’t visit D.C., or other historical landmarks like presidential homes and libraries. We can take advantage of our increased dependence on online learning to inspire students, no matter where they live. This is why we’ve brought so many of these resources to our website and even built an app that can take you on a virtual tour of the president’s home. Other institutions popular with tourists, including the Smithsonian Museums and Monticello, are offering virtual tours. The Kennedy Center is offering inspiring performances on their digital stage. Resources like these can send students on new journeys to faraway places or give them an understanding of times long ago. It certainly won’t be the same as seeing historic landmarks in person, but instead of only visiting one or two sites, the options are nearly unlimited.


In historic times such as these, when the country is facing economic strife and the consequences of ongoing racial injustice, learning about how the American people have overcome obstacles, and how our country has transformed and continues to transform over the years, can provide hope and guidance for the future. These lessons may help students put the daily events they are seeing on the news and in their communities in a context they can more easily understand. For instance, how did our country survive the Spanish Flu of a century ago, or what innovations and ideas came out of the Great Recession that we still rely upon today? We can also look to past protests on Lafayette Square to ask how they have impacted the course of history, from the 19th Amendment that gave white women the right to vote to ending the Vietnam War. There’s no doubt that future students will look toward the current moment and ask themselves similar questions.


I think about how meaningful that first visit to Washington was, and how it sparked a passion for history that has defined my career. And while many will not be able to travel from their own house to the People’s House soon, I hope our resources provide students with the same inspiration I had as a young visitor to DC. Our history is our common ground and it continues to shape our collective future. 

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176457
Thinking About Racism Beyond Statues and Symbols

Mural at 38th St. and Chicago Avenue, Minneapolis. Painted May 28, 2020 by Xena Goldman, Cadex Herrera, and Greta McLain.

Photo Lorie Shaull. CC BY-SA 2.0





George Floyd’s death at the hands of police officers contributed to protest and debate about statues and other monuments to the Confederacy, but his life and death should also bring attention to a more recent and troubling history.  Floyd’s autopsy and obituary reveal a life subjected to enormous pressures that ended with a policeman’s knee pressing down on his neck.  In his life and his death Floyd experienced the coercive structures that constrain, punish and eventually kill altogether too many Americans.  Floyd’s birth in 1973 made him subject to what historian Elizabeth Hinton has called the shift from the “war on poverty to the war on crime,” which Richard Nixon had initially launched two years earlier as the “war on drugs.”  It was the same year that the oil crisis, sparked by US involvement in the Middle East, helped to damage an already faltering economy and brought to an end the trend towards greater economic equality for the bottom 90%, according to Thomas Piketty and Emmanuel Saez.


His father’s departure ensured that Floyd grew up in the kind of household most likely to experience poverty as his single mother moved into a tough neighbourhood in Houston, Texas. Athletic ability briefly offered him a chance for a university education, but, like many African American athletes, he failed to receive a degree.  As a consequence, Floyd remained among the 74% of African Americans who lack that educational credential, another contributing factor to poverty and precarious employment.  Unstable relationships with women produced five children, while apparent involvement in criminal activity seeking drugs and money sent him to prison.  Floyd became an absent father to his own children, following his parental example.


A religious conversion seemed to offer Floyd a change in his life’s fortunes and he moved north to Minnesota to be close to his youngest child. Then came COVID-19, which Floyd had contracted before his death.  Unemployment, another common result of the pandemic, followed.  Like many distressed Americans, he died with evidence of drugs in his system, including fentanyl, marijuana, and methamphetamine.  Despite the “war on drugs,” many Americans use these drugs to escape from lives of quiet and not so quiet economic desperation. His life ended in the fateful encounter with the Minneapolis policeman that many people have seen, one of the approximately one thousand Americans killed by police each year.  More Euro-Americans are killed in this way, but proportionately more African Americans, including the unarmed, are subjected to this form of violence as the “war on crime” has turned from metaphor into a militarized policing system.


Placed in historical context, Floyd’s life and death clearly show racial domination as an important causative factor, going back to slavery, segregation, and the violent imposition of white supremacy in North Carolina, the state where Floyd was born, and Texas, where he grew to manhood.  His death in Minneapolis, however, points to the way race entangled with class in northern cities, as Thomas Sugrue and other historians have discussed.  Martin Luther King took up this issue in 1965, culminating in his efforts to organise a Poor People’s Movement in the year of his death. That year Nixon won on a law and order campaign inspired by a backlash to the ghetto revolts, the Civil Rights Act, and the Voting Rights Act.  Had King and Robert Kennedy not fallen victim to violence in 1968, the history that followed might have offered greater hope to Floyd’s family and other poor Americans.


The effort to unite the poor to transcend racial antagonisms was King’s final dream. It was an important theme in Kennedy’s campaign as he deplored poverty in Mississippi, the suicide of young Indians, decaying schools, and unemployed West Virginians.  I enlisted in Kennedy’s campaign that year for those reasons, as a twenty-year-old who believed that poverty and racial oppression were just as important as ending the war in Vietnam.  Death ended those hopes in 1968, and the United States moved into a more punitive approach to poverty and an increasingly polarised culture during Floyd’s childhood and adult years.


Perhaps now the protests in cities and towns throughout the United States can revive those campaigns for equality and lives of ‘purpose and dignity’ for all Americans, as Robert Kennedy pledged in 1968. But that requires structural change to an undemocratic political system, a regressive taxation system, unequal education, inadequate healthcare, and the many other areas of American life that contributed to the tragic life and death of George Floyd. As Cass Sunstein has argued, it requires the enactment of Franklin Roosevelt’s “Second Bill of Rights” that he, like King and Kennedy, did not live to realize. It requires what Alice Kessler-Harris described as the politics of “collective responsibility” that has enabled New Zealand, where I now teach American history, to avoid the catastrophe now engulfing the United States.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176455
The Three Political Prodigy Governors of the 20th Century

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015).  A paperback edition is now available.



Three 20th century state governors came to office in their early 30s with Presidential ambitions and potential, but two of them faded fast after dramatic early years in public office.  Both Republican Philip LaFollette of Wisconsin (1931-1933, 1935-1939) and Republican Harold Stassen of Minnesota (1939-1943) made a lot of news in their short, meteoric careers as major public figures. The third, Democrat Bill Clinton (1979-1981, 1983-1992), stumbled on his way to a presidential campaign but ended up having a massive impact on the American people in a presidency portrayed in intensely positive and negative terms.  His wife, Hillary Rodham Clinton, became a major figure as First Lady, then as Senator from New York and Secretary of State in the first term of President Barack Obama.

Philip LaFollette was the younger son of “Fighting Bob,” Wisconsin Governor, Senator, and 1924 Progressive Party Presidential nominee Robert M. LaFollette, Sr., who was acknowledged by historians as one of the greatest state governors and US Senators of all time. Philip LaFollette was also the brother of Senator Robert M. LaFollette, Jr., who was one of the major progressive figures in the tradition of his father, and an influential figure during the New Deal of Franklin D. Roosevelt. His significance and that of his brother Philip is described and fully developed in my monograph, Twilight of Progressivism: The Western Republican Senators and the New Deal (Johns Hopkins University Press, 1981).

Philip LaFollette was, at the time, the youngest governor elected in modern times.  He was about 33 years and 8 months old, and he followed the tradition of Wisconsin progressivism that had been established by his father thirty years earlier.  He was much more outspoken and assertive than his brother, but when their father died in 1925, Philip was only 28, while “Young Bob” had reached the minimum age of 30, so the latter ran for his father’s Senate seat, while Philip was already serving as District Attorney of Dane County (which included Madison, the state capital) from 1925-1927.

Philip LaFollette was, surprisingly, defeated for reelection after his first two year term which ended in 1933, but came back and won the Wisconsin Governorship a second and third time (1935-1939), forming the Wisconsin Progressive Party as his political vehicle, and the LaFollette brothers were at their peak at the height of the New Deal.  But the rivalry between the brothers, the much quieter Bob and the more assertive Phil, led to growing opposition to Franklin D. Roosevelt and the New Deal, and particularly to criticism of any move to abandon the isolationist mentality that gripped much of the nation and the Congress in the late 1930s.  

The growing threat of Germany and Japan caused a major split between the President and the LaFollette brothers.  Phil decided to form a third party movement, the National Progressive Party of America, and had plans to run for President in 1940, as he assumed FDR would not run for a third term.  But the third party effort failed to get off the ground, and in 1938 he lost the gubernatorial race massively, 55 to 36 percent. He never sought political office again, and his involvement with his brother in the America First crusade in 1940-1941 undermined his public reputation.

Phil LaFollette did serve in World War II under General Douglas MacArthur, however, and promoted the lost candidacy of MacArthur in the Republican Presidential nomination battle in 1948.  His brother would manage to keep his seat in 1940, but would then lose in the Republican primary in 1946 to future Senator Joseph McCarthy, accused of not being a true Republican for having kept the Progressive tag even after the failed third party movement in 1938.  Phil engaged in private business in later years, wrote his autobiography, and died at age 68 in 1965.  His widow, Isabel, lived on until 1973, and was interviewed by this author in Madison, Wisconsin, in the summer of 1970, while he was doing research on his book on Progressive Republican Senators.

Harold Stassen was elected Governor of Minnesota at the youngest age in modern American history, being only 31 years and about 9 months old when taking office in 1939.  He had been a child prodigy, graduating high school at age 15, gaining his bachelor’s degree from the University of Minnesota at age 20, and that university’s law school degree at age 22.  He was elected District Attorney of Dakota County, part of the Minneapolis-St. Paul metropolitan area, taking office in 1931 while still age 23, and was reelected in 1934.  He became active in state Republican politics, and announced his plans to run for Governor in 1938.

Known as the “Boy Governor,” he became extremely popular during 1939, and some saw him as a potential Presidential candidate, although he was not old enough in 1940. He had high public opinion ratings even from non-Republicans, who saw him as a future nominee.  He gave the keynote address at the Republican National Convention when he was only 33 years old. After being reelected to second and third two-year terms in 1940 and 1942, he resigned early in his third term to report for active duty in the US Navy in the spring of 1943, and served under Admiral William F. Halsey, Jr. in the Pacific Theatre of World War II. Stassen was awarded the Legion of Merit for his service in that position.  He was promoted to the rank of Captain in September 1945, and released from active duty in November 1945 after two and a half years of service.

No one looking at Stassen’s meteoric rise would have thought that he would never again hold elective office while pursuing perennial failed presidential campaigns. Over time he became a national joke and embarrassment.  He ran for President nine times---1944, 1948, 1952, 1964, 1968, 1980, 1984, 1988, and 1992.  His only serious effort was in 1948, when he won some early primaries over New York Governor Thomas E. Dewey, the eventual nominee, and participated in a political debate the night before the Oregon primary, the first such debate in modern times between contending Presidential candidates.  But he was third in delegates in early ballots at the Republican National Convention, and withdrew after the second ballot.

In 1952, the Minnesota delegation abandoned Stassen and backed Dwight D. Eisenhower, who went on to defeat Ohio Senator Robert Taft for the nomination.  Stassen worked in the Eisenhower administration as Director of the Mutual Security Agency from January to August 1953, and as Director of the US Foreign Operations Administration from August 1953 to March 1955.  Before his service under President Eisenhower, he was President of the University of Pennsylvania from 1948 to 1953.

Stassen also ran for Governor of Pennsylvania in 1958 and 1966, Mayor of Philadelphia in 1959, the US Senate from Minnesota in 1978 and 1994, Governor of Minnesota in 1982, and US Representative from Minnesota in 1986.  Stassen was always perceived as a liberal Republican, a liberal Baptist who marched with Martin Luther King, Jr. in the March on Washington in August 1963.  He spoke out against an embargo on Cuba and against the Vietnam War escalation, and participated as a delegate in the founding of the United Nations in 1945, an institution he supported throughout his long life.  He passed away at the age of 93 in 2001.   

Bill Clinton, the only “Boy Governor” to become president, was elected Arkansas Governor at age 32, older than Stassen but younger than Philip LaFollette.  Clinton first ran for public office at age 28, losing 52-48 to an incumbent Republican Congressman, but was then elected Arkansas Attorney General at age 30 in 1976, and Governor in 1978.  He lost his Governorship two years later, just as Philip LaFollette did, but won the position back in 1982 and kept it for the next ten years, serving a total of three two-year and two four-year terms, with the last term cut short by his election to the presidency.  

Clinton was a “New Democrat”, more centrist and moderate than most Democrats of the 1980s, and he was not originally seen as a serious Presidential contender, particularly after his long-winded nomination speech for Michael Dukakis at the Democratic National Convention in 1988, which drew its loudest cheers when he finally finished.  But with a stroke of luck, including better-known Democrats choosing not to run, he overcame an early loss in the New Hampshire primary and scandals in his private life, emerged as the Democratic Presidential nominee in 1992, and was elected over President George H. W. Bush and Independent H. Ross Perot with only 43 percent of the total national vote. He thus became our third youngest president at inauguration, at age 46 years and five months, with only Theodore Roosevelt (42) and John F. Kennedy (43) being younger when taking the oath. 

Clinton would go on to have a very controversial presidency in many respects, and face impeachment during his second term over scandals in his private life, but he would overcome it and finish his two terms of office with a very high public opinion rating, rare for a president leaving office.  Assessments of his presidency have placed him in the top third of all Presidents, most recently at number 15 in the 2017 C-SPAN Historians Poll.  

His wife, Hillary Rodham Clinton, would be equally controversial, going on to lose the Democratic nomination for President in 2008 to Barack Obama, become the nominee of her party in 2016, and lose in the Electoral College to Donald Trump despite a nearly 3 million popular vote victory. The Clintons have been a major part of the American political scene on the national level now for three decades, and assessments of both Bill and Hillary Clinton remain a controversial topic in the new decade of the 2020s.

Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154375
What Will Happen on November 4?

Steve Hochstadt is a writer and an emeritus professor of history at Illinois College. 


It looks like Biden will beat Trump badly and the Republicans will suffer disastrous losses across the country in November. Although the polls have only been inching toward the Democrats, suddenly articles speculating about what Trump might do if he loses are multiplying.


Trump might declare the elections fake, go on FOX News to say he had really won, call out the National Guard, barricade himself in the Oval Office, or order the Secret Service to shoot Biden on sight.


But what he will do is the wrong question. What matters is what his supporters will do when they lose.


Three groups of supporters are crucial to observe. The media will focus at first on Republican politicians. Will Senate losers in Montana, Arizona, Colorado, and other states jump on the fake news bandwagon? Would Mitch McConnell go quietly if Amy McGrath beats him in a very close race? Maybe Susan Collins will accept defeat in Maine, but what about QAnon promoter Jo Rae Perkins in Oregon? How about Lauren Boebert, QAnon enthusiast and House candidate in Colorado, and the other seven Republican congressional candidates on the ballot who have expressed support for QAnon?


The whole Republican Senate delegation has already cast doubt on the results by allowing their leader to proclaim unchallenged that the election will be fraudulent. On June 22, Trump said the 2020 election “will be the most RIGGED Election in our nations history”. In 2018, the National Republican Senatorial Committee followed up a Trump tweet about “electoral corruption” in the Arizona Senate race by charging that the election they lost there was rigged. Stripped of power, perhaps for years, it’s not hard to imagine Senators Tom Cotton and Lindsey Graham, among others, casting doubt on the results.


Thus far the bravest Republicans in Washington, aside from Mitt Romney, are Pennsylvania Sen. Pat Toomey, who called Trump’s pardon of Roger Stone a “mistake”, and Maine Sen. Susan Collins, who said she won’t campaign against Joe Biden. The majority of Senate Republicans cowers in silence.


Will Trump’s toadies now running our intelligence services and justice system speak up, and what will they say? Across the country, Republican state legislators who lose their gerrymandered majorities could join the chorus.


A second set of Trump supporters use their control of media to send waves of influence into every corner of America. Will FOX News report the official results or attack them? How about the other people to whom Republicans listen, even more partisan and less connected to reality, like Limbaugh, Breitbart, InfoWars, Drudge? Will powerful evangelical pastors Franklin Graham and Robert Jeffress proclaim that their God-given leader was cheated?


The third group is the most important and hardest to gauge – the MAGA-hat-wearing, Confederate-flag-waving white supremacists and conspiracy partisans who make up his legendary “base”. Will they see the end times coming as their messiah is defeated? Will the dozens of armed militia groups adopt 2nd Amendment remedies to the impending takeover of America by radical socialist pedophiles? What will Ammon Bundy, the boogaloo boys, the Oath Keepers, and the more than 500 other anti-government groups loosely allied in the so-called “patriot movement” identified by the Southern Poverty Law Center do? Will the NRA call out its members to fight the commies?


Trump could scream himself hoarse with no effect unless his supporters sing along. As full election results trickle in the days after November 3 and as the implications sink in with the approach of Inauguration Day on January 20, 2021, a coalition of supporters in Washington, state governments, media, and on the ground might throw our country into an existential, not merely Constitutional, crisis.


It’s certain that Trump will do nothing brave himself, will commit himself to no action that he can’t back out of. But the radicals who drive their cars into Black Lives Matter protesters, who bring their assault rifles to the pizza palace, and who believe anything “Q” says are much more volatile, unhinged, and violent.


Whatever answers such questions might elicit now will change over the next few months, as Trump stokes more fear among his supporters, pushes them further away from the American center, and forces his Republican allies to come along.


Watch Reps. Jim Jordan (Ohio), Matt Gaetz (Florida) and Minority Leader Kevin McCarthy (California), Gov. Ron DeSantis (Florida), and Mark Meadows, White House Chief of Staff. Watch wily Mitch McConnell, who shares with Trump a powerful belief in the importance of his own survival.


And watch out for America.


Steve Hochstadt

Springbrook WI

July 14, 2020


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/blog/154376
The Roundup Top Ten for July 17, 2020

Equal Opportunity is Not Enough

by Elizabeth M. Smith-Pryor

The myth of America as an equal opportunity society has historically allowed white Americans to hold out equality as a promise redeemable in the future but rarely available in the present.


How Should Teachers Handle the Movement to 'Rewrite' High School History? Embrace It

by Jack Doyle and Chris Doyle

America today is a product of the past and not immune from its racist legacy. Combating racism, now, requires suspending overly optimistic narratives of its demise.



Americans Are The Dangerous, Disease-Carrying Foreigners Now

by Erika Lee

For centuries, we have been the ones demonizing foreigners as carriers of infectious disease. And we have been the ones banning immigrants in the name of protecting Americans’ public health.



Facing America's History of Racism Requires Facing the Origins of 'Race' as a Concept

by Andrew Curran

Many of the most rearguard and unscientific European notions regarding race have remained deeply embedded in the American psyche.



When Plague Is Not a Metaphor

by Hunter Gardner

It's not always a blessing when current events make a researcher's specialty suddenly and urgently relevant. 



The Goya Boycott is Something Much More than "Cancel Culture"

by Allyson P. Brantley

What William (Bill) Coors complained was “political persecution” was, for boycotters, a tool of political expression — of refusing to financially support policies that maligned and marginalized their communities and those of their allies.



Veterans Go to Washington--So What?

by Nan Levinson

Speculation about the effects of electing veterans to national office is seldom historically informed. Although it's assumed military experience and leadership would shape a legislator's vote, today's partisanship is probably the biggest influence. 



The Campus Confederate Legacy We’re Not Talking About

by Taulby Edmondson

When a fraternity chapter sued him for defamation for remarking that it actively preserved the "Lost Cause" mythology of the Confederacy, the author went to the archives to defend himself. 



How a History Textbook Would Describe 2020 So Far

by James West Davidson

A historian imagines the chapter high schoolers might read one day about this momentous time.



The Coal Strike That Defined Theodore Roosevelt’s Presidency

by Susan Berfield

To put an end to the standoff, the future progressive champion sought the help of a titan of business: J.P. Morgan.


Mon, 03 Aug 2020 08:48:57 +0000 https://historynewsnetwork.org/article/176446