Political Messiahs ~ From Hitler to Trump

“To be ignorant of what occurred before you were born is to remain  always a child.”  — Cicero

It’s January 1933, the month of Hitler’s ascension as Chancellor of Germany, and a new morning for the Nazification of Christianity has arrived.  So said Emanuel Hirsch, for whom there was no distinction between Christian belief and German Volk.  So said Paul Althaus, for whom “the German Hour of the Christian” had arrived.  And so said Gerhard Kittel, Germany’s leading Nazi Christian theologian and Senior Editor of the “Theological Dictionary of the New Testament.”  In the 1920s, Kittel had expressed admiration for Jewish scholars of first-century Palestine, remarking in 1926 that everything Christ said could be found in the Talmud.  But now it is 1933, and in his lecture, “Die Judenfrage,” Kittel clarifies that only Jews of the first century were admirable.  Contemporary Jews, on the other hand, could be tolerated temporarily as guests in Germany, but only until they formed their own non-German Volk.  Kittel would serve 17 months in prison after the war.

For the theologians and countless other Germans, Hitler was the new Luther, indeed, as Dietrich Eckart proclaimed long before Hitler’s ascension to power, the Messiah.[i]  The Nazi Christian movement, abetted by the establishment of the Protestant Reich Church (Evangelische Reichskirche) in 1933, grew by leaps and bounds.  The Evangelicals led the charge, coalescing into a fighting church of young white anti-Semites aligned with Nazism and celebrating the Führerprinzip, the leader principle.  With Hitler at the helm, the dissolute Weimar Republic would be supplanted by a revitalized Germany unrestrained by the Treaty of Versailles.  Crippling inflation would give way to a robust economy, safeguarded by a reconstituted German army.  German Jews, who poisoned the Volk and threatened the campaign of rebuilding, would be eliminated.  In a word, Germany would be made great again.

This cartoonish image, “the seed of peace, not dragon’s teeth,” appeared in the magazine Kladderadatsch on March 22, 1936.  The angel behind Hitler blows its horn to herald Hitler’s arrival as the new savior.

The Führerprinzip was no less salient in Italy, where Mussolini, appointed Prime Minister by King Victor Emmanuel III in 1922, was an inspiration to the young Hitler.  Three years earlier, he had founded the fascist movement that became Italy’s National Fascist Party, and, as PM, he lost no time in transforming the liberal Italian state into a brutal fascist dictatorship.  From the beginning, his followers hailed him as Il Duce or, in Latin, Dux, their supreme leader.  No sooner did Mussolini come to power than Predappio, the village of his birth, became the site of daily pilgrimages, with followers visiting the family crypt and paying homage to his mother, who gave birth to Il Duce in 1883.[ii]  By 1926, Italians far and wide embraced him as, in the phrase later used by Pope Pius XI, a “man of Providence.”[iii]

In 1937, on the eve of Il Duce’s visit to Sicily, one excited Sicilian exclaimed, “We await our father, the Messiah. He is coming to visit his flock to instill faith.”  He was not alone in his veneration.  Pope Pius XI, ever grateful to Mussolini for his key role in signing the Lateran Pact of 1929, and convinced the dictator would restore Catholicism’s privileged role in Italian life, called Mussolini the “man whom Providence has sent us.”  And Mussolini, a fervent socialist and anticlerical until 1919, came under his own spell:  he believed he was indeed the Chosen One, destined to effect Italy’s spiritual rebirth under the banner of fascism.

_________________________

“And on June 14, 1946, God looked down on his planned paradise and said, ‘I need a caretaker.’  So God gave us Trump.”[iv]

So intones the narrator of “God Made Trump,” the video produced by the Dilley Meme Team that has invaded the internet in recent weeks.  Trump, mindful of his divine calling, saw fit to post the video on his social media platform and has broadcast it at campaign events.  What can one say of a Christian god who, in His wisdom, chooses as messianic caretaker a man held liable in court for sexual abuse, whose life is suffused with corruption, misanthropy, and criminality?  Obviously, Evangelical Christians, who hail Trump’s ordained arrival, are content to brush aside matters of character and behavior.  God may well choose as caretaker a man who has lied, cheated, and intimidated his way to power, a man whose pretense of religiosity was openly ridiculed by his own Evangelical vice-president.  During his time in office, Trump’s sole visit to St. John’s Episcopal Church, be it noted, was a crass photo-op, with police using tear gas and rubber bullets to remove peaceful protestors from Lafayette Square, so that Trump’s walk from the White House, to the photographers’ delight, would be straight and clear.[v]

But manifest irreligiosity has not mattered.  For a majority of American Evangelical Christians, immoral acts in personal life do not compromise a politician in the public sphere, where ethical sensibilities may, mysteriously, be revitalized.[vi]  But in Trump’s case the very distinction is moot:  Moral depravity and criminal behavior have been foundational to both spheres.

But wait – haven’t we been here before?  Wasn’t Mussolini, in his biographer’s words, “God’s chosen instrument of Italy’s spiritual rebirth”?[vii]  And wasn’t Spain’s Franco the darling of the Roman Catholic Church, God’s warrior for ridding the nation of godless Communists, tearing down the Second Republic, and restoring the Church’s privileged role in Spanish life?  After the Spanish Civil War (1936-1939), didn’t Pope Pius XII himself, staunch (if silent) supporter of Franco throughout the war, bless Franco as Defender of the Catholic Faith?  In so doing, the Pope managed to overlook the 100,000 Republicans executed by Franco’s Fascist Nationalists during the war, along with an additional 50,000 put to death at war’s end in 1939.  Right into the 1960s, when the Vatican, under Pope Paul VI, began to retreat from Franco, he continued to cloak himself in the mantle of Catholic chosen-ness, appearing at national events, such as León’s Eucharistic Congress of 1964, with the golden chain and cross of the Vatican’s Supreme Order of Christ proudly draped over his army uniform.  Perhaps the National Association of Evangelicals will confer on the irreligious Trump a wearable decoration that consecrates his status as America’s Messiah.[viii]

Back in Italy, Mussolini’s charge, throughout the 1930s, was to make Italy great again, to forge a New Roman Empire that would dominate the Mediterranean and, through a vast colonial empire, become an international power of the first order. The aim of his regime, he proclaimed in open-air speeches broadcast over radio, was “to make Italy great, respected, and feared.”[ix] Now, of course, Christians are exhorted to embrace Donald Trump as the divine caretaker who will “fight the Marxists” in our midst.  So avers the narrator of “God Made Trump.”  “Donald Trump carries the prophetic seal of the calling of God,” chimes in cultist pastor Shane Vaughn.  “Donald Trump is the Messiah of America.”[x]  For Trump, of course, it is the “radical left,” a jumble of Communists, Marxists, atheists, and Democratic appointees, that threatens him, and by implication, the nation.  They have all coalesced into a mythical “deep state” that, in some imponderable way, stole the 2020 presidential election from him and led to the criminal indictments and civil suits in which he is ensnared.  His plaint is about as convincing as Hitler’s insistence that, in some equally imponderable way, international Jewish business interests conspired to defeat Germany in World War I and then impose the ruinous Versailles Treaty.

In America today, we are long past George Santayana’s cautionary words of 1905, “Those who cannot remember the past are condemned to repeat it.”[xi]  For a disconcertingly large segment of the American electorate, including a majority of surveyed Christian Evangelicals, the past has indeed gone unremembered, and a new Chosen One has appeared on the scene.  Throughout the 20th century, the deification of autocratic political leaders has had horrendous consequences.  To indulge in it yet again, and now in support of a psychopathic miscreant, is to proffer an apotheosis that is, quite literally, mind-less.

Trump was a child of privilege who never outgrew the status of a privileged child.  We are well advised to heed Cicero’s admonition, to learn what occurred before us, and to push beyond Trump-like arrested development that consigns us to permanent childhood – a childhood that, by encouraging the sacralization of political figures, is pernicious and potentially disastrous.

___________________

My great thanks to my gifted wife, Deane Rand Stepansky, for her help and support.

[i] See, for example, David Redles, Hitler’s Millennial Reich:  Apocalyptic Belief and the Search for Salvation (NY:  New York Univ. Press, 2008), chapter 4, “Hitler as Messiah.”

[ii] Predappio, liberated from fascism in 1944, remains a Mussolini pilgrim site to this day.  Predappio Tricolore, its souvenir shop, “teeming with fascist memorabilia, including copies of Adolf Hitler’s Mein Kampf, has always done brisk trade.” “Pilgrims to Mussolini’s birthplace pray that new PM will resurrect a far-right Italy,” The Guardian, October 23, 2022 (https://www.theguardian.com/world/2022/oct/23/pilgrims-to-mussolinis-birthplace-pray-that-new-pm-will-resurrect-a-far-right-italy).

[iii] John Whittam, “Mussolini and the Cult of the Leader,” New Perspective, 3(3), March 1998 (http://www.users.globalnet.co.uk/~semp/mussolini2.htm).

[iv]  “God Made Trump” (https://www.youtube.com/watch?v=lIYQfyA_1Hc).

[v] “Trump shares bizarre video declaring ‘God  made Trump,’ suggesting he is embracing a messianic image” (https://www.businessinsider.com/trump-shares-bizarre-video-declaring-god-made-trump-2024-1); “‘He did  not pray’: Fallout grows from Trump’s photo-op at St. John’s Church”  (https://www.npr.org/2020/06/02/867705160/he-did-not-pray-fallout-grows-from-trump-s-photo-op-at-st-john-s-church).

[vi] In 2010, according to a public opinion poll conducted by the Public Religion Research Institute, only three in 10 American Evangelicals believed immoral acts in personal life did not disqualify a person from holding high public office; in 2016, following Trump’s election to the Presidency, this percentage had increased to 72%.  What, pray tell, is the percentage now?  These statistics come from Robert Jones, president of PRRI. “A video making the rounds online depicts Trump as a Messiah-like figure” (https://www.npr.org/2024/01/26/1227070827/a-video-making-the-rounds-online-depicts-trump-as-a-messiah-like-figure).

[vii] See Denis Mack Smith, Mussolini (Essex, UK:  Phoenix, 2002), passim.

[viii] For the key facts of the Spanish Civil War, see the Holocaust Encyclopedia (https://encyclopedia.ushmm.org/content/en/article/spanish-civil-war#:~:text=During%20the%20war%20itself%2C%20100%2C000,forms%20of%20discrimination%20and%20punishment).  On the Church’s measured retreat from Franco, and Franco’s appearance at the Eucharistic Congress in León of July 1964, see “Franco Stresses Spain’s Ties to Church,” New York Times, July 13, 1964.

[ix] See, inter alia, Stephen Gundle, Christopher Duggan, & Giuliana Pieri, eds., The Cult of the Duce:  Mussolini and the Italians (NY:  Manchester University Press, 2013).

[x] Vaughn’s declaration of Trump’s divinity is quoted on various internet sites, e.g., https://twitter.com/RightWingWatch/status/1610661508601487363.

[xi] In 1948, Winston Churchill  recurred to Santayana’s warning even more tersely:  “Those that fail to learn from history are doomed to repeat it.”

Copyright © 2024 by Paul E. Stepansky.   All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

The Moral Health of America**

The criminal indictments of Donald Trump, we are told, ad nauseam, represent a “test,” indeed, a “stress test,” for American democracy.  And it is a test, some claim, that we are failing.  Really?  Trump’s criminality has resulted in multiple criminal indictments for which he will stand trial and hopefully end up in prison. I tend to the position of Steve Benen, for whom Trump is a scandal-plagued politician in a country that has “all kinds of experience with scandal-plagued politicians getting indicted.  It happens all the time.  It doesn’t tear at the fabric of our civic lives.  It does not open the door to political violence.  It is, for lack of a better word, normal.”[1]

Well, not exactly normal.   Trump is not only the first ex-president so indicted, but an ex-president whose toxicity has suffused the nation for six years.  He is an ex-president who  commands, via social media, an army of volatile extremists fully capable of mass violence on behalf of their leader, their very own Führer.  My problem is that political commentators who dwell on Trump’s crusade to become an American autocrat by overturning a presidential election miss the bigger picture:  the failed tests of democracy that litter American history.  In what follows, I want to consider the Crimes of Trump in a different way.  I want to take them as an opportunity to look anew at a few of the real failures of American democracy. 

You want a “failure” of democracy?  How about the widespread implementation of coercive sterilization that, beginning with Indiana’s eugenic sterilization law of 1907, robbed the nation’s undesirables – the “feebleminded,” the physically handicapped, the sexually “impure,” the chronically alcoholic, the wrong type of immigrants – of the right to procreate?  Sterilization, eugenicists insisted, was the best way to prevent pollution of the gene pool of America’s Nordic white stock, and, in so doing, spare states the cost of maintaining future generations of defectives in publicly funded institutions.[2]

We should explore, among the nation’s most egregious failed tests, the unlawful and criminally abusive Indian private boarding schools that, from 1879 to 1969, embodied the federal government’s systematic effort to eradicate all aspects of ethnic identity among Native American children.  In 408 schools in 37 states and territories,[3] Indian children as young as three were ripped away from their families and incarcerated in what were, in actuality, penal colonies funded by the Department of the Interior and run by the Catholic Church, especially the Jesuits.  In “schools” built on unlawfully seized tribal lands, Indian children had tribal language, song, and tradition literally beaten out of them, sometimes to the point of death; suicide and death from drug use were common. The children’s unmarked graves in school basements and on school properties came to light in the 1990s; mass graveyards continue to be discovered to this day.[4]

And what of the 45,000 Native Americans, many products of the private boarding schools, who served in America’s military during the Second World War?  Despite Congressional passage of the Indian Citizenship Act of 1924, postwar state legislatures, especially in the West, continued to deny Native Americans the vote.  States like Utah, Arizona, New Mexico, and Maine strengthened “Jim Crow, Indian Style” to keep Indians, including veterans, away from the ballot box. Poll taxes, literacy tests, rejection of non-valid (i.e., reservation) mailing addresses, location of polling places far from tribal communities – such was the welcome afforded Indians who battled Hitler, Mussolini, and Hirohito in the name of representative democracy.  Among the veterans were the Navajo Code Talkers, many from Arizona and New Mexico, whose service in the Pacific Theater was crucial to the successful island-hopping of the U.S. Marines.  Only in 1957 did the Utah legislature grant Native Americans the vote.  Even the Voting Rights Act of 1965 proved unavailing in Maine, which finally relented in 1967.

African Americans are no strangers to democracy’s unconscionable failures.  Racism has seeped deep into the bone marrow of America, and the Civil War and its aftermath did little if anything to temper it.  Need we mention the race laws that governed life in the Jim Crow South and provided an inspiring precedent for the Nazi lawyers who formulated the Nuremberg Laws of 1935?  Consider the fate of the gallant African American soldiers who served in the Great War over a half century after the Civil War ended.  The all-black 369th Infantry Regiment of New York, the legendary “Harlem Hellfighters,” had been unceremoniously assigned to the beleaguered French Army in March 1918.  Under French command, the soldiers proved extraordinary combatants, and on December 13, 1918, the French government conferred its Croix de Guerre regimental citation on the 369th as well as individual Croix de Guerre medals on 171 regimental members.

But there was nothing of liberté, égalité, fraternité in their homecoming.  During the Red Summer of 1919, race riots, often sparked by the sight of black veterans in uniform, occurred in at least 26 cities.  That sight was deeply threatening; local officials in Mississippi and Alabama had returning soldiers strip naked on train platforms by way of jolting them back to second-class status; those who refused were severely beaten, sometimes to death.  In 1919, of 83 recorded lynchings, 76 were of African Americans, at least 11 of them returning veterans.

Even the spectacle of Nazi “racial hygiene” in the 1930s failed to temper American racism.  In the aftermath of Pearl Harbor, racist paranoia consumed the nation.  How can we forget the 120,000 Japanese Americans of the West Coast subject to President Franklin Roosevelt’s Executive Order 9066 of February 19, 1942?  These Americans, the majority of them citizens by birth, were forcibly relocated to one of ten concentration camps — euphemistically, “relocation” camps — throughout the western U.S.  Their status as law-abiding residents counted for nothing; their Japanese ancestry for everything.  Businesses, homes, property, possessions – all were lost in this sterling example of American democracy in all its foundational impotence.

And what of African American male citizens who, in the wake of Pearl Harbor, sought to serve their country?  African American recruits were denied enlistment in the Marine Corps and Army Air Corps.  But they were welcomed into the U.S. Navy, as long as they were content to serve as mess attendants. 

The U.S. Army was little better.  In 1942 African Americans might enlist, but could only serve as replacement troops in the small number of all-black units.  And what did they do?  They were limited to construction and transport units that built the roads and landing strips that took white troops to the theaters of engagement.  Sixty percent of the troops that built the 1,100-mile Ledo Road linking India and China were African Americans, as were one-third of those who built the 1,600-mile Alaska (aka Alcan) Highway that linked Alaska, Canada, and the continental United States.  The latter project, begun three months after Pearl Harbor, took the black regiments to Alaska, where they began work in arctic temperatures that, one veteran recounts, fell to -65°F.  African American troops spent nights in cloth tents, in which accumulated frost served as insulation.  Their white counterparts were housed in Nissen huts and on army bases.[5]

An Army Corps Engineer at work on the Alaska Highway in early 1942.

Black troops in combat?  No, not in 1942.  Black combat pilots?  Perish the thought.  A War Department report of 1920 declared African Americans lacked the intelligence, discipline, and courage to pilot aircraft.  And then came the Tuskegee Experiment at the 66th Air Force Flying School in Tuskegee, Alabama – an experiment programmed to fail – and, following the forceful advocacy of Eleanor Roosevelt, the fabled Tuskegee Airmen.

_______________________

It’s 1924 and Albert Priddy, superintendent of the State Colony for Epileptics and Feebleminded outside Lynchburg, Virginia, has a problem.  Like superintendents of institutions for mental, moral, and physical “defectives” that dot the country, Priddy selects those among his wards to be involuntarily sterilized in order to prevent them from passing their hereditary taint onto the next generation.  Since his appointment in 1910, when the Colony opened, he has ordered the sterilization of more than 100 women, and his authority has been formalized in Virginia’s coercive sterilization law of 1924.  Priddy himself helped draft the bill.  According to the law, residents deemed “insane, idiotic, imbecilic, feeble-minded, or epileptic, and by the law of heredity probably potential parents of socially inadequate offspring likewise afflicted” should be sterilized within the institutions.[6]   

Priddy’s problem is that he has recently been successfully sued by the husband of a female resident sterilized by his directive.  To prevent any such eventuality in the future, he now decides to personally engineer the legal appeal of one Carrie Buck, a young woman resident and the unwed mother of an infant daughter who lives with her in the Colony.

Carrie Buck is selected to “protest” her involuntary sterilization and to see it through the court system.  But her odyssey through the courts – the County Court of Appeals, the Virginia Supreme Court, the U.S. Supreme Court – is a sham, the outcome preordained by those who have orchestrated her “appeal.”  It is Priddy, in collaboration with a crony, State Senator Aubrey Strode, who has planned the ruse and then selected Buck’s “defense” attorney, charged with working for the young woman’s “acquittal.”  In reality, the attorney, one Irving Whitehead, was a founding member of the Colony who sought the same judicial outcome as Priddy and Strode:  Carrie Buck’s involuntary sterilization.  Needless to say, he adduced no evidence at all on his “client’s” behalf.

Carrie and Emma Buck at the Virginia Colony in 1924, shortly before Carrie’s case went to trial.

So the constitutional legitimacy of Virginia’s coercive sterilization law was upheld, consecutively, by the Amherst County Court of Appeals, the Virginia Supreme Court, and, finally, the U.S. Supreme Court.  The Supreme Court found for the superintendent of the Virginia Colony in an 8-1 decision.  Chief Justice William Howard Taft, an avowed proponent of coercive sterilization, assigned the majority opinion to Associate Justice Oliver Wendell Holmes, another proponent of sterilization in the service of race betterment.

There was little, it turned out, to adjudicate.  Eugenics of the time deemed feeble-mindedness a matter of heritable germ plasm, hence a family trait.  It followed that assessment of individuals for coercive sterilization typically entailed investigation of family members.  Buck’s mother, moreover, was also “defective” and had resided in the Colony since 1920.  Not only that, Buck’s infant daughter, Vivian, had been examined by the Colony physician and, so he swore in court, already showed signs of feeblemindedness.  After breezily remarking that involuntary sterilization served public welfare no less than vaccination, Holmes ordered the sterilization of the allegedly imbecilic and sexually deviant Carrie Buck. “Three generations of imbeciles are enough,” he concluded.[7]

After Buck, eugenic sterilization became lawful in the U.S., “fusing sterilization and eugenics in the public mind.”[8]  But the case, we now know, was a manipulation of the judicial system, and not in the interest of justice for the young woman about to lose her ability to procreate.  In point of fact, Carrie Buck was far from feebleminded.  She did well in grade school, earning special praise from her teachers for attendance and neat handwriting until her foster parents removed her from school after fifth grade.  A feebleminded “moral delinquent” (Priddy’s words) destined to pass on defective genes to her progeny?  Hardly.  Buck’s pregnancy arose from rape by the nephew of her foster mother, Alice Dobbs. Fearing social embarrassment, the Dobbses petitioned the Amherst County Court of Appeals to have Carrie incarcerated.  The request was granted, and with the help of a complicit social worker, Carrie was packed up and deposited at the Lynchburg Colony.  The social worker’s report of feeblemindedness and “moral delinquency” was all that Superintendent Priddy had to hear.

And Carrie’s infant daughter Vivian, characterized by the social worker as “not quite normal”?  She died from pneumonia at age eight, but not before her name was added to her school’s honor roll.  She never showed any signs of the “imbecility” ascribed to her by Priddy in court.[9]  She was simply Carrie Buck’s daughter, and therefore part of the “deliberate plot carefully orchestrated by powerful institutional actors bent on winning constitutional authority to compulsory sterilization at any cost.”[10]

Buck v. Bell took place a decade after America’s arch-eugenicist, Harry Laughlin, administrator of the racist Eugenics Record Office at Cold Spring Harbor, Long Island, estimated that 15% of the American population, or 15 million people, would have to be sterilized to rid the U.S. of defective genetic stock.  In 1924, the very year Buck v. Bell began its bogus journey through the court system, Congress passed the highly restrictive Immigration Law that remained on the books until 1965.[11]  It followed by seven years the Immigration Act of 1917, which imposed a literacy test on newcomers while extending an immigrant’s eligibility period for deportation to five years.  The 1917 Act was known as the “Asiatic Barred Zone Act,” since it barred most Asians from immigrating.  The Emergency Quota Act of 1921 then introduced the national quota system, restricting immigration from individual countries still further.

America is a nation of immigrants that has never warmed to immigrants.  “Xenophobia,” the immigration scholar Erika Lee remarked a year ago, “powers the United States.”[12]  Over the past 200 years, America has admitted over 80 million immigrants, all the while clinging to its irrational hostility to foreigners.  Small wonder that Trump, the Prince of Xenophobes, himself the son and grandson of European immigrants, fired up his MAGA troops to keep immigrants away from these shores.  Muslims found no safe harbor here, nor were asylum-seeking refugees from Central America welcome.  In January 2017, Trump issued an executive order to construct a wall to keep Mexicans and Central Americans, however desperate, from crossing the southern border.

What followed?  Increasing numbers of immigrants were deported in record time, and sanctuary cities came under federal assault.  The fragments of a wall never completed became a shrine to American nativism and exclusionism.  Trump’s mother, who emigrated from Scotland in 1930, and his paternal grandfather, who emigrated from Germany in 1885, encountered no such wall, literally or figuratively, when they reached America.

For Trump himself, matters could have been different.  The Immigration Law of 1917 stipulated among reasons for exclusion or subsequent deportation, “constitutional psychopathic inferiority.”  Psychiatrists were available to make a determination of mental fitness.  Consider this thought experiment:  Imagine the ex-President of a South American dictatorship, El Presidente Donald Trump.  A century ago, he seeks asylum in America following a popular uprising that repudiated his policies and sent him packing.  But he is in the grips of  an intractable narcissistic personality disorder so severe that he informs the interviewing psychiatrist that he is the Chosen One, destined to revitalize American greatness, and, given his mission, he cannot be subject to the American legal code.[13]  It is doubtful he would have been permitted to disembark.  That, at least, would have been a happy outcome of the Immigration Act of 1917.

_______________________

**I am grateful to my wife, Deane R. Stepansky, for suggesting the title of this essay.  Her great  love and support — not to mention her crack skills as Latinist,  grammarian and proofreader — inform all the essays  gathered in “Medicine, Health, and History.”   

[1] Steve Benen, “Trump’s indictment creates a test our democracy can easily pass,”  MaddowBlog, March 31, 2023 (https://www.msnbc.com/rachel-maddow-show/maddowblog/trumps-indictment-creates-test-democracy-can-easily-pass-rcna77553).

[2] American eugenicists like Madison Grant and Lothrop Stoddard proved an inspiration to German colleagues like Alfred Ploetz, the founder of Nazi racial “science.”  Indeed, at the first International Congress for Eugenics in London in 1912, Ploetz proclaimed the U.S. a bold leader in the field of eugenics, foreshadowing the relationship between German and American eugenicists for whom the feebleminded, the physically handicapped, the criminals, alcoholics, and sexually “deviant” were all unworthy forms of life.  Stefan Kühl, The Nazi Connection: Eugenics, American Racism, and German National Socialism (NY:  OUP, 1994), ch 1, and more expansively and analytically, James Q. Whitman, Hitler’s American Model:  The United States and the Making of Nazi Race Law (Princeton:  Princeton Univ. Press, 2017).  Whitman begins his book by noting that the transcript of the meeting among leading Nazi lawyers of June 5, 1934 – the meeting that outlined the three Nuremberg Laws of 1935 – reveals “detailed and lengthy discussions” of the race laws of the United States, especially the Jim Crow laws of the American South (pp. 1-5, 12-13, 29).  American race law, the Nazis well understood, entailed far more than segregation; it encompassed immigration, citizenship (including requirements for naturalization), and anti-miscegenation laws (32ff.).

[3] The definitive source of facts and figures regarding the Indian private boarding schools is the Federal Indian Boarding School Initiative Investigative Report, released by the Department of the Interior on May 11, 2022.  It is available in its entirety at   https://www.bia.gov/sites/default/files/dup/inline-files/bsi_investigative_report_may_2022_508.pdf.

[4] Life in the private boarding schools is recounted by survivors and their descendants in two successive Reveal podcasts, “Buried Secrets: American Indian Boarding Schools,” Parts 1 & 2, released on March 18 and March 25, 2023.  Both can be accessed at: https://revealnews.org/?s=Buried%20Secrets%3A%20.  The podcasts focus on the Red Cloud Indian School of Oglala Lakota County, South Dakota.  For an insightful overview of all the private boarding schools, see David W. Adams, Education for Extinction:  American Indians and the Boarding School Experience, 1875-1928 (Lawrence:  Univ. Press of Kansas, 1995) and Brenda J. Child, Boarding School Seasons: American Indian Families, 1900-1940 (Lincoln: Univ. of Nebraska Press, 1998), which makes extensive use of letters by students, parents, and school officials.

[5] From the recollections of Reginald Beverly, member of the 95th Engineering Regiment, one of the African American units that constructed the Alaska (Alcan) Highway in 1942, recorded in “African Americans in World War II:  A Legacy of Patriotism and Valor” (1997), available  at  https://www.youtube.com/watch?v=vGpP3mj6FrU.  For a readable overview of the experience of African American soldiers, including their experience in WWI, see Peter C. Baker, “The Tragic, Forgotten History of Black Military Veterans,” The New Yorker, November 27, 2016 (https://www.newyorker.com/news/news-desk/the-tragic-forgotten-history-of-black-military-veterans).   General interest articles on the Harlem Hellfighters are plentiful on the internet; those interested in more scholarly historical presentations might begin with Stephen L. Harris & Rod Paschal, Harlem’s Hellfighters: The African-American 369th Infantry in World War I (Sterling, VA: Potomac Books, 2005). 

[6] Randall Hansen & Desmond King, Sterilized by the State: Eugenics, Race, and the Population Scare in Twentieth-Century North America (Cambridge:  Cambridge University Press, 2013), 104.  All the histories of eugenics in America discuss Buck v. Bell.  I have found Hansen & King the best chapter-length presentation.  For those interested in a detailed, book-length account, there is Paul A. Lombardo, Three Generations, No Imbeciles:  Eugenics, the Supreme Court, and Buck v. Bell (Baltimore:  Johns Hopkins Univ. Press, 2008).

[7] Buck v. Bell, 274 U.S. 200 (1927).

[8] Molly Ladd-Taylor, Fixing the Poor:  Eugenic Sterilization and Child Welfare in the Twentieth Century (Baltimore:  Johns Hopkins Univ. Press, 2017), Introduction, and Hansen & King, op. cit., 110.

[9] Hansen & King, op. cit., 110-113, elaborated at length in Lombardo, op. cit.

[10] Hansen & King, op cit., 114.

[11]   Ibid., 157, and, more expansively, in Henry Friedlander, The Origins of Nazi Genocide:  From Euthanasia to the Final Solution (Chapel Hill:  Univ. of North Carolina Press, 1995), ch 1.

[12] Erika Lee, “Xenophobia Powers the United States,” Public Books (https://www.publicbooks.org/xenophobia-powers-the-united-states), 6/15/2022.

[13] Trump is a textbook case of    “narcissistic personality disorder” in the sense of Heinz Kohut, the founder of post-Freudian psychoanalytic self psychology.  See  Kohut, The Restoration of the Self (NY:  IUP, 1977), and How Does Analysis Cure? edited by Arnold Goldberg, with the collaboration of Paul E. Stepansky  (Chicago:  University of Chicago Press, 1984).  In this type of primitive pre-Oedipal pathology, the patient’s “grandiose self,” which requires a continuous stream of affirmation from others in the form, say, of mirroring or idealization, always masks a profound sense of inferiority.  In Kohut’s lexicon, the grandiose self is “archaic” and, as such, brittle and prone to fragmentation.   I discuss Kohut in my consideration of medical empathy in In the Hands of Doctors:  Touch and Trust in Medical Care (Santa Barbara:  Praeger, 2016; Keynote Books pbk, 2017), 55-59, 72-73.

Copyright © 2023 by Paul E. Stepansky.   All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Christian Healthcare for Christian Nationalists

 

Christian Nationalism (CN):   The belief that the United States is, and always has been, a Christian nation.  An oxymoron in the context of the explicit language of the Bill of Rights and the Constitution.

CN Proponents:  Titans of American Ignorance in all its anti-historical, anti-rationalist, anti-democratic glory.  American nationhood, as set forth in the Bill of Rights and the Constitution, does not admit the qualifier “Christian.”  The phrase “Christian Nationalism” is quite literally non-sensical.

America’s Founding Fathers:   A group of educated gentlemen, some avowed Christians and others deists influenced by English and French freethinkers. Whatever their personal convictions, the Founders collectively established  a secular republic predicated on religious freedom and the separation of Church and State.  The documents they bequeathed to us and that continue to shape our national sense of self – the Declaration of Independence and the Constitution – do not establish a Christian nation.[i]

_____________________

It’s August 21, 1790, and George Washington sets pen to paper and writes a letter to the Hebrew Congregation of Newport, Rhode Island.  Following the state’s ratification of the Constitution, Washington congratulates the Newport congregants for joining a nation where “every one shall sit in safety under his own vine and fig-tree and there shall be none to make him afraid.”  And he continues: “For happily the Government of the United States, which gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens, in giving it on all occasions their effectual support.”

Now it’s June 7, 1797, and Washington’s successor, John Adams, adds his ringing endorsement to the Senate’s unanimous ratification of the Treaty of Tripoli.  All the nation’s “citizens and inhabitants thereof” are enjoined “faithfully to observe and fulfill the said Treaty and every clause and article thereof.”  Article 11 of the Treaty begins with this avowal: “the government of the United States of America is not in any sense founded on the Christian Religion.”  The Article in its entirety mitigates in no way at all the plain meaning of this statement.

And now on New Year’s Day, 1802, Thomas Jefferson composes a letter to the Danbury Baptist Association.  Here the third president asserts, famously, that “the legitimate powers of the government reach actions only, & not opinions.”  It follows, for Jefferson, that he contemplates with “sovereign reverence that act of the whole American people which declared that their legislature should ‘make no law respecting an establishment of religion, or prohibiting the free exercise thereof,’ thus building a wall of separation between Church & State.”[ii]

These words tell us how the first three American presidents understood the nation they helped create.  But more important than these words or any other words they wrote or spoke is the document they and their colleagues signed in 1787 and bequeathed to future generations.  This charter of government, the Constitution of the United States, took effect on March 4, 1789 and has guided the nation these past 233 years.

____________________

Christian Nationalism, code for an un-American Christian Nation-State, seeks to overthrow the Constitution, the founding document on which the American Republic is predicated.  The prospect of a Christian Nation that consigns the values, principles, and precepts of the Constitution to the dustbin of history is the stuff of nightmares.  Many nightmares.  What follows is a gloss on one of them:  what might well follow, indeed what on CN premises ought to follow, in the domain of healthcare once the CNs come to power:

  1.  Society’s commitment to public health would be overturned by the Supreme Court.  Jehovah’s Witnesses and Christian Scientists would be entitled, as a matter of law, to deprive their children of life-saving blood transfusions and tissue and organ transplants.  If you’re a Christian Scientist parent, for example, go ahead and let your children die, as they have in the past, from untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  What American courts define as criminal neglect would be sanctioned – as long as it resulted from one or another variety of Christian belief.  A litmus test for membership in the Christian Nation could be repudiation of compulsory childhood vaccination, with parents who deny their children vaccination on religious grounds applauded for putting their children at risk, and sometimes sacrificing them, out of adherence to their version of a Christian Life.  Similarly, during times of pandemic, Christians who, as beneficiaries of Divine protection, chose to ignore quarantine directives from leftist health organizations like the CDC and WHO would  receive the blessing of the  State.  All such groups would be following in the footsteps of many contemporary Evangelicals.  As Covid-19 gripped the nation and the world, Evangelicals from California to Florida, courageous Christians all, refused to follow social distancing and stay-at-home guidelines; they continued to assemble for communal worship in churches and homes, placing themselves and their communities in peril.[iii]
  2. America’s long and inglorious tradition of discrimination in medical education would be rejuvenated on behalf of the Christian state.  Examples of exclusion by race, religion, and gender abound, and they can be drawn on to guide Christian Nationalists in any number of discriminatory practices for marginalizing the presence of non-Christians in American healthcare.  Consider only that by the mid-1930s, over 2,000 American medical students, 95% of whom were Jews, were driven to Europe to pursue medicine.  Seven years later, Charles Drew wrote a blistering letter to the editor of JAMA, protesting the AMA’s racially based exclusion of qualified black applicants whose state chapters refused them membership, thereby keeping them out of the national organization.  The American Nurses Association (ANA) was little better.  Founded in 1896, it allowed qualified black nurses from states with racist state chapters direct admittance to the national organization only in 1950.  The Georgia chapter, incidentally, continued to exclude blacks until 1961, and retreated only after the ANA threatened to expel it from the national organization.[iv]  And let us not forget quota systems, implemented to keep Jews out of both elite universities and medical schools after World War I.  After all, they were followed by the quota system implemented in the Immigration Acts of 1921 and 1924, a device to keep East European immigrants out of the country – a project no doubt congenial to Christian Nationalists.[v]
  3. Christian Healthcare would enjoin believing Christians to follow the dictates of conscience in deciding whether to deploy life-saving medications, procedures, and technologies on nonbelievers. EMTs and medics, for example, would no longer be legally or professionally obligated to provide assistance to Jews, Muslims, Hindus, atheists, and other non-Christians. This would require a Constitutional amendment, since the Constitution makes no allowance for conscience as a ground for violating laws and lawfully implemented directives, as in the denial of life-saving medical interventions.  The First Amendment provides only for freedom of religion, understood as the freedom to practice the religion of one’s choice through voluntary affiliation with one or another House of Worship (or no House of Worship at all).
  4. It follows that Christian physicians, nurses, and other providers would be free, as practicing Christians, to provide services only to Christians. They might, at their conscience-driven discretion, avoid nonbelievers entirely or simply privilege the needs (as to medications, nourishment, and allocation of scarce resources) of Christians.  Self-evidently, Christian surgeons would be under no legal, professional, or moral obligation to operate on Jews, Muslims, Hindus, atheists, and other nonbelievers; nor would Christian anesthesiologists be required to administer anesthesia to them.  Professional codes of ethics would have to be revised (i.e., Christianized) accordingly.  In toto, under the auspices of a Christian nation, there would be a vast expansion of the “refusal laws” that individual states have passed to free hospitals, physicians, and nurses from any obligation to provide patients with abortions and other reproductive services, including contraceptives, genetic counseling, infertility treatment, STD and HIV testing, and treatment of victims of sexual assault.  Constitutional amendments would be required on this score as well, since such “laws of conscience,” whatever their religious moorings, have no legal, judicial, or moral status in the Constitution.
  5. Following the example of the National Blood Program of 1941, the blood bank set up to provide Caucasian-only blood to the American armed forces, all nationally sanctioned blood banks would be limited to Christian donors.[vi]  There is ample historical precedent regarding the sacrosanctity of Christian blood and blood products; witness the Italian residents of Bolzano who, newly absorbed into Bavaria by Napoleon in 1807, launched an armed revolt against mandatory smallpox vaccination lest Protestantism be injected into their Catholic veins. Over a century later, a Nazi military directive forbidding the transfusion of Jewish blood into the veins of German military personnel led to the death of countless war wounded.  America was little better in the collection, identification, and storage of blood.  The Red Cross Blood Donor Program, after refusing the blood of black Americans for a year, began accepting it in January 1942.  But it continued to segregate blood by donor race until 1950; southern states like Arkansas and Louisiana held firm to segregated blood collection until the early 1970s.[vii]  These precedents would be seized on in the time of CN.  In the new America, the blood of nonbelievers could be collected by their respective agencies, and made available to hospitals and clinics amenable to receiving and storing impure blood for non-Christian patients. Institutions that continued to permit cross-religious transfusions would require signed waivers from Christian patients willing to accept transfusions of non-Christian blood under exigent circumstances.  Such waivers could be incorporated into Living Wills.

Christian Healthcare is only one of the societal transformations that await the ascendancy of Christian Nationalism.  The anti-intellectual disemboweling of American public education, especially in the South, is already well under way; where will it end up when the white CNs assume control?  To those who espouse it, I say:  Congratulations.  You have destroyed the America envisioned by the Founding Fathers and enshrined in the Constitution and Bill of Rights.  You have replaced the wall of separation between Church and State with a wall of separation between Christian and non-Christian.  In so doing, you have laid the seedbed for one more religious theocracy, a Christian sibling to the virulently anti-democratic Muslim theocracies of the Middle East.

The American theocracy will reach its apotheosis over time.  But when the Christian Nationalists assume political control, there will be immediate changes.  The United States will all at once be a two-tier society stratified along religious lines.  It will not only be Jews who, failing to throw their votes to Christian leaders, will have to watch their backs.  Everyone who opposes Christian Nationalist hegemony will be at risk.  We will all have to, in the ex-president’s subtle formulation, “watch it.”

What to call the new state?  Christian Nationalists may profitably analogize from the example of Saudi Arabia.  If we replace “Saudi” (i.e., the Kingdom of Saud) with the New Testament’s “Kingdom of God,” and let “New Jerusalem” stand  for it, we arrive at a suitable replacement for the United States of America.  Here, Christian Nationalists, is the nation of your dreams and our nightmares.  I give you  New Jerusamerica.

January 6, 2021

______________________

[i] I am grateful to my friend and colleague of many decades, Professor Jeffrey Merrick, for his help in formulating my comments on the Founding Fathers, religion, and the founding of the American Republic.  Among recent books elaborating in scholarly detail these comments, see especially Steven K. Green, Inventing a Christian America:  The Myth of the Religious Founding (New York:  OUP, 2015).

[ii] Washington’s and Jefferson’s letters and Adams’ remarks to Congress are in the public domain and widely reproduced on the internet.

[iii] Ed Kilgore, “Many Evangelicals are Going to Church Despite Social-Distancing Guidelines,” New York Magazine, April 17, 2020  (https://nymag.com/intelligencer/2020/04/many-evangelicals-defying-guidelines-on-in-person-gatherings.html); Bianca Lopez, “Religious Resistance to Quarantine Has a Long History,”  (https://blog.smu.edu/opinions/2020/08/07/religious-resistance-to-quarantine-has-a-long-history).  “In numerous parts of the United States,”  Lopez writes, “certain stripes of Christianity and quarantine orders stand in direct opposition, resulting in deadly outcomes due to the COVID-19 pandemic.”

[iv] Edward C. Halperin, “The Jewish Problem in Medical Education, 1920-1955,” J. Hist. Med. & Allied Sci., 56:140-167, 2001, at 157-158; Patricia D’Antonio, American Nursing: A History of Knowledge, Authority, and the Meaning of Work (Baltimore: Johns Hopkins, 2010), 130.

[v] David Oshinsky, Bellevue:  Three Centuries of Medicine and Mayhem at America’s Most Storied Hospital (NY: Doubleday, 2016), 196-198; Ian Robert Dowbiggin, Keeping America Sane: Psychiatry and Eugenics in The United States and Canada, 1880-1940 (Ithaca: Cornell University Press, 1997), 224-227.

[vi] Charles E. Wynes, Charles Richard Drew: The Man and The Myth (Urbana: Univ. Illinois Press, 1988), 67; “Nazi Order Prohibiting Jewish Blood for Transfusions Causing Death of Many Soldiers,” JTA Daily News Bulletin, March 2, 1942 (https://www.jta.org/archive/nazi-order-prohibiting-jewish-blood-for-transfusions-causing-death-of-many-soldiers).  Note that I am not addressing Christian sects, like Jehovah’s Witnesses, whose members refuse blood transfusions altogether, only those that accept transfusions, but only of Christian blood.

[vii] Thomas A. Guglielmo, “Desegregating Blood:  A Civil Rights Struggle to Remember,” February 4, 2018 (https://www.pbs.org/newshour/science/desegregating-blood-a-civil-rights-struggle-to-remember).  For a lengthier consideration of blood and race in American history, see Spencie Love, One Blood:  The Death and Resurrection of Charles R. Drew (Chapel Hill:  Univ. North Carolina Press, 1996), 139-160.

Copyright © 2022 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Malaria in the Ranks

Malaria (from Italian “bad air”): Infection transmitted to humans by the bites of mosquitoes carrying single-celled parasites, most commonly Plasmodium (P.) vivax and P. falciparum.  Mosquito vector discovered by Ronald Ross of the Indian Medical Service in 1897.  Symptoms:  Initially recurrent (“intermittent”) fever, then constant high fever, violent shakes and shivering, nausea, vomiting.  Clinical descriptions as far back as Hippocrates in the fifth century B.C. and earlier still in Hindu and Chinese writings.  Quinine:  Bitter-tasting alkaloid from the bark of cinchona (quina-quina) trees, indigenous to South America, notably Peru.  Used to treat malaria from the 1630s through the 1920s, when more effective synthetics became available.  Isolated from cinchona bark in 1820 by the French chemists Pierre Joseph Pelletier and Joseph Caventou.  There you have it.  Now read on.

_______________________

It’s 1779, and the British, commanded by Henry Clinton, adopt a southern strategy to occupy the malaria-infested Carolinas. The strategy appears successful, as British troops commanded by Charles Lord Cornwallis besiege and capture Charleston on May 12, 1780.  But appearances can be deceiving. In reality, the Charleston campaign has left the British force debilitated.  Things get worse when Cornwallis marches inland in June, his force further ravaged by the malarial fever spread by Anopheles mosquitoes and their Plasmodium parasites.  Lacking quinine, his army simply melts away in the campaign to follow. Seeking to preserve what remains of his force, Cornwallis looks to the following winter as a time to recuperate and rebuild.  But it is not to be.  Clinton sends him to Yorktown, where he occupies a fort between two malarial swamps on the Chesapeake Bay.  Washington swoops south and, aided by French troops, besieges the British.  The battle is over almost before it has begun.  Cornwallis surrenders to Washington, but only after his army has succumbed to malarial bombardment by the vast army of mosquitoes. The Americans have won the Revolutionary War.  We owe American independence to Washington’s command, aided, unwittingly, by mosquitoes and malaria.[1]

Almost two centuries later, beginning in the late 1960s, malaria again joins America at war.  Now the enemy is communism, and the site is Vietnam.  The Republic of Korea (ROK), in support of the war effort, sends over 30,000 soldiers and tens of thousands of civilians to Vietnam.  The calculation is plain enough: South Korea seeks to consolidate America’s commitment to its economic growth and military defense in its struggle with North Korean communism after the war.  It works, but there is an additional, major benefit:  ROK medical care of soldiers and civilians greatly strengthens South Korean capabilities in managing infectious disease and safeguarding public health.  Indeed, at war’s end in 1975, ROK is an emergent powerhouse in malaria research and the treatment of parasitic disease.  Malaria has again played a part in the service of American war aims.[2]

Winners and losers aside, the battle against malaria is a thread that weaves its way through American military history.  When the Civil War erupted in 1861, outbreaks of malaria and its far more lethal cousin, yellow fever, did not discriminate between the forces of North and South.  These diseases mowed down combatants with utter impartiality.  For many, malarial infection was the enemy that precluded engagement of the enemy.  But there were key differences.  The North had the U.S. Army Laboratory, comprising laboratories in Astoria, New York and Philadelphia.  In close collaboration with Powers and Weightman, one of only two American pharmaceutical firms then producing quinine, the Army Laboratory provided Union forces with ample purified quinine in standardized doses.  Astute Union commanders made sure their troops took quinine prophylactically, with troops summoned to their whiskey-laced quinine ration with the command, “fall in for your quinine.”

Confederate troops were not so lucky.  The South lacked chemists able to extract quinine from cinchona bark, and the Union blockade choked off the drug’s importation.  So the South had to rely on various plants and plant barks, touted by the South Carolina physician and botanist Francis Peyre Porcher as effective quinine substitutes.  But Porcher’s quinine substitutes were all ineffective, and the South had to make do with the meager supply of quinine it captured or smuggled.  It was a formula for defeat, malarial and otherwise.[3]

Exactly 30 years later, in 1891, Paul Ehrlich announced that the application of  a chemical stain, methylene blue, killed malarial microorganisms and could be used to treat malaria.[4]   But nothing came of Ehrlich’s breakthrough seven years later in the short-lived Spanish-American War of 1898.   Cuba was a haven for infectious microorganisms of all kinds, and, in a campaign of less than four months, malaria mowed down American troops with the same ease it had in the Civil War.  Seven times more Americans died from tropical diseases than from Spanish bullets.  And malaria topped the list.  

As the new century approached, mosquitoes were, in both senses, in the air.  In 1900, Walter Reed returned to Cuba to conduct experiments with paid volunteers; they established once and for all that mosquitoes were the disease vector of yellow fever; one could not contract the disease from “fomites,” i.e., the soiled clothing, bedding, and other personal matter of those infected.  Two years later, Ronald Ross received the second Nobel Prize in Medicine ever awarded for his work on the role of mosquitoes in the transmission of malaria.[5]  But new insight into the mosquito vector of yellow fever and malaria did not mitigate the dismal state of affairs that came with World War I.  The American military was no better prepared for the magnitude of malaria outbreaks than during the Civil War.  At least 1.5 million soldiers were incapacitated, as malaria spread across Europe from southeast England to the shores of Arabia, and from the Arctic to the Mediterranean.  Major epidemics broke out in Macedonia, Palestine, Mesopotamia, Italy, and sub-Saharan Africa.[6]

In the Great War, malaria treatment fell back on quinine, but limited knowledge of malarial parasites compromised its effectiveness.  Physicians of the time could not differentiate between the two species of parasite active in the camps – P. vivax and P. falciparum.  As a result, they could not optimize treatment doses according to these somewhat different types of infection.  Malarial troops, especially those with falciparum, paid the price.  Except for the French, whose vast malaria control plan spared their infantry from infection and led to victory over Bulgarian forces in September 1918, malaria’s contribution to the Great War was what it had always been in war – it was the unexpected adversary of all.

Front cover of “The Illustrated War Times,” showing WWI soldiers, probably Anzacs, taking their daily dose of quinine at Salonika, 1916.

In 1924, the problem that had limited the effectiveness of quinine during the Great War was addressed when the German pharmacologist Wilhelm Roehl, working with Bayer chemist Fritz Schönhöfer, synthesized the quinoline derivative Plasmoquin, which was far more effective against malaria than quinine.[7]  By the time World War II erupted, another antimalarial, Atabrine (quinacrine, mepacrine), synthesized in Germany in 1930, was available.  It would be the linchpin of the U.S. military’s malaria suppression campaign, as announced by the Surgeon General in Circular Letter No. 56 of December 9, 1941.  But the directive had little impact in the early stages of the war. U.S. forces in the South Pacific were devastated by malaria, with as many as 600 malaria cases for every 1,000 GIs.[8]  Among American GIs and British Tommies alike, the daily tablets were handed out erratically.  Lackluster command and side effects were part of the problem:  The drug turned skin yellow and occasionally caused nausea and vomiting.  From these side effects, the yellowing skin in particular, GIs leapt to the conclusion that Atabrine would leave them sterile and impotent after the war.  How they leapt to this conclusion is anyone’s guess, but there was no medical information available to contradict it.[9]

The anxiety bolstered the shared desire of some GIs to evade military service.  A number of them tried to contract malaria in the hope of discharge or transfer – no one was eager to go to Guadalcanal.  Those who ended up hospitalized often prolonged their respite by spitting out their  Atabrine pills.[10]   When it came to taking Atabrine, whether prophylactically or as treatment, members of the Greatest Generation could be, well, less than great.

Sign posted at 363rd Station Hospital in Papua New Guinea in 1942, sternly admonishing U.S. Marines to take their Atabrine.

Malarial parasites are remarkably resilient, with drug-resistant strains emerging time and again to keep the disease ahead of the curve, chemically speaking.  During the Korean War (1950-1953), both South Korean and American forces were laid low by vivax malaria.  American cases decreased with the use of chloroquine, but the improvement was offset by a rash of cases back in the U.S., where hypnozoites (dormant malarial parasites) came to life with a vengeance and caused relapses.  The use of yet another antimalarial, primaquine, during the latter part of the war brought malaria under better control.  But even then, in the final year of the war 3,000 U.S. and 9,000 ROK soldiers fell victim.[11]   In Vietnam, malaria reduced the combat strength of some American units by half and felled more troops than bullets.  Between 1965 and 1970, the U.S. Army alone reported over 40,000 cases.[12]  Malaria control measures were strengthened, yes, but so were the parasites, with the spread of drug-resistant falciparum and the emergence of a new chloroquine-resistant strain.

Malaria’s combatant role in American wars hardly ends with Vietnam.  It was a destructive force in 1992, when American troops joined the UN Mission “Operation Restore Hope” in Somalia.  Once more, Americans resisted directives to take their preventive medication, now mefloquine, an antimalarial developed by the Army in 1985.  As with Atabrine a half century earlier, false rumors of debilitating side effects led soldiers to stop taking it.  And as with Atabrine, malaria relapses knocked out soldiers following their return home, resulting in the largest outbreak of malaria stateside since Vietnam.[13]

In Somalia, as in Vietnam, the failure of commanders to educate troops about the importance of “chemoprophylaxis” and to institute “a proper antimalarial regimen” was the primary culprit.  As a result, “Use of prophylaxis, including terminal prophylaxis, was not supervised after arrival in the United States, and compliance was reportedly low.”[14]  It was another failure of malaria control for the U.S. military.  A decade later, American combat troops went to Afghanistan, another country with endemic malaria.  And there, yet again, “suboptimal compliance with preventive measures” – preventive medication, use of insect repellents, chemically treated tent netting, and so forth – was responsible for “delayed presentations” of malaria after a regiment of U.S. Army Rangers returned home.[15]  Plus ça change, plus c’est la même chose.

Surveying American history, one finds that the only thing more certain than malarial parasites during war is the certainty of war itself.  Why do American troops still fall victim to malaria?  And why must there still be war?  As to the first question, understanding the importance of “chemoprophylaxis” in the service of personal and public health (including troop strength in the field) has never been a strong suit of Americans.  Nor have preventive measures, whether applying insecticides and tent netting (or wearing face masks), been congenial, historically, to libertarian Americans who prefer freedom in a Hobbesian state of nature to responsible civic behavior.  Broad-based public-school education on the public health response to epidemics and pandemics throughout history, culminating in the critical role of preventive measures in containing Coronavirus, might help matters.  In the military domain, Major Peter Weima sounded this theme in calling attention to the repeated failure of education in the spread of malaria among American troops in World War II and Somalia.  He stressed “the critical contribution of education to the success of clinical preventive efforts. Both in WWII and in Somalia, the failure to address education on multiple levels contributed to ineffective or only partially effective malaria control.”[16]  As to the second question – why war, in all its malarial ingloriousness, must accompany the human experience – there is no easy answer.

_____________________

[1]  Peter McCandless, “Revolutionary fever:  Disease and war in the lower South, 1776-1783,” Am. Clin. Climat. Assn., 118:225-249, 2007.   Matt Ridley provides a popular account in The Evolution of Everything:  How New Ideas Emerge (NY: Harper, 2016).

[2] Mark Harrison & Sung Vin Yim, “War on Two Fronts: The fight against parasites in Korea and Vietnam,” Medical History, 61:401-423, 2017.  

[3] Robert D. Hicks, “’The popular dose with doctors’: Quinine and the American Civil War,” Science History Institute, December 6, 2013 (https://www.sciencehistory.org/distillations/the-popular-dose-with-doctors-quinine-and-the-american-civil-war).

[4] Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century  (Cambridge, MA: Harvard Univ. Press, 1977), 93.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 263.

[6] Bernard J Brabin, “Malaria’s contribution to World War One – The unexpected adversary,” Malaria Journal, 13, 497, 2014;  R. Migliani, et al., “History of malaria control in the French armed forces:  From Algeria to the Macedonian Front  during the First World War” [trans.], Med. Santé Trop, 24:349-61, 2014.

[7] Frank Ryan, The Forgotten Plague:  How the Battle Against Tuberculosis was Won and Lost (Boston:  Little, Brown, 1992), 90-91.

[8]  Peter J. Weima, “From Atabrine in World War II to Mefloquine in Somalia: The role of education in preventive medicine,” Mil. Med., 163:635-639, 1998, at 635.

[9] Weima, op. cit., p. 637, quoting Major General, then Captain, Robert Green during the  Sicily campaign  in August 1943:  “ . . . the rumors were rampant, that it made you sterile…. people did turn yellow.”

[10] Ann Elizabeth Pfau, Miss Yourlovin (NY:  Columbia Univ. Press, 2008), ch. 5.

[11] R. Jones, et al., “Korean vivax malaria. III. Curative effect and toxicity of Primaquine in doses from 10 to 30 mg daily,” Am. J. Trop. Med. Hyg., 2:977-982, 1953;  Joon-Sup Yeom, et al., “Evaluation of Anti-Malarial Effects,” J. Korean Med. Sci., 5:707-712, 2005.

[12] B. S. Kakkilaya, “Malaria in Wars and Victims” (malariasite.com).

[13] Weima, op. cit.  Cf. M. R. Wallace et al., “Malaria among United States troops in Somalia,” Am. J. Med., 100:49-56, 1996.

[14] CDC, “Malaria among U.S. military personnel returning from Somalia, 1993,” MMWR, 42:524-526, 1993.

[15] Russ S. Kotwal, et al., “An outbreak of malaria in US Army Rangers returning from Afghanistan,” JAMA, 293:212-216, 2005, at 214.  Of the 72% of the troops who completed a postdeployment survey, only 31% reported taking both their weekly tablets and continuing with their “terminal chemoprophylaxis” (taking medicine, as directed, after returning home).  Contrast this report with one for Italian troops fighting in Afghanistan from 2002 to 2011: their medication compliance was measured at 86.7%, with no “serious adverse events” reported and no cases of malaria occurring in Afghanistan.  Mario S. Peragallo, et al., “Risk assessment and prevention of malaria among Italian troops in Afghanistan, 2002 to 2011,” J. Travel Med., 21:24-32, 2014.

[16] Weima, op. cit., 638.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

 
 

Why Pellagra Matters

It was the dread disease of the four “D”s:  dermatitis, diarrhea, dementia, and death.  The symptoms were often severe: deep red rashes, with attendant blistering and skin sloughing on the face, lips, neck, and extremities; copious liquid bowels; and deepening dementia with disorganized speech and a host of neurological symptoms.  Death followed up to 40% of the time.  The disease was reported in 1771 by the Italian physician Francesco Frapolli, who observed it among the poor of Lombardy.  They called it pelagra – literally “rough skin” in the dialect of northern Italy.  Frapolli popularized the term (which later acquired a second “l”), unaware that the Spanish physician Don Gaspar Casal had described the same condition in 1735.

Case reports later identified as pellagra occasionally appeared in American medical journals after the Civil War, but epidemic pellagra only erupted at the dawn of the 20th century.  Between  1902 and 1916, it ravaged mill towns in the American South.  Reported cases declined during World War I, but resumed their upward climb in 1919, reaching crisis proportions in 1921-1922 and 1927.  Nor was pellagra confined to the South.  Field workers and day laborers throughout the country, excepting the Pacific Northwest, fell victim.  Like yellow fever, a disease initially perceived as a regional problem was elevated to the status of a public health crisis for the nation.  But pellagra was especially widespread and horrific in the South.

In the decades following the Civil War, the South struggled to rebuild a shattered economy through a textile industry that revolved around cotton.  Pellagra found its mark in the thousands of underpaid mill workers who spun the cotton into yarn and fabric, all the while subsisting on a diet of cheap corn products:  cornbread, grits, syrup, brown gravy, fatback, and coffee were the staples.  The workers’  meager pay came as credit checks good only at company stores, and what company stores stocked, what the workers could afford to buy, was corn meal.  They lacked the time, energy, and means to supplement starvation diets with fresh vegetables grown on their own tiny plots.  Pellagra sufferers (“pellagrins”) subsisted on corn; pellagra, it had long been thought, was all about  corn.  Unsurprisingly, then, it did not stop at the borders of southern mill towns.  It also victimized the corn-fed residents of state-run institutions:  orphanages, prisons, asylums.         

In 1908, when cases of pellagra at southern state hospitals were increasing at an alarming rate, James Woods Babcock, the Harvard-educated superintendent of the South Carolina State Hospital and a pellagra investigator himself, organized the first state-wide pellagra conference.[1]  It was held at his own institution, and generated animated dialogue and camaraderie among the 90 attendees.  It was followed a year later by a second conference, now billed as a national pellagra conference, also at Babcock’s hospital.  These conferences underscored both the seriousness of pellagra and the divided opinions about its causes, prevention, and treatment.

At the early conferences, roughly half the attendees, dubbed Zeists (from Zea mays, or maize), were proponents of the centuries-old corn theory of pellagra.  What is it about eating corn that causes the disease?  “We don’t yet know,” they answered, “but people who contract pellagra subsist on corn products.  Ipso facto, corn must lack some nutrient essential to health.”  The same claim had been made by Giovanni Marzari in 1810.  The Zeists were countered by anti-Zeists under the sway of germ theory.  “A deficiency disease based on some mysterious element of animal protein missing in corn? Hardly. There has to be a pathogen at work, though it remains to be discovered.”   Perhaps the microorganism was carried by insects, as with yellow fever and malaria.  The Italian-born British physician Louis Sambon went a step further.  He claimed to have identified the insect in question:  it was a black or sand fly of the genus Simulium.

Germ theory gained traction from a different direction.  “You say dietary reliance on corn ‘causes’ pellagra?  Well, maybe so, but it can’t be a matter of healthy corn.  The corn linked to pellagra must be bad corn,  i.e., corn contaminated by a protozoon.”  Thus the position argued at length by no less than Cesare Lombroso, the pioneer of criminal anthropology.  Like Sambon, moreover, he claimed to have the answer:  it was, he announced, a  fungus, Sporisorium maidis, that made corn moldy and caused pellagra.  But many attendees were unpersuaded by the “moldy corn” hypothesis.  For them pellagra wasn’t a matter of any type of corn, healthy, moldy, or otherwise. It was an infectious disease pure and simple, and some type of microorganism had to be the culprit.  How exactly the microorganism entered the body was a matter for continued theorizing and case reports at conferences to come.         

And there matters rested until 1914, when Joseph Goldberger, a public health warrior of Herculean proportions, entered the fray.  A Jewish immigrant from Hungary, educated at the Free Academy of New York (later CUNY) and Bellevue Hospital Medical College (later NYU Medical School), Goldberger was a leading light of the Public Health Service’s Hygienic Laboratory.  A veteran epidemic fighter, he had earned his stripes battling yellow fever in Mexico, Puerto Rico, and the South; typhoid in Washington, DC; typhus in Mexico City; and dengue fever in Texas.[2]   With pellagra now affecting most of the nation, Goldberger was tapped by Surgeon General Rupert Blue to head south and determine once and for all the cause, treatment, and prevention of pellagra.  

Joseph Goldberger, M.D.

Goldberger was up to the challenge.  He took the South by storm and left a storm of anger and resentment in his wake.  He began in Mississippi, where reported cases of pellagra would increase from 6,991 in 1913 to 10,954 in  1914.  In a series of “feeding experiments” in two orphanages in the spring of 1914, he introduced lean meat, milk, and eggs into the children’s diets; their pellagra vanished.  And Goldberger and his staff were quick to make a complementary observation:  In all the institutions they investigated, not a single staff member ever contracted pellagra.  Why?  Well, the staffs of orphanages, prisons, and asylums were quick to take for themselves whatever protein-rich foods came to their institutions.  They were not about to make do with the cornbread, corn mush, and fatback given to the hapless residents.  And of course their salaries, however modest, enabled them to procure animal protein on the side. 

Joseph Goldberger, with his assistant C. H. Waring, in the Baptist Orphanage near Jackson, Mississippi in 1914, in the painting by Robert Thom.

       

All right, animal protein cleared up pellagra, but what about residents of state facilities whose diets provided enough protein to protect them from pellagra?  Were there any?  And, if so, could pellagra be induced in them by restricting them to corn-based diets?  Goldberger observed that the only wards of the state who did not contract pellagra were the residents of prison farms.  It made sense:  They alone received some type of meat at mealtime, along with farm-grown vegetables and buttermilk.  In collaboration with Mississippi governor Earl Brewer, Goldberger persuaded 11 residents of Rankin State Prison Farm to restrict themselves to a corn-based diet for six months.  At the study’s conclusion, the prisoners would have their sentences commuted, a promise put in writing.  The experiment corroborated Goldberger’s previous findings:  Six of the 11 prisoners contracted pellagra, and, ill and debilitated, they became free men when the experiment ended in October 1915.

Now southern cotton growers and textile manufacturers rose up in arms.  Who was this Jewish doctor from the North – a representative of “big government,” no less – to suggest they were crippling and killing mill workers by consigning them to corn-based diets?  No, they and their political and medical allies insisted, pellagra had to be an infectious disease spread from worker to worker or transmitted by an insect.  To believe otherwise, to suggest the southern workforce was endemically ill and dying because it was denied essential nutrients – this would jeopardize the textile industry and its ability to attract investment dollars outside the region.  Goldberger, supremely unfazed by their commitment to science-free profit-making, then undertook the most lurid experiment of all.  Joined by his wife Mary and a group of colleagues, he hosted a series of “filth parties” in which the group transfused pellagrin blood into their veins and ingested tablets consisting of the scabs, urine, and feces of pellagra sufferers.  Sixteen volunteers at four different sites participated in the experiment, none of whom contracted the disease.  Here was definitive proof: pellagra was not an infectious disease communicable person-to-person.[3]

The next battle in Goldberger’s war was a massive survey of over 4,000 residents of textile villages throughout the Piedmont of South Carolina.  It began in April 1916 and lasted 2.5 years, with data analysis continuing, in Goldberger’s absence, after America’s entry into the Great War.  Drawing on the statistical skills of his PHS colleague Edgar Sydenstricker, Goldberger produced a survey remarkable for its time and place.  Homes were canvassed to determine the incidence of pellagra in relation to sanitation, food accessibility, food supply, family size and composition, and family income.  Sydenstricker’s statistical analysis of 747 households with 97 cases of pellagra showed that the proportion of families with pellagra markedly declined as income increased.  “Whatever the course that led to an attack of pellagra,” he concluded, “it began with a light pay envelope.”[4]

But Goldberger was not yet ready to trade his suit of armor for the coat of a lab researcher.  In September 1919, the PHS reassigned him to Boston, where he joined his old mentor at the Hygienic Laboratory, Milton Rosenau, in exploring influenza with human test subjects.  Once the Spanish Flu had subsided, he was able to return to the South, and just in time for a new spike in pellagra rates.  By the spring of 1920, wartime prosperity was a thing of the past.  Concurrent dips in both cotton prices and tobacco profits led to depressed wages for mill workers and tenant farmers, and a new round of starvation diets led to dramatic increases in pellagra.  It was, wrote The New York Times on July 25, 1921, quoting a PHS memo, one of the “worst scourges known to man.”[5]

So Goldberger took up arms again, and in PHS-sponsored gatherings and southern medical conferences, withstood virulent denunciations, often tinged with anti-Semitism.  Southern public health officers like South Carolina’s James A. Hayne dismissed the very notion of deficiency disease as “an absurdity.”  Hayne angrily refused to believe that pellagra was such a disease because, well, he simply refused to believe it – a dismissal that sadly prefigures the Covid-deniers who refused to accept the reality of a life-threatening viral pandemic because, well, they simply refused to believe it.[6]

As late as November 1921, at a meeting of the Southern Medical Association, most attendees insisted that pellagra was caused by infection, and that Goldberger’s series of experiments was meaningless.  But the experiments were meaningless only to those blinded to any and all meanings that reflected poorly on the South and its ability to feed its working class.  Even the slightest chink in the physicians’ self-protective armor would have let in the epidemiological plausibility of Goldberger’s deficiency model.  How could they fail to see that pellagra was a seasonal disease that reappeared every year in late spring or early summer, exactly like endemic scurvy and beriberi, both of which were linked to dietary deficiencies?

Back in the Hygienic Laboratory in Washington, Goldberger donned his lab coat and, beginning in 1922, devised a series of experiments involving both people and dogs.  Seeking to find an inexpensive substitute for the meat, milk, and eggs unavailable to the southern poor, he tested a variety of foods and chemicals, one at a time, to see if one or more of them contained the unknown pellagra preventative, now dubbed the “P-P factor.”  He was not inattentive to vitamins, but in the early ’20s, there were only vitamins A, B, and C to consider, none of which contained the P-P factor.  It was not yet understood that vitamin B was not a single vitamin but a vitamin complex.  Only two dietary supplements, the amino acid tryptophan and, surprisingly, brewer’s yeast, were found to have reliable preventive and curative properties.[7]

Brewer’s yeast was inexpensive and widely available in the South.  It would soon be put to the test.  In June 1927, following two seasons of declining cotton prices, massive flooding of 16,570,627 acres of the lower Mississippi River Valley lowered wages and increased food prices still further.  The result was a drastic increase in pellagra.  So Goldberger, with Sydenstricker at his side, headed South yet again, now hailed on the front page of the Jackson Daily News as a returning hero.  After a three-month survey of tenant farmers, whose starvation diet resembled that of the mill workers interviewed in 1916, he arranged for shipment of 12,000 pounds of brewer’s yeast to the hardest hit regions.  Three cents’ worth of yeast per day cured most cases of pellagra in six to ten weeks.  “Goldberger,” writes Kraut, “had halted an American tragedy.”[8]  Beginning with flood relief in 1927, Red Cross and state-sponsored relief efforts following natural disasters took Goldberger’s lead.  Red Cross refugee camps in 1927 and thereafter educated disaster victims about nutrition and pellagra and served meals loaded with P-P factor.  On leaving the camps, field workers could take food with them; families with several sick members were sent off with parcels full of pellagra preventives.

But the scientific question remained:  What exactly did brewer’s yeast, tryptophan, and two other tested products, wheat germ and canned salmon, have in common?  By 1928, Goldberger, who had less than a year to live,[9] was convinced it was an undiscovered vitamin, but the discovery would have to await the biochemistry of the 1930s.  In the meantime, Goldberger’s empirical demonstration that inexpensive substitutes for animal protein like brewer’s yeast prevented and cured pellagra made a tremendous difference in the lives of the South’s workforce.   Many thousands of lives were saved.

___________________ 

It was only in 1912, when pellagra ripped through the South, that Casimir Funk, a Polish-born American biochemist, like Goldberger a Jew, formulated the vita-amine or vitamine hypothesis to designate organic molecules essential to life but not synthesized by the human body, thereby pointing to the answer Goldberger sought.[10]  Funk’s research concerned beriberi, a deficiency disease that causes a meltdown of the central nervous system and cardiac problems to the point of heart failure.  In 1919, he determined that it resulted from the depletion of thiamine (vitamin B1).  The covering term “vita-amine” reflected his (mistaken) belief that other deficiency diseases – scurvy, rickets, pellagra – would be found to result from the absence of different amine (i.e., nitrogen-containing) molecules.

In the case of pellagra, niacin (aka vitamin B3, aka nicotinic acid/nicotinamide) proved the missing amine, Goldberger’s long sought-after P-P factor.  In the course of his research, Funk himself had isolated the niacin molecule, but it was identified as the P-P factor only in 1937, by the American biochemist Conrad Elvehjem.  The circle of discovery begun with Frapolli’s observations in Lombardy in 1771 was closed between 1937 and 1940, when field studies on pellagrins in northern Italy conducted by the Institute of Biology of the NRC confirmed the curative effect of niacin.[11]

 Now, ensnared for 2.5 years by a global pandemic that continues to sicken and kill throughout the world, we are understandably focused on communicable infectious diseases.  Reviewing the history of pellagra reminds us that deficiency diseases too have plagued humankind, and in turn brought forth the best that science – deriving here from the collaboration of laboratory researchers, epidemiologists, and public health scientists – has  to offer.  Louis Pasteur, Robert Koch, and Walter Reed are the names that  leap to the foreground in considering the triumphs of bacteriology.  Casimir Funk, Joseph Goldberger, Edgar Sydenstricker, and Conrad Elvehjem are murky background figures that barely make it onto the radar.

In the developed world, pellagra is long gone, though it remains  common in Africa, Indonesia, and China.  But the entrenched commercial and political interests that Goldberger and his PHS team battled to the mat are alive and well.  Over the course of the Covid pandemic, they have belittled public health experts and bewailed CDC protocols that limit “freedom” to contract the virus and infect others.  In 1914, absent Goldberger and his band of Rough Riders, the South would have languished with a seasonally crippled labor force far longer than it did.  Mill owners, cotton-growing farmers, and politicians would have shrugged and accepted the death toll as a cost of doing business.

Let us pause, then, and pay homage to Goldberger and his PHS colleagues.  They were heroes willing to enter an inhospitable region of the country and, among other things, ingest pills of pellagrin scabs and excreta to prove that pellagra was not a contagious disease.  There are echoes of Goldberger in Anthony Fauci, William Schaffner, Ashish Jha, and Leana Wen as they relentlessly fan the embers of  scientific awareness among those who resist an inconvenient truth: that scientists, epidemiologists, and public health officers know things about pandemic management that demagogic politicians and unfit judges do not.  Indeed, the scientific illiterati appear oblivious to the fact that the health of the public is a bedrock of the social order, that individuals ignore public health directives and recommendations at  everyone’s peril.  This is no less true now than it was in 1914.  Me?  I say,  “Thank you, Dr. Goldberger.  And thank you, Dr. Fauci.” 

___________________________           

[1] My material on Babcock and the early pellagra conferences at the South Carolina State Hospital comes from Charles S. Bryan, Asylum Doctor:  James Woods Babcock and the Red Plague of Pellagra (Columbia: Univ. of S C Press, 2014), chs. 3-5.

[2] Alan Kraut, Goldberger’s War:  The Life and Work of a Public Health Crusader (NY: Hill & Wang, 2004), 7.

[3] To be sure, the “filth parties” did not rule out the possibility of animal or insect transmission of a microorganism.  Goldberger’s wife Mary, incidentally, was transfused with pellagrin blood but didn’t ingest the filth pills.

[4] Kraut, Goldberger’s War, 164.

[5] Quoted in Kraut, Goldberger’s War, 190.

[6] On Hayne, Goldberger’s loudest and most vitriolic detractor among southern public health officers, see Kraut, pp. 118, 194; Bryan, Asylum Doctor, pp. 170, 223, 232, 239; and Elizabeth Etheridge, The Butterfly Caste: A Social History of Pellagra in the South (Westport, CT: Greenwood, 1972), 42, 55, 98-99, 110-111.  This is the same James Hayne who in October 1918, in the midst of the Great Pandemic, advised the residents of South Carolina that “The disease itself is not so dangerous: in fact, it is nothing more than what is known as ‘Grippe’” (“Pandemic and Panic: Influenza in 1918 Charleston” [https://www.ccpl.org/charleston-time-machine/pandemic-and-panic-influenza-1918-charleston#:~:text=Pandemic%20and%20panic%20visited%20Charleston,counter%20a%20major%20health%20crisis]).

[7] The tryptophan experiments were conceived and conducted by Goldberger’s assistant, W. F. Tanner, who, after Goldberger’s return to Washington, continued to work out of the PHS laboratory at Georgia State Sanitarium (Kraut, Goldberger’s War, 203-204, 212-214).

[8] Kraut, Goldberger’s War, 216-222, quoted at 221. 

[9] Goldberger died from hypernephroma, a rare form of kidney cancer, on January 17, 1929.   Prior to the discovery of niacin, in tribute to Goldberger, scientists referred to the P-P factor as Vitamin G. 

[10] The only monographic study of Funk in English, to my knowledge, is Benjamin Harrow, Casimir Funk, Pioneer in Vitamins and Hormones (NY:  Dodd, Mead, 1955).  There are, however, more recent articles providing brief and accessible overviews of his achievements, e.g., T. H. Jukes, “The prevention and conquest of scurvy, beriberi, and pellagra,” Prev. Med., 18:877-883, 1989;  Anna Piro, et al., “Casimir Funk: His discovery of the vitamins and their deficiency disorders,” Ann. Nutr. Metab., 57:85-88, 2010.

[11] Renato Mariani-Costantini & Aldo Mariani-Costantini, “An outline of the history of pellagra in Italy,” J. Anthropol. Sci., 85:163-171, 2007.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaxxers in Free Fall

I read a news story in which a man is dying of Covid-19 in the hospital.  He is asked whether he regrets not getting vaccinated and rallies enough to reply, “No, I don’t believe in the vaccine.”  So what then does he believe in?  Systemic viral infection, suffering, and death?  If you don’t believe in vaccination, you don’t believe in modern medicine in toto.  You don’t believe in bacteriology, virology, cellular biology, microbiology, or immunology.  What then is left to prevent, diagnose, and treat disease?  Trump-ish medievalism, mysticism, shamanism, divine intervention?

A study by researchers at Harvard’s Brigham and Women’s Hospital used natural language processing to comb through 5,307 electronic patient records of adult type 2 diabetics living in Massachusetts and followed by their primary care physicians between 2000 and 2014.  They found that 43% (2,267) of patients refused to begin insulin therapy when their doctors recommended it.  Further, diabetics who declined the recommendation not only had higher blood sugar levels than those who began insulin, but had greater difficulty achieving glycemic control later on.[1]  So what do the insulin-declining diabetics believe in?  Chronic heart and kidney disease, blindness, and amputation – the all but inevitable sequelae of poorly managed diabetes?

The problem, really an epistemological problem, is that such people apparently have no beliefs at all – unless one imputes to them belief in disease, suffering, and death, and, in the case of Covid vaccine resisters, the prerogative to inflict them on others.  Theirs is not even a scientifically specious belief system that unintentionally harms others.  During the Yellow Fever epidemic that left Philadelphia in ruins in 1793, Dr. Benjamin Rush, highly acclaimed throughout the newborn nation, set about his curative missions by draining his patients, in successive bleedings, of up to four pints of blood while simultaneously purging them (i.e., evacuating their bowels) with copious doses of toxic mercury.

Rush’s “Great Purge,” adopted by his followers, added hundreds, perhaps thousands, to the death toll in Philadelphia alone.  But at least Rush’s “system” derived from a belief system.  He did in fact find a theoretical rationale for his regimen in an essay by the Virginia physician and mapmaker John Mitchell.  Describing yellow fever in Virginia in 1741, Mitchell noted that in yellow fever the “abdominal viscera were filled with blood, and must be cleaned out by immediate evacuation.”[2]   Bleeding, of course, was conventional treatment for all manner of disease in 1793, so Mitchell’s recommendation came as no surprise.  Taken in conjunction with the system of mercuric purges employed by Dr. Thomas Young during the Revolutionary War, Mitchell’s essay gave Rush all the grounding he required for a ruinously misguided campaign that greatly extended the recovery time of those it did not kill.  But, yes, he had his theory, and he believed in it.

In the early 19th century, Napoleon, sweeping through Europe, conquers the north Italian province of Bolzano, which in 1807 he incorporates into Bavaria.  Two years later, when the Bavarian government mandates smallpox vaccination for all residents, the newly absorbed Italians launch an armed revolt, partly because they believe vaccination will inject Protestantism into their Catholic veins.[3]

All right, it is a nonsensical belief, even in 1809, but it is still a belief of sorts.  It is epistemically flawed, because it fails to stipulate what exactly makes a substance inherently Protestant in nature; nor does it posit a mechanism of transmission whereby a Protestant essence seeps into liquid smallpox vaccine in the first place.  In the realm of ethics, it suggests that the possibility of death pales alongside the certainty of spiritual contamination by a fluid that, however neutral in life-saving potency, is injected by a Protestant hand.

Only slightly less ridiculous to modern ears is the mid-19th-century belief that general anesthesia via ether or chloroform, introduced by James Young Simpson in 1847, must be withheld from women giving birth.  The reason?  Genesis 3:16 enjoins women to bring forth new life in suffering.  Forget that the belief was espoused solely by certain men of the cloth and male physicians,[4] and was based on a highly questionable rendering of the biblical Hebrew.  Forget as well that, for Christians, Christ’s death redeemed humankind, relieving women of the need to relive the primal curse.  Bear in mind further that the alleged curse would also forbid, inter alia, the use of forceps, caesarean operations, and embryotomy.  A woman with a contracted pelvis would die undelivered because she is guilty of a sin over which she has no control – that of having a contracted pelvis.

In a secular nation whose founding documents assert everyone’s right to pursue happiness in his or her own pain-free terms, we see the primal curse as archaic misogynist drivel, no less absurd than the belief that the Bible, through some preternatural time warp, forbids vaccination.  But, hey, it’s a free country, and if a mid-19th-century or early-21st-century man chooses to believe that anesthesia permits men to escape pain whenever possible but women only in male-sanctioned circumstances, so be it.  It is a belief.

Now it’s 1878, and the worst yellow fever epidemic in American history is sweeping across the lower Mississippi Valley, taking lives and destroying commerce in New Orleans, Memphis, and surrounding cities and towns to which refugees are streaming.  The epidemic will reach the Ohio Valley, bringing deadly Yellow Jack to Indiana, Illinois, and Ohio.  Koch’s monograph on the bacteriology of sepsis (wound infection) appears that very year, but neither his work nor that of Lister is universally accepted in the American South.  Nor would its precepts have counted for much in the face of a viral (not bacterial) invader carried up the Mississippi from Havana.

What can city boards of health do in the face of massive viral infection, suffering, and death?  Beyond imposing stringent new sanitary measures, they can quarantine ships arriving in their harbors until all infected crew members have either died or been removed and isolated.  This will prevent the newly infected from infecting others and crippling cities still further – assuming, that is, a belief system in which yellow fever is contagious and spread from person to person.

But in 1878 Memphis, where by September the epidemic is claiming 200 lives a day, this “modern” belief is widely contested among the city’s physicians.  Some are contagionists, who believe that disease is caused by invisible entities that are transmissible.  But others, greater in number, favor the long-held theory that infectious disease results from “miasma” or bad air – air rendered toxic by decaying plant and animal matter in the soil.  If you believe miasma causes disease, then you’re hard-pressed to understand how quarantining ships laden with sick people will do anything to control the epidemic.

This was precisely the position of the 32 Memphis physicians who defeated the city council’s plan to institute a quarantine and set up a quarantine station.  Quarantine is pointless in the face of bad air.  The city’s only recourse, so held the 32, was to alter the “epidemic constitution” of the atmosphere by inundating it with smoke.  Cannon blasts and blazing barrels of tar up and down city streets – that’s the ticket to altering the atmospheric conditions that create infectious disease.[6]

The miasmic theory of disease retained a medical following throughout the 1870s, after which it disappeared in the wake of bacteriology.  But in Memphis in 1878, bad air was still a credible theory in which physicians could plausibly believe.  And this matter of reasonable belief – reasonable for a particular time and place – takes us back to the hospitalized Covid patient of 2021 who, with virtually his last breath, defends his decision to remain unvaccinated because he doesn’t believe in the vaccine.  What is the knowledge base that sustains his disbelief?  There isn’t any.  He has no beliefs, informed or otherwise, about bacteriology, virology, cellular biology, or immunology.  At best, he has decided to accept what someone equally belief-less has told him about Covid vaccination, whether personally, in print, or over the internet.

It is no different among the 43% of Massachusetts diabetics who, a century after Banting’s and Best’s miraculous discovery, declined insulin therapy when their doctors recommended it.  Their disbelief is actually a nonbelief because it is groundless.  For some, perhaps, the refusal falls back on a psychological inability to accept that one is diabetic enough to warrant insulin.  They resist the perceived stigma of being insulin-dependent diabetics.[7]  Here at least the grounds of refusal are intelligible and remediable.  An insulin phobia does not sustain real-world belief; it is an impediment to such belief in relation to diabetes and insulin, illness and long-term health, lesser and greater life expectancy.

Back in the present, I read another news story in which two unvaccinated hospital nurses explain to a journalist that they have refused Covid vaccination because the vaccines’ effectiveness is based on “junk data.”  Really?  Here there is the glimmering of a belief system, since scientific data can be more or less robust, more or less supportive of one or another course of action.

But what exactly makes Covid vaccine data worthless, i.e., junk?  And how have these two nurses acquired the expertise in epidemiology, population statistics, and data analysis to pass judgment on data deemed credible and persuasive by scientists at Pfizer, Moderna, Johnson & Johnson, the CDC, and the WHO?  And how, pray tell, have they gained access to these data?  Like all opponents of vaccine science, they pontificate out of ignorance, as if the mere act of utterance confers truth-value on what is being uttered.  It’s an extreme example of asserting as fact what remains to be demonstrated (the fallacy of petitio principii), the legacy of an ex-president who elevated pathological lying to a political art form.

Even the nurses pale alongside the anti-vax protester who is pictured in a news photo holding a sign that reads, “Vaccines Kill.”[8]  Whom do they kill and under what circumstances?  Does he mean all vaccines are deadly and kill people all the time, or just certain vaccines, such as the Covid vaccine?   But what does it matter?  The sign holder doesn’t know anything about any vaccines.  Does he really believe that everything we know about the history of vaccine science from the time of Jenner is bogus, and that children who once died from smallpox, cholera, yellow fever, diphtheria, pertussis, typhoid, typhus, tetanus, and polio are still dying in droves, now from the vaccines they receive to protect them from these infectious diseases during the earliest years of life?  Is the demographic fact that, owing to vaccination and other public health measures, life expectancy in the U.S. has increased from 47 in 1900 to 77 in 2021 also based on junk data?  In my essay, Anti-vaccinationism, American Style, I provide statistics on the total elimination in the U.S. of smallpox and diphtheria, and virtual elimination of polio.  Were my claims also based on junk data?  If so, I’d appreciate being directed to the data that belie these facts and demonstrate that, in point of fact, vaccines kill.

Maybe the man with the sign has an acquaintance who got sick from what he believed to be a vaccine?  Perhaps someone on his internet chat group heard of someone else who became ill, or allegedly died, after receiving a vaccine.  Of course, death can follow vaccination without being caused by it.  Do we then assume that the man with the sign and like-minded protesters are well-versed in the difference between causation and correlation in scientific explanation?

We know that for a tiny number of individuals aspirin kills.[9]   So why doesn’t the man hold up a sign that reads, “Aspirin Kills”?  Here at least, he would be calling attention to a scientific fact that people with GI conditions should be aware of.  We know that sugary drinks have been linked to 25,000 deaths in the U.S. each year.  Why not a sign, “Soda Kills”?  It would at least be based on science.  He chooses not to proclaim the lethality of aspirin or soda because he cares no more about aspirin- or soda-related deaths than Covid-related deaths.  If he did care, then, like the two nurses with their junk data and the Covid patient announcing disbelief in Covid vaccination on his deathbed, he would have to anchor his belief in consensually accepted scientific facts – a belief that someone, anyone, might find believable.

He is no different than other American anti-vaxxers I read about in the paper. They are the epistemological Luddites of our time, intent on wrecking the scientific machinery of disease prevention, despite profound ignorance of vaccine science and its impact on human affairs since the late 18th century.  Indeed, they see no need to posit grounds of belief of any kind, since their anger – at Covid, at Big Government, at Big Science, at Big Medicine, at Big Experts – fills the epistemic void.  It fuels what they offer in place of the science of disease prevention:  the machinery of misinformation that is their stock in trade.

And therein is the source of their impotence.  They have fallen into an anti-knowledge black hole, and struggle to fashion an existence out of anger that – to push the anti-matter trope a little further – repels rational thought.  Their contrarian charge is small solace for the heightened risks of disease, suffering, and death they incur, and, far less conscionably, impose on the rest of us.

______________________

[1] N. Hosomura, S. Malmasi, et al., “Decline of Insulin Therapy and Delays in Insulin Initiation in People with Uncontrolled Diabetes Mellitus,” Diabetic Med., 34:1599-1602, 2017.

[2] J. M. Powell, Bring Out Your Dead:  The Great Plague of Yellow Fever in Philadelphia in 1793 (Phila: Univ. of Pennsylvania Press, 1949), 76-78.

[3] My thanks to my friend Marty Meyers for bringing to my attention this event of 1809, as reported by Emma Bubola, “In Italy’s Alps, Traditional Medicine Flourishes, as Does Covid,” New York Times, December 16, 2021.

[4] With reason, wrote Elizabeth Cady Stanton in The Woman’s Bible (1895), “The Bible and the Church have been the greatest stumbling blocks in the way of women’s emancipation.”

[5] For a fuller examination of the 19th-century debate on the use of general anesthesia during childbirth, see Judith Walzer Leavitt, Brought to Bed: Childbearing in America, 1750-1950 (NY:  OUP, 1986), ch. 5.

[6] On the measures taken to combat the epidemic in Memphis, including the rift between contagionist and noncontagionist physicians, see John H. Ellis, Yellow Fever and Public Health in the New South (Lexington: Univ. Press of Kentucky, 1992), ch. 3.

[7] A. Hussein, A. Mostafa, et al., “The Perceived Barriers to Insulin Therapy among Type 2 Diabetic Patients,” African Health Sciences, 19:1638-1646, 2019.

[8] Now, sadly, we have gone from hand-written “Vaccines Kill” signs to highway billboards, e.g., https://www.kxxv.com/hometown/mclennan-county/a-new-billboard-in-west-claims-vaccines-kill.

[9] Patients prescribed aspirin before developing a GI bleed or perforation are prominent among those killed by aspirin.  See A. Lanas, M. A. Perez-Aisa, et al., “A Nationwide Study of Mortality Associated with Hospital Admission and Those Associated with Nonsteroidal Antiinflammatory Drug Use,” Am. J.  Gastroenterol., 100:1685-1693, 2005; S. Straube, M. R. Trainer, et al., “Mortality with Upper Gastrointestinal Bleeding and Perforation,” BMC Gastroenterol., 8: 41, 2009.

Unmasked and Unhinged

The Great Influenza, the Spanish Flu, a viral infection spread by droplets and mouth/nose/hand contact, laid low the residents of dense American cities, and spurred municipal officials to take new initiatives in social distancing.[1]  City-wide bans on public gatherings included closing schools, theaters, motion picture houses, dance halls, and – perish the thought – saloons.  In major cities, essential businesses that remained open had to comply with new regulations, including staggered opening and closing times to minimize crowd size on streets and in trolleys and subways.  Strict new sanitation rules were the order of the day.  And yes, eight western cities, not satisfied with preexisting regulations banning public spitting and the use of common cups, or even new regulations requiring the use of cloth handkerchiefs when sneezing or coughing, went the whole nine yards:  they passed mask-wearing ordinances.

In San Francisco and elsewhere, outdoor barbering and police courts were the new normal.

The idea was a good one; its implementation another matter.  In the eight cities in question, those who didn’t make their own masks bought masks sewn from wide-mesh gauze, not the tightly woven medical gauze, four to six layers thick, worn in hospitals and recommended by authorities.  Masks made at home from cheesecloth were more porous still.  Nor did most bother washing or replacing masks with any great frequency.  Still, these factors notwithstanding, the consensus is that masks did slow down the rate of viral transmission, if only as one component of a “layered” strategy of protection.[2]   Certainly, as commentators of the time pointed out, masks at least shielded those around the wearer from direct in-your-face (literally) droplet infection from sneezes, coughs, and spittle.  Masks couldn’t hurt, and we now believe they helped.

Among the eight cities that passed mask-wearing ordinances, San Francisco took the lead.  Its mayor, James Rolph, with a nod to the troops packed in transport ships taking them to war-torn France and Belgium, announced that “conscience, patriotism and self-protection demand immediate and rigid compliance” with the mask ordinance. By 1918, masks were entering hospital operating theaters, especially among assisting nurses and interns.[3]  But mask-wearing in public by ordinary people was a novelty.  In a nation gripped by life-threatening influenza, however, most embraced masks and wore them proudly as emblems of patriotism and public-mindedness.   Local Red Cross volunteers lost no time in adding mask preparation to the rolling of bandages and knitting of socks for the boys overseas.

A trolley conductor waves off an unmasked citizen. The image is from Seattle, another city with a mask-wearing ordinance.

But, then as now, not everyone was on board with face masks.  Then as now, there were protesters.  In San Francisco, they were small in number but large in vocal reach.  The difference was that in 1918, cities like San Francisco meant business, with violators of mask laws fined $5 or $10 or imprisoned for 10 days.  On the first day the ordinance took effect, 110 were arrested, many with masks dangling around their necks.

San Francisco police arrest “mask slackers,” one of whom has belatedly put on a mask.

In mid-November, following the signing of the Armistice, city officials mistakenly believed the pandemic had passed and rescinded the ordinance.  At noon, November 21, at the sound of a city-wide whistle, San Franciscans rose as one and tossed their masks onto sidewalks and streets.   In January, however, following a spike in the number of influenza cases, a second mask-wearing ordinance was passed by city supervisors, at which point a small, self-styled Anti-Mask League – the only such League in the nation – emerged on the scene.[4]

A long line of San Franciscans waiting to purchase masks in 1919.  A few already have masks in place.

The members of the League did not take matters lying down, nor were they content to point out that masks of questionable quality, improperly used and infrequently replaced, probably did less good than their proponents suggested.  Their animus was trained on the very concept of obligatory mask-wearing, whatever its effect on transmission of the unidentified influenza microbe.  At a protest of January 27, “freedom and liberty” was their mantra.  Throwing public health to the wind, they lumped together mask-wearing, the closing of city schools, and the medical testing of children in school.  Making sure sick children did not infect healthy classmates paled alongside the sacrosanctity of parental rights.  For the protesters, then as now, parental rights meant freedom to act in the worst interests of the child.

___________________

One wants to say that the Anti-Mask League’s short-lived furor over mask-wearing, school closings, and testing of school children is long behind us.  But it is not.  In the matter of contagious infectious disease – and expert recommendations to mitigate its impact – what goes around comes around.  In the era of Covid-19, when San Francisco mayor London Breed ordered city residents “to wear face coverings in essential businesses, in public facilities, on transit and while performing essential work,” an animated debate on mask-wearing among city officials and the public ensued.  A century of advances in the understanding of infectious disease, including the birth and maturation of virology, still counts for little among the current crop of anti-maskers.  Their “freedom” to opt for convenience trumps personal safety and the safety of others.  Nor does a century of improvements in mask fabrics, construction, comfort, and effectiveness mitigate the adolescent wantonness of this freedom one iota.

“Liberty and freedom.”  Just as the Anti-Mask League’s call to arms masked a powerful political undertow, so too with the anti-vaxxers and anti-maskers of the present.  Times change; some Americans – a much higher percentage now than in 1918 – do not.  Spearheaded by Trumpian extremists mired in fantasies of childlike freedom from adult responsibility, the “anti” crowd still can’t get its head around the fact that protecting the public’s health – through information, “expert” recommendations and guidelines, and, yes, laws – is the responsibility of government.  The responsibility operates through the Commerce Clause of the Constitution, which gives the federal government broad authority to impose health measures to prevent the spread of disease from a foreign country.  It operates through the Public Health Service Act, which gives the Secretary of Health and Human Services authority to lead federal health-related responses to public health emergencies.  And it operates through the 10th Amendment to the Constitution, which grants states broad authority to take action during public health emergencies.  Quarantine and restricted movement of those exposed to contagious disease, business restrictions, stay-at-home orders – all are among the “broad public health tools” available to governors.[5]

When a catastrophe, natural or man-made, threatens public health and safety, this responsibility, this prerogative, this Constitutional mandate, may well come down with the force of, well, mandates, which is to say, laws.  At such moments in history, we are asked to step up and accept the requisite measure of inconvenience, discomfort, and social and economic restriction because it is intrinsic to the civil liberties that make us a society of citizens, a civil society. 

Excepting San Francisco’s anti-masker politicos, it is easier to make allowances for the inexpert mask wearers of 1918 than for the anti-mask crusaders of today.  In 1918, many simply didn’t realize that pulling masks down below the nose negated whatever protection the masks provided.  The same is true of the well-meaning but guileless who made small holes in the center of their masks to allow for insertion of a cigarette.  It is much harder to excuse the Covid-19 politicos who resisted mask-wearing during the height of the pandemic and now refuse to don face masks in supermarkets and businesses as requested by store managers.  The political armor that shields them from prudent good sense, respect for store owners, and the safety of fellow shoppers is of a decidedly baser metal.

The nadir of civil bankruptcy is their virulent hostility toward parents who, in compliance with state, municipal and school board ordinances – or even in their absence – send their children to school donned in face masks.  The notion that children wearing protective masks are in some way being abused, tormented, damaged pulls into its orbit all the rage-filled irrationality of the corrosive Trump era.  Those who would deny responsible parents the right to act responsibly on behalf of their children are themselves damaged.  They bring back to life in a new and chilling context that diagnostic warhorse of asylum psychiatrists (“alienists”) and neurologists of the 19th century:  moral insanity.  

The topic of child mask-wearing, then and now, requires an essay of its own.  By way of prolegomenon, consider the British children pictured below.  They are living, walking to school, sitting in their classrooms, and playing outdoors with bulky gas masks in place during the Blitz of London in 1940-1941.  How could their parents subject them to these hideous contraptions?  Perhaps parents sought to protect their children, to the extent possible, from  smoke inhalation and gas attack from German bombing raids.   It was a response to a grave national emergency.  A grave national emergency.  You know, like a global pandemic that to date has brought serious illness to over 46.6 million Americans and claimed over 755,000 American lives.  

 


[1] For an excellent overview of these initiatives, see Nancy Tomes, “’Destroyer and Teacher’: Managing the Masses During the 1918-1919 Influenza Pandemic,” Public Health Rep. 125(Suppl 3): 48–62, 2010.  My abbreviated account draws on her article.

[2] P. Burnett, “Did Masks Work? — The 1918 Flu Pandemic and the Meaning of Layered Interventions,” Berkeley Library, Oral History Center, University of California, May 23, 2020  (https://update.lib.berkeley.edu/2020/05/23/did-masks-work-the-1918-flu-pandemic-and-the-meaning-of-layered-interventions).  Nancy Tomes, “’Destroyer and Teacher’” (n. 1), affirms that the masks were effective enough to slow the rate of transmission. 

[3]  Although surgical nurses and interns in the U.S. began wearing masks after 1910, surgeons themselves generally refused until the 1920s: “the generation of head physicians rejected them, as well as rubber gloves, in all phases of an operation, as they were considered ‘irritating’.”  Christine Matuschek, Friedrich Moll, et al., “The History and Value of Face Masks,” Eur. J. Med. Res., 25: 23, 2020.

[4] My brief summary draws on Brian Dolan, “Unmasking History: Who Was Behind the Anti-Mask League Protests During the 1918 Influenza Epidemic in San Francisco,” Perspectives in Medical Humanities, UC Berkeley, May 19, 2020.  Another useful account of the mask-wearing ordinance and the reactions to it is the “San Francisco” entry of The American Influenza Epidemic of 1918-1919: A Digital Encyclopedia, produced by the University of Michigan Center for the History of Medicine and Michigan Publishing (www.unfluenzaarchive.org/city/city-sanfrancisco.html).

[5] American Bar Association, “Two Centuries of Law Guide Legal Approach to Modern Pandemic,” Around the ABA, April 2020 (https://www.americanbar.org/news/abanews/publications/youraba/2020/youraba-april-2020/law-guides-legal-approach-to-pandem).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Everything You Didn’t Want to Know About Typhoid

Extreme fatigue; dangerously high fever; severe abdominal pain; headaches; diarrhea or constipation; nausea and vomiting – the symptoms of severe typhoid fever can be a panoply of horrors. Like the cholera vibrio, the bacteria in question – Salmonella typhi, cousin to the Salmonella that causes food poisoning – find a home in water and food contaminated with human feces. The infection is contracted only by humans, and it is highly contagious. More persons contract it from human contact – often from unwashed hands following defecation – than from drinking contaminated water or ingesting contaminated food. But the latter are hardly incidental causes. At least two billion people worldwide, the World Health Organization tells us, drink feces-contaminated water.[1]

And the story gets worse. Through the 19th century, “chronic carriers” could not be conceptualized, much less detected. They were symptom-free folks in whom typhi found safe harbor in the gall bladder, where they traveled with stored bile through the bile duct into the small intestine en route to fecal expulsion. The chronic carriers brought infection to entire communities in sudden, explosive outbreaks; typhoid is a prime example of what epidemiologists term a “fulminant” disease (from the Latin fulmināre, to strike with lightning). And worse still, the ranks of chronic carriers were enlarged by some of those who contracted the disease and recovered. Typhi lived on in their gall bladders as well, and were passed on to others via the same fecal-oral route.

The Mother of all Chronic Carriers, the Super Spreader who comes down to us as Typhoid Mary, was one Mary Mallon, an Irish cook who passed on typhi to no fewer than 53 members of seven prominent Manhattan-area households between 1900 and 1906. In 1907 she was quarantined in a bungalow on New York’s North Brother Island near Riverside Hospital, only to resume her career as cook and super spreader on release in 1910. Tracked down five years later, she was whisked back to her island bungalow, where she lived out her remaining 23 years.

Here is what Salmonella typhi do once ingested through the mouth. Absent sufficient gastric acid to neutralize them in the stomach, the bacteria make their way to the terminal portion of the small intestine and enter the cells lining it. Intestinal cells respond to the invaders with a massive inflammatory response that can lead to an intestinal rupture, a hole, through which intestinal contents drain into the abdomen, with attendant and severe pain. And from there matters go from bad to worse. Without fast, effective treatment, the bacteria penetrate lymphatic tissue and enter the blood stream, which shuttles them to other organs: the liver, the spleen, the bone marrow. In the worst cases, bacterial ulceration can extend all the way through the lining of the terminal ileum, from which typhi flood the body, carrying infection to the brain, heart, and pancreas. Death is now around the corner; only major abdominal surgery holds any prospect of survival. It is a pernicious disease of microbial migratory urgency.

Improvements in water treatment and personal hygiene, along with antibiotic therapy and – yes! – a newly effective vaccine for adults, brought typhoid to its knees in the United States after World War II. But the disease is alive and well in Central and South America, Africa, and parts of Asia, where it claims between 11 and 21 million victims and some 200,000 deaths each year.[2] Typhi have evolved along with the antibiotics that control them, and multidrug-resistant (MDR) strains remain deadly. And even here, on these ostensibly sanitized shores, typhi can still make their presence known. As recently as 2010, nine Americans contracted typhoid, five in California and four in Nevada.[3]

But such instances are aberrational, and in the northern hemisphere typhoid fever has long since vanished from anyone’s disease-monitoring radar. Now federal and state governments, the bane of anti-vaccine irrationalists and mask naysayers, make sure we don’t drink water or eat food contaminated by microbe-laced feces. But it was not so for our forebears. In the Civil War, typhoid fever devastated North and South alike; the Union Army’s Satterlee General Hospital in West Philadelphia was constructed in 1862 largely to cope with its victims. In the Spanish-American War of 1898, typhoid fever shared center stage with yellow fever and, at war’s end, rated its own federal investigative commission. Chaired by Walter Reed, the Typhoid Commission determined that contact among soldiers (“comrade contact”) was primarily responsible for the transmission of typhoid fever in military camps.[4] Four years later, Koch’s investigations during a typhoid epidemic in Trier, Germany, led him to generalize the Commission’s finding: typhoid fever was contracted less from contaminated water or sewage than from asymptomatic carriers; the “carrier hypothesis” was among his final significant contributions.[5]

The era of modern typhoid prevention began in 1897, when Almroth Wright, then a pathologist at the British Army’s Medical School at Netley Hospital, published a paper on the typhoid vaccine he had developed with killed typhi. The Army took note and, in the South African War that began two years later, made very limited use of it: of 330,000 British troops, only 14,000 received the vaccine. It proved effective in this limited trial but never caught on after the war.[6] Beginning in 1902, the U.S. government’s Public Health and Marine Hospital Service, renamed the Public Health Service in 1912, focused its research on typhoid. Progress was made, and by the time America entered WWI, the PHS’s Hygienic Laboratory had developed an antityphoid vaccine.[7] American troops sailing to France in 1917 were not asked how they felt about receiving a typhoid vaccine; they received their mandatory shots and boarded their ships. Those who were not vaccinated stateside received their shots on arriving at their camps. Vaccination was not negotiable. The obligation to live and fight for the nation trumped the freedom to contract typhoid, suffer, and possibly die.

“A Monster Soup Commonly Called Thames Water,” a mid-19th-century etching depicting the stew of disease-promoting organisms in the river that supplied drinking water to Londoners.

The vaccine dramatically reduced the incidence of typhoid, but the disease still wrought damage in field and base hospitals, especially among unvaccinated European troops who had been fighting since 1914. American nurses who arrived in northern France and Belgium in advance of the troops remembered their misery at being transferred to typhoid wards, which, as one recalled, were “gloomy and dark.” Another described a typhoid scourge that crippled her hospital and created an urgent need to find space outside the hospital for the typhoid patients.[8]

_______________________________

The current governors of Texas and Florida would surely look askance at the history of typhoid control, since a key aspect of it – allowing children on school premises to drink only water subjected to antimicrobial treatment – ignores parental freedom of choice. Parents decide what their children eat, and they should be free to determine what kind of water they drink. Children are not military enlistees obligated to remain healthy in the service of the nation. What right do school boards have to abrogate the freedom of parents to determine what kind of water their children drink? Why should children be mandated to drink water subjected to modern sanitary treatment that robs it of Salmonella typhi along with Vibrio cholerae, Poliovirus, and dysentery-causing Shigella? Shouldn’t parents be free to have their children partake of nature’s bounty, to drink fresh water from streams and rivers, not to mention untreated well water contaminated with human feces and the pathogens it harbors?

And here is the Covid connection.  If local school boards and municipal authorities lack the authority to safeguard children, to the extent possible, through obligatory wearing of facemasks, then surely they lack the authority to force them to drink water filtered through layers of state and federal regulation informed by modern science.  Let parents be free to parent; let their children pay the steep, even life-threatening price.      

Did I mention that young children, along with immune-compromised young adults, are at greatest risk for contracting typhoid?  Well, now you know, and now, perhaps, we can return to reality.  State governors who do not understand the legal and moral imperative of acting in the best interests of the child[9] are unfit for public office of any sort.  In point of fact, they are unfit. Who wants governors who, in denying adults the right to act responsibly in the best interests of children, sanction child abuse?  Let them crawl back into the existential dung heap whence they emerged.    


[1] https://www.who.int/news-room/fact-sheets/detail/drinking-water.

[2] https://www.cdc.gov/typhoid-fever/health-professional.html.

[3] https://www.cdc.gov/salmonella/2010/frozen-fruit-pulp-8-25-10.html.

[4] Victor C. Vaughan, A Doctor’s Memories (Indianapolis: Bobbs-Merrill, 1926), 369ff., 386.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 255-256.

[6] Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (Cambridge: Harvard University Press, 1984), 54-55.

[7] Victoria A. Harden, Inventing the NIH: Federal Biomedical Research Policy, 1887-1937 (Baltimore:  Johns Hopkins University Press, 1986), 41.

[8] Grace McDougall, A Nurse at the War:  Nursing Adventures in Belgium and France (NY: McBride, 1917), 111, 117; Alice Isaacson, Diary of 1917, Library & Archives, Canada, letter of 16 February 1917. 

[9] Joseph Goldstein, Anna Freud, et al., In the Best Interests of the Child (New York:  Free Press, 1986).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.


Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans. Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek for “flow of bile.” Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults alike, within hours. Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions. Contagionists believed it spread from person to person. Noncontagionists attributed it to poisonous fumes, termed miasma, that emanated from soil and decaying matter of all kinds. Some attributed it to atmospheric changes; others thought it a byproduct of fermentation. Physician-moralists could be counted on to ascribe it to the moral failings of the underclass. Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place. These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy. Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting cholera sufferer with nightmarish “remedies” of the day.

      The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors. The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors alike in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!” After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.” “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse. The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible. Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease. Then as now, measures that safeguard the public from life-threatening infectious disease are good things. In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die. Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand: smallpox. The Vaccination Act of 1853 grew out of this resolve. Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months). Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment. The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health. The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions. When a new epidemic, which travelled from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera. The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargos.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the London physician John Snow traced the source of a cholera outbreak in his Soho neighborhood to contaminated well water. But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3] Now we rarely hear of cholera, since the water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4] Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards. There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera. Where has our freedom gone?

Caricature of 1866, “Death’s Dispensary,” depicting contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind. Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment. We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will be mild and tolerable, and certainly not life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization. But they will recover and enjoy immunity thereafter. Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of earlier generations. In America, a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.” The procedure, termed variolation, originated in parts of Europe and the Ottoman Empire in the early 16th century, reaching Britain and America two centuries later, in 1721. It involved inoculating the healthy with pus scraped from the skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5] The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6] To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on person-to-person inoculation and the small population of English cows that contracted cowpox, was a long time in coming. It took the United States most of the 19th century to produce and distribute an adequate supply of Jennerian vaccine. Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation. For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox: he had every soldier variolated before beginning military operations at Valley Forge in 1777.[8]

    But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable. In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures spurred in Europe and America by the cholera epidemics form an important chapter. The triumph of public health, which antedated the discovery of disease-causing bacteria by decades, joined vaccine science – indeed, science in general – in increasing life expectancy and vastly improving quality of life.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities. They are anti-science primitives in our midst, and I behold them with the same incredulity with which visitors to Jurassic Park beheld living, breathing dinosaurs. Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic. For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example. It apparently includes the prerogative to get sick and make others sick to the point of death. The anti-vaccinationists, prideful in their ignorance and luxuriating in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present. Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170[suppl]:3-11, 2005.

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.



The War on Children’s Plague

In the early 19th century, doctors called it angina maligna (gangrenous pharyngitis) or “malignant sore throat.” Then in 1826, the French physician Pierre-Fidèle Bretonneau grouped the two designations together under a single disease, diphtherite. It was a horrible childhood disease in which severe inflammation of the upper respiratory tract gave rise to a false membrane, a “pseudomembrane,” that covered the pharynx, larynx, or both. The massive tissue growth prevented swallowing and blocked airways, often leading to rapid death by asphyxiation. It felled adults and children alike, but younger children were especially vulnerable. Looking back on the epidemic that devastated New England in 1735-1736, the lexicographer Noah Webster termed it “literally the plague among children.” It was the epidemic, he added, in which families often lost all, or all but one, of their children.

A century later, diphtheria epidemics continued to target the young, especially those in cities.  Diphtheria, not smallpox or cholera, was “the dreaded killer that stalked young children.”[1]   It was especially prevalent during the summer months, when children on hot urban streets readily contracted it from one another when they sneezed or coughed or spat.  The irony is that a relatively effective treatment for the disease was already in hand.

In 1884, Robert Koch’s assistant Friedrich Loeffler published a paper identifying the bacillus – the rod-shaped bacterium Corynebacterium diphtheriae first described by Edwin Klebs – as the cause of diphtheria. German scientists immediately went to work, injecting rats, guinea pigs, and rabbits with live bacilli, and then injecting their blood serum – blood from which cells and clotting factors have been removed – into infected animals to see if the serum could produce a cure. Then they took blood from the “immunized” animals, reduced it to the cell-free blood liquid, and injected it into healthy animals. The latter, to their amazement, did not become ill when injected with diphtheria bacilli. This finding was formalized in the classic paper of Emil von Behring and Shibasaburo Kitasato of 1890, “The Establishment of Diphtheria Immunity and Tetanus Immunity in Animals.” For this work, von Behring was awarded the very first Nobel Prize in Medicine in 1901.

Thus the birth of blood serum therapy, precursor of modern vaccines and antibiotics alike. By the early 1890s, Émile Roux and his associates at the Pasteur Institute had discovered that infected horses, not the rabbits used by Behring and Kitasato, produced the most potent diphtheria serum of all. Healthy horses injected with a heat-killed broth culture of diphtheria, it was found, could survive repeated inoculations with the live bacilli. Their serum, typically referred to as antitoxin, neutralized the highly poisonous substances – the exotoxins – secreted by diphtheria bacteria.

And there was more: horse serum provided a high degree of protection for another mammal, viz., human beings. Among people who received an injection of antitoxin, only one in eight developed symptoms on exposure to diphtheritic individuals. In 1895, two American drug companies, H. K. Mulford of Philadelphia and Parke, Davis of Detroit, began manufacturing diphtheria antitoxin. To be sure, their drug provided only short-term immunity, but it sufficed to cut the U.S. death rate among hospitalized diphtheria patients in half. This fact, astonishing for its time, fueled the explosion of disease-specific antitoxins, some quite effective, some less so. By 1904, Mulford alone had antitoxin preparations for anthrax, dysentery, meningitis, pneumonia, tetanus, streptococcal infections, and of course diphtheria.

Colorful Mulford antitoxin ad from the early 20th century, featuring, of course, the children.

In the era of Covid-19, there are echoes all around of the time when diphtheria permeated the nation’s everyday consciousness. Brilliant scientists, then and now, deploying all the available resources of laboratory science, developed safe and effective cures for a dreaded disease.  But more than a century ago, the public’s reception of a new kind of preventive treatment – an injectable horse-derived antitoxin – was unsullied by the resistance of massed anti-vaccinationists whose anti-scientific claims are amplified by that great product of 1980s science, the internet. 

To be sure, in the 1890s and early 20th century, fringe Christian sects anticipated our own selectively anti-science Evangelicals. It was sacrilegious, they claimed, to inject the blood product of beasts into human arms, a misgiving that did nothing to curb their appetite for enormous quantities of beef, pork, and lamb. Obviously, their God had given them a pass to ingest bloody animal flesh. Saving children’s lives with animal blood serum was apparently a different matter.

During the summer months, parents lived in anxious expectation of diphtheria every day their children ventured onto city streets. Their fear was warranted and not subject to the denials of self-serving politicians. In 1892, New York City’s Health Department established the first publicly funded bacteriological laboratory in the country, and between 1892 and the summer of 1894, the lab proved its worth by developing a bacteriological test for diagnosing diphtheria. Infected children could now be sent to hospitals and barred from public schools. Medical inspectors, armed with the new lab tests, went into the field to enforce a plethora of health department regulations.

Matters were simplified still further in 1913, when the Viennese pediatrician Bela Schick published the results of experiments demonstrating how to test children for the presence or absence of diphtheria antitoxin without sending their blood to a city lab. Armed with the “Schick test,” public health physicians and nurses could quickly and painlessly determine whether or not a child was immune to diphtheria.  For the roughly 30% of New York City school children who had positive reactions, injections of antitoxin could be given on the spot.  A manageable program of diphtheria immunization in New York and other cities was now in place.    

What about public resistance to the new proto-vaccine? There was very little outside of religious fringe elements. In the tenement districts, residents welcomed public health inspectors into their flats. Intrusion into their lives, it was understood, would keep their children healthy and alive, since it led to aggressive intervention under the aegis of the Health Department.[2] And it was not only the city’s underserved, immigrants among them, who got behind the new initiative. No sooner had Hermann Biggs, head of the city’s bacteriological laboratory, set in motion the lab’s inoculation of horses and preparation of antitoxin than the New York Herald stepped forward with a fund-raising campaign that revolved around a series of articles dramatizing diphtheria and its “solution” in the form of antitoxin injections. The campaign raised sufficient funds to provide antitoxin for the Willard Parker Hospital, reserved for patients with communicable diseases, and for the city’s private physicians as well. In short order, the city decided to provide antitoxin to the poor free of charge, and by 1906 the Health Department had 318 diphtheria antitoxin stations administering free shots in all five boroughs.[3][4]

A new campaign by New York City’s Diphtheria Prevention Commission was launched in 1929 and lasted two years.   As was the case three decades earlier, big government, represented by state and municipal public health authorities, was not the problem but the solution.  To make the point, the Commission’s publicity campaign adopted military metaphors.  The enemy was not government telling people what to do; it was the disease itself along with uncooperative physicians and recalcitrant parents.  “The very presence of diphtheria,” writes Evelynn Hammonds, “became a synonym for neglect.”[5]     

The problem with today’s Covid anti-vaccinationists is that their opposition to vaccination is erected on a foundation of life-preserving vaccination science of which they, their parents, their grandparents, and their children are beneficiaries. They can shrug off the need for Covid-19 vaccination because they have been successfully immunized against the ravages of debilitating childhood diseases. Unlike adults of the late 19th and early 20th centuries, they have not experienced, up close and personal, the devastation wrought summer after summer, year after year, by the diphtheria bacillus. Nor have they lost children to untreated smallpox, scarlet fever, cholera, tetanus, or typhus. Nor, finally, have they beheld, in their own lives, the miraculous transition to a safer world in which children stopped contracting diphtheria en masse, and in which those who did contract the disease were usually cured by antitoxin injections.

In the 1890s, the citizens of New York City had it all over the Covid vaccine resisters of today.  They realized that the enemy was not public health authorities infringing on their right to keep themselves and their children away from antitoxin-filled syringes. No, the enemy was the microorganism that caused them and especially their children to get sick and sometimes die. 

Hail the supreme common sense that led them forward, and pity those among us for whom the scientific sense of the past 150 years has given way to the frontier “medical freedom” of Jacksonian America. Anti-vaccinationist rhetoric, invigorated by the disembodied camaraderie of internet chat groups, does not provide a wall of protection against Covid-19. Delusory thinking is no less delusory because one insists, in concert with others, that infection can be avoided without the assistance of vaccination science. The anti-vaccinationists need to be vaccinated along with the rest of us. A healthy dose of history wouldn’t hurt them either.


[1] Judith Sealander, The Failed Century of the Child: Governing America’s Young in the Twentieth Century (Cambridge: Cambridge Univ. Press, 2003), p. 326.

[2] Evelynn Maxine Hammonds, Childhood’s Deadly Scourge: The Campaign To Control Diphtheria in New York City, 1880-1930 (Baltimore: Johns Hopkins University Press, 1999), 84-86.

[3] William H. Park, “The History of Diphtheria in New York City,” Am. J. Dis. Child., 42:1439-1445, 1931.

[4] Marian Moser Jones, Protecting Public Health in New York City: Two Hundred Years of Leadership, 1805-2005 (NY: New York City Department of Health and Mental Hygiene, 2005), 20.                                     

[5] Hammonds, op. cit., p. 206.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.