
Review: How to Survive a Plague (2012), Dir. David France

In November 2016, Knopf published David France’s How to Survive a Plague: The Inside Story of How Citizens and Science Tamed AIDS. While I certainly intend to read the book, my nightstand is currently overflowing with volumes. In the meantime, I opted to watch France’s documentary of the same title, which preceded the book by four years and inspired its publication.

The documentary covers the period from 1987 to 1995. As the film progresses, a counter tallies the cumulative worldwide AIDS-related death toll year by year:

1987: 500,000
1988: 800,000
1989: 1.2 million
1990: 1.7 million
1991: 2.4 million
1992: 3.3 million
1993: 4.7 million
1994: 6.2 million
1995: 8.2 million

It was, as Larry Kramer shouts during an argument with fellow activists, a modern plague. AIDS remains the deadliest pandemic since the 1918 outbreak of the H1N1 “Spanish” flu, which killed up to 5 percent of the world’s population.

How to Survive a Plague tells the story of the trials and triumphs of ACT UP (AIDS Coalition to Unleash Power). In 1987, NYC Mayor Ed Koch decried the group’s “fascist” tactics (e.g., sit-ins, blocking traffic, barricading themselves inside government agencies). By 1990, the group’s disruptive methods had yielded institutional recognition: activists were given a speaking position at the International AIDS Conference in San Francisco and, subsequently, seats on NIH research committees.

The film, composed primarily of archival videotape footage with no voice-over narration and only occasional contemporary interviews, immerses the viewer in the events on screen. We feel the same outrage as the activists when, in 1989, the Catholic Church takes a doctrinaire position against the use of condoms. We feel the same hopelessness when, in 1993, the two most common drugs for treating AIDS are shown to be ineffectual. A prescription for one of the bunk drugs, AZT* (a highly toxic medication that had been abandoned as a cancer treatment due to high fatality rates), had cost as much as $10,000 per year—at the time, the most expensive drug in history. Burroughs Wellcome, the manufacturer of AZT, was pulling in profits of $230 million in the late 1980s.

As events proceed, a regrettably common storyline for minority civil rights organizations begins to take shape: buckling under mounting frustration over setbacks and failures, the group turns its anger inward. In the case of ACT UP, some members felt that the group’s Treatment and Data Committee (T&D) had become too cozy with the FDA, NIH, and pharmaceutical manufacturers. They called for a six-month moratorium on meetings with drug companies, which, to the members of T&D, was tantamount to hastening death. Inevitably, the group split, with T&D becoming TAG (Treatment Action Group).

The film tells a complicated story about how best to bring about change in the face of overwhelming structural opposition. While the civil disobedience that marked the early years of ACT UP brought widespread public recognition to the plight of AIDS victims, it was when the group traded its characteristic black t-shirts and vests for suits and parleys with those in power that demands were translated into action. ACT UP’s activism brought about accelerated FDA approval of new medications, and by winning those infected with the virus a voice in clinical trials and on research committees, it enabled researchers to produce drugs that would halt the replication of HIV by 1996. HIV thus became a chronic condition akin to diabetes: one that could not be cured but could be effectively managed. That such a treatment could be developed in the span of 15 years stands as one of the major victories of modern medical science.

And yet, one can only wonder how many lives might have been saved had organizations like ACT UP not had to force the U.S. government to take action in the first place. Given that Ronald Reagan continues to be lionized with increasing fervor by conservatives with each passing election cycle, it seems unlikely that widespread public opinion will ever pass judgment on the inhumanity he showed to AIDS victims. In the minds of many among the “Moral Majority,” all the right people were dying.

*To read the shameful history of AZT, I recommend Celia Farber’s 1989 exposé for SPIN Magazine.

Review: Black Earth: The Holocaust as History and Warning (2015) by Timothy Snyder

Timothy Snyder’s history of the Holocaust is situated within the framework of his academic specialty: (post)modern Eastern Europe. Snyder’s previous work of popular history, Bloodlands: Europe Between Hitler and Stalin (2010), focused on the hellish reality of Eastern European countries caught between two mass-murdering empires in WWII. Black Earth can be read as an extension of that project, only from the perspective of the 5.5 million Jews murdered by Germany in the stateless zones of Eastern Europe. Perhaps unavoidably, given the tangled history of the states in question, the reader risks getting lost in the minutiae of rival political parties, internal schisms, disputed borders, and ancient ethnic prejudices. Yet despite the expository density, the central tenets of the book are repeated frequently and clearly enough to be condensed easily.

The overwhelming majority of Jews who perished in the Holocaust were murdered in Eastern Europe. Black Earth is a retelling and interpretation of those events, seeking to answer why and how the “heartland of world Jewry” became the staging ground for its extermination. While typical historical analysis has attributed this phenomenon to the fanatical anti-Semitism of Eastern Europeans, Snyder argues that prejudice alone cannot account for why so many people began murdering their neighbors in 1941. Rather, he argues, the true culprit walked away from the war largely unscathed in public opinion and conventional historical narratives: the Soviet Union.

Building his thesis, Snyder first draws a distinction between states that experienced consecutive vs. double occupation. Poland, for example, was subjected to consecutive occupation: invaded by Germany and the USSR from both directions in 1939, and subsequently split between the two aggressors. By contrast, Baltic and Eastern European countries experienced double occupation: first occupied and annexed by the Soviet Union following the Molotov-Ribbentrop Pact of 1939, and then by Germany after Hitler betrayed Stalin and initiated Operation Barbarossa in 1941.

By the time the Germans began their occupation of Eastern Europe, the region had already been devastated by a Soviet campaign of annihilation that included mass murders by the NKVD and large-scale deportations to the gulag. Indeed, the Soviet atrocities of WWII are much more difficult to fathom or scrutinize, given the capriciousness with which the state murdered and imprisoned millions of civilians. When the Germans arrived and repelled the Red Army, they could convincingly present themselves as liberators. With the governments of these ravaged states already toppled by the Soviets, the conditions were ideal for Hitler to enact his Final Solution. It was only in such an expansive stateless zone that the Nazis could translate ideology (the Judeo-Bolshevik myth) into politicized action (the extermination of all Jews—men, women, and children).

Eastern Europeans, anxious for a scapegoat that would obscure their own complicity in the Soviet horrors, gladly bought into the Nazi credo: all Jews are communists and all communists are Jews. This created an arena for performative Nazism, with Eastern Europeans murdering Jews in order to prove themselves to the new regime. Imprisoned Soviet collaborators were told that they could buy their freedom and redeem themselves by killing a single Jew. Greed was another common motivation, as many of the murderers were able to plunder Jewish capital already confiscated by the Soviets. Somewhat less convincingly, Snyder promotes a psychoanalytic interpretation, suggesting that many Eastern Europeans participated in the mass killing of Jews so as to absolve their own conscience of the sins of Soviet collaboration.

Snyder should consider releasing the concluding chapter of Black Earth as a long-form essay. It is in these final pages that he issues the warning of his subtitle, tracing similarities between our own time and the late 1930s. The modern Right has been radicalized, deriving from its anti-government extremism an erroneous belief that freedom comes from the dismantling of state authority. By way of example, Snyder cites the American invasion and toppling of a sovereign government in Iraq—a march of folly marketed as a campaign to bring freedom to the Iraqi people, but one that in actuality unleashed total chaos and laid the foundation for the rise of Islamic fascism. At the other end of the ideological spectrum, the Left has been caught in a rising tide of anarchism, exemplified by the various “Occupy” movements. Both factions now crave the destruction of order. As Snyder eloquently observes: “A common ideological reflex has been postmodernity: a preference for the small over the large, the fragment over the structure, the glimpse over the view, the feeling over the fact.” Snyder writes most ominously about climate change, ecological disasters, and imminent conflicts over land and food. However, in the wake of the 2016 U.S. presidential election, his warnings about Vladimir Putin’s ongoing assaults on the postwar order (then isolated to Eastern Europe and Crimea) feel the most immediately dire. We are undoubtedly entering a new cycle of extremism and global unrest, with challenges that in many ways exceed those of the interwar period. It remains to be seen whether we have learned the lessons of history.

Review: Achieving Our Country: Leftist Thought in Twentieth-Century America (1998) by Richard Rorty

Richard Rorty’s William E. Massey Sr. Lectures in the History of American Civilization have been rediscovered, almost two decades after they were delivered, in light of the election of Donald Trump as the 45th President of the United States. And with good reason. For those of us who watched the 2016 election night results materialize in utter stupefaction and horror, Rorty would counter that the Left has been flippantly ignoring the potential for a Trump presidency since the collapse of the leftist reform movement in the mid-1960s. This will seem counter-intuitive to many, as the 1960s are now ingrained in the cultural consciousness as the apotheosis of the American Left. Rorty, however, draws a distinction between earlier 20th-century iterations of leftist activism (which were bonded with workers’ unions and won political victories that furthered social progress and fair standards of labor) and the cultural politics that became the driving force of the Counter-Culture Revolution, and which dominates leftist discourse to this day. The crucial distinction, in Rorty’s words, is between “agents and spectators.”

In the first lecture, “American National Pride: Whitman and Dewey,” Rorty uses those two eminent American thinkers as counterweights to the pervading modern influence of fatalist Europeans such as Foucault, Heidegger, Lacan, and Derrida. What Rorty sees as a source of inspiration in Whitman and Dewey is their ability to reconcile the potential for American national pride with their secular liberal humanism. Rorty makes what I found to be a provocative argument: that leftists who view the most shameful events of American history, such as the American Indian genocide or the killing of over a million Vietnamese, as irrevocable stains on the United States, events so grave that atonement is impossible, fall into the trap of a metaphysical sinner complex better suited to the followers of St. Augustine. Dewey, by contrast, “wanted Americans to share a civic religion that substituted utopian strivings for claims to theological knowledge” (38). Instead, on the wings of Foucault and Lacan, American leftists have become apocalyptic martyrs and rationalizers of hopelessness.

The second lecture, “The Eclipse of the Reformist Left,” is a retelling through Rorty’s eyes of how the cultural Left, fueled by university unrest, usurped the reformist Left in the mid-60s. He opens with a bold proclamation: Marxism at the end of the 20th century is as morally bankrupt as Roman Catholicism at the end of the 17th century. Rorty proudly proclaims himself an anti-Marxist, and levels language as strong as any right-winger’s against Stalin’s “evil empire” and its insidious global influence. He offers a defense of the Congress for Cultural Freedom (of which his father was a member), which the New York Times exposed as a CIA outfit in 1966. The CCF’s covert funding is still a point of contention for cultural leftists who were molded intellectually in the mid-20th century (I most recently came across an ominous allusion to it in a review of Matthew Spender’s 2015 memoir, in reference to his father, the poet Stephen Spender, editor of the CCF-funded literary magazine Encounter). Rorty agrees with Todd Gitlin that the watershed moment for the splintering of the Left occurred in August 1964, when the Mississippi Freedom Democratic Party was denied seats at the DNC in Atlantic City and Congress passed the Tonkin Gulf Resolution. Young leftists came away with an impression of their country as inherently stained, corrupt, and irreparable, thus bringing an end to the leftist reformism that had defined the so-called Progressive Era. In place of reforms, the New Left called for revolution. This type of thinking is perhaps best illustrated in Rorty’s critique of Christopher Lasch’s 1969 polemic, The Agony of the American Left, a book which “made it easy to stop thinking of oneself as a member of a community, as a citizen with civil responsibilities. For if you turn out to be living in an evil empire (rather than, as you had been told, a democracy fighting an evil empire), then you have no responsibility to your country, you are accountable only to humanity” (66).

For those readers who are most curious about Rorty’s anticipation of Trump, skip ahead to the third and final lecture, “A Cultural Left.” This lecture will be a bitter pill for many young academics to swallow, ensconced as they are in the rhetoric of identity politics (I write this as a recent humanities Ph.D. dropout). Rorty’s diagnosis of the current malaise requires something the cultural Left deplores: nuance. While he praises the advances the New Left has made in curbing the dominance of stigma and sadism (racism, sexism, homophobia, etc.) in American culture, he draws attention to a glaring fact: though we have made much headway toward social equality since the 1960s, economic inequality has increased in tandem. It’s as if the Left lacks the focus to pursue dual initiatives. While recent achievements in socio-cultural progress have been nothing short of heroic, by taking our eyes off the fight for socialist economic reforms we were effectively asleep at the wheel in the 1980s, as the Reaganite neoliberals waged their campaign of annihilation against the meager welfare state built by the reformist Left. By the end of the century, we had also lost the confidence of the voters, and the only Democrats who stood a chance at national election were advocates of lukewarm centrism like Bill Clinton.

So, the reader must ask: if the Left has proven historically unable to pursue multiple objectives simultaneously, which is more important, cultural or economic reform? Rorty, while not dismissing the fight against cultural prejudice, warns that a globalized economy run by an entrepreneurial elite is far more likely to upend the American political system and augur an authoritarian future. For, as he rightly points out, the Progressive Era champions of socialist economic reforms—labor union members, farmers, unskilled workers—undoubtedly included many racists, sexists, and homophobes. Yet by uniting those groups under the banner of shared economic interests, leftists were able to achieve upward social mobility for all Americans. By contrast, in the absence of a strong, politicized movement for workers’ rights and fair wages, the white working class, prone as we all are to tribalism, will inevitably fall under the spell of populism. Enter Trump:

…members of labor unions, and unorganized unskilled workers, will sooner or later realize that their government is not even trying to prevent wages from sinking or to prevent jobs from being exported. Around the same time, they will realize that suburban white-collar workers—themselves desperately afraid of being downsized—are not going to let themselves be taxed to provide social benefits for anyone else.

At that point, something will crack. The nonsuburban electorate will decide that the system has failed and start looking around for a strongman to vote for—someone willing to assure them that, once he is elected, the smug bureaucrats, tricky lawyers, overpaid bond salesmen, and postmodernist professors will no longer be calling the shots. A scenario like that of Sinclair Lewis’ novel It Can’t Happen Here may then be played out. For once such a strongman takes office, nobody can predict what will happen. In 1932, most of the predictions made about what would happen if Hindenburg named Hitler chancellor were wildly optimistic. (90)

The election of Donald Trump is a travesty, though one we should have seen coming. My initial reaction (disgust, hopelessness, defeat) is emblematic of the dangerous sense of resignation that made it possible for the far right to capture every branch of the American government in the first place. Those of us on the Left now have two options: give in to our anguish and become spectators to the dismantling of a century’s worth of progress, or once again become agents of change. We should never forget the more sinister chapters of America’s past, but in recognizing the potential for even darker days ahead, our energy is best spent not in doleful reflection but in working to achieve a country worthier of our pride. As Rorty writes, “we should not let the abstractly described best be the enemy of the better” (105).

Perhaps mercifully, Richard Rorty did not live to see the events he predicted transpire. He finished his career at Stanford University and died in 2007 at the age of 75.