Review: The Downing Street Years (1993) by Margaret Thatcher

In her book All Must Have Prizes, the journalist Melanie Phillips made what I once considered a very strange assertion for a right-winger: that is, essentially, that Margaret Thatcher was not really a conservative. Here it is important to note the lower-case "c," as opposed to the proper noun Conservative Party. Indeed, there are plenty (perhaps even a majority) of Conservatives in the British Parliamentary Party who are not conservatives. But how could any astute political observer reach the conclusion that Mrs. Thatcher was anything but the archest of conservatives, if not the mother of the movement? In fact, after reading The Downing Street Years, I see that Ms. Phillips got it quite right—a point I’ll return to in the conclusion of this (far too lengthy) review.

It must be stated outright that, regardless of one’s own politics, Mrs. Thatcher’s political memoirs do not make for easy or even pleasant reading. The reader is made instantly aware that this is a scientific, analytically oriented brain at work (Mrs. Thatcher read chemistry at Oxford and worked in the lab that invented soft serve ice cream), as from the prologue one is immersed in highly technocratic jargon and a very dry recitation of chronological events. Due to the remarkably lengthy term of Mrs. Thatcher’s premiership (May 1979-November 1990), the book spans almost 900 pages. Precisely recalled statistics, dates, and acronyms pile upon one another so outrageously as to almost seem ironic. Witness:

Geoffrey Howe was able to demonstrate that to reduce the top rate of income tax to 60 per cent (from 83 per cent), the basic rate to 30 per cent (from 33 per cent), and the PSBR to about £8 billion (a figure we felt we could fund and afford) would require an increase in the two rates of VAT of 8 per cent and 12.5 per cent to a unified rate of 15 per cent. (The zero rate on food and other basics would be unchanged.) I was naturally concerned that this large shift from direct to indirect taxation would add about four percentage points onto the Retail Price Index (RPI).

Or, take for example this reflection on selecting a date for the 1983 general election: “Therefore, if we went in June it would have to be the 9th, rather than the 16th or 23rd.” These examples are excruciatingly emblematic. Not surprisingly, there is a four-page list of acronyms and abbreviations attached as an appendix. Compounding the abstruseness of the narrative is a near total lack of pathos, self-reflection, or humor that isn’t biting. Mrs. Thatcher’s only expressed regrets are for times when she should have been even more unyielding, ruthless in battle, and secure in the “rightness” of her positions. There is absolutely nothing to connect with here on an emotional level. The book can be recommended only on the grounds that one is seeking a meticulous—if cold-blooded—synopsis of the major events of the Thatcher era and, by extension, the 1980s as a whole.

Though the reader risks getting lost in the weeds, Mrs. Thatcher’s focus over these many pages can be boiled down to four primary challenges which she saw facing Britain and the world when she assumed office in 1979: long-term economic decline, the debilitating effects of socialism, the growing Soviet threat, and the inexorable trudge toward economic and monetary union (EMU) for the European Community. I will organize my review accordingly.

Mrs. Thatcher vs. “Managed Decline”

I preferred disorderly resistance to decline rather than comfortable accommodation to it.

Only the most partisan and deluded of her critics will deny that the country Mrs. Thatcher inherited in 1979 was in shambles. Her election victory came on the heels of Britain’s “Winter of Discontent”—the winter of 1978-79, when strikes by public sector trade unions brought the country to its knees. But the struggles had begun much earlier. The British economy was chronically ill throughout the 1970s—so bad by 1974 that Foreign Secretary James Callaghan warned of an impending “breakdown of democracy.” Inflation reached a crippling 26.9% in late 1975, leading Harold Wilson’s Labour government to adopt an incomes policy that capped pay increases for public sector workers at government-mandated limits. Sanctions were levied to persuade private companies to follow suit. But while inflation had halved by 1978, Mr. Callaghan (now the PM) and his minority Labour government kept wage increases capped below 5%. The Trades Union Congress (TUC), which had played nice with their Labour allies for three years, finally revolted. When Mr. Callaghan announced that the general election anticipated for September would be postponed until the next year, he set off the largest disruption of British labor since 1926.

What began with strikes by Ford workers and lorry drivers soon grew to encompass a wide cross section of the public sector: railwaymen, nurses, ambulance drivers, waste collectors, gravediggers. Social services ground to a halt as hospitals were staffed to treat only emergency cases. The Army was called up to provide emergency response services. Rubbish accumulated at such volumes that it had to be stored in public parks and squares, attracting vermin. Corpses went unburied while local councils seriously considered mass burials at sea or allowing bereaved family members to dig graves for their own deceased. To add insult to injury, blizzards and the coldest winter in 16 years depressed retail operations and weakened the economy still further. As one head of the British Civil Service reportedly remarked some years earlier, the best the country could hope for now was the “orderly management of decline.”                 

The Conservative Party won a 43-seat majority in May 1979 on a 5.2% swing, the largest since Clement Attlee ousted Winston Churchill in 1945. Mrs. Thatcher became Europe’s first elected female head of government. She came in like a bull in a china shop, asking: “What great cause would have been fought and won under the banner: ‘I stand for consensus’?” Only a character of remarkable self-confidence would seek to pick up the mantle of leadership during such troubled times. When reflecting upon those tumultuous days early in her premiership, Mrs. Thatcher recalls a famous quote by William Pitt the Elder: “My Lord, I am sure I can save this country, and no one else can.” She does not feign modesty: “It would have been presumptuous to have compared myself to Chatham. But if I am honest, I must admit that my exhilaration came from a similar inner conviction.” A reader of American history is reminded of William L. Yancey’s encomium upon the election of Jefferson Davis as President of the Confederacy: “The man and the hour have met.”

Mrs. Thatcher vs. Socialism

To cure the British disease with socialism was like trying to cure leukaemia with leeches.

The historian Kenneth O. Morgan has written that the spectacular fall of Labour (cursed not to regain power for 18 years) and the rise of Mrs. Thatcher “meant the end of an ancien régime, a system of corporatism, Keynesian spending programmes, subsidised welfare, and trade union power.” By 1979, the United Kingdom, like much of Western Europe, had acquiesced to what seemed the inevitable advance toward comprehensive socialism. In Britain’s case, the project began in 1945 when politicians of Left and Right, seeking unanimity and balance as they set about rebuilding their country, acceded to a so-called “post-war consensus” (exemplified by the somewhat satirical term “Butskellism”). Under this general agreement, Conservatives colluded with Labour in a far-reaching program of nationalization, regulation, high taxation, and an oversized welfare state. To fully fathom the insurgent, quasi-revolutionary quality of Mrs. Thatcher’s rise to power, one must appreciate that her election marked the end of a post-war order that had begun some 30 years earlier and was by that point firmly entrenched in the British psyche.

The Thatcher manifesto called for decentralization, deregulation, privatization, busting up unions, curbing inflation via interest rate manipulation and a tight control of the money supply, income tax cuts for top rates, and, perhaps most alarmingly, austerity measures. The latter policy was entirely against the pervading economic logic of the day. It was widely accepted that reducing both expenditures and borrowing during times of recession was a recipe for disaster, but Mrs. Thatcher was characteristically dismissive of “those who had not heard that Keynes was dead.” The PM was an avowed acolyte of Milton Friedman, and her government was betting on the maturity of the British public: things would have to get worse before they could get better. The overarching economic goal of the first Thatcher Parliament can be fairly succinctly stated (though it is a bit of a tongue twister): reduce the deficit, which in turn will reduce inflation and thus mitigate the need to fund future deficits via inflation.    

The economic record of Thatcherism is mixed and still a matter of fraught debate. That said, I discern four key areas where hard facts may be gleaned. [This is the long version; for a quick summation, skip to the next paragraph.] First, Mrs. Thatcher’s budget measures proved remarkably effective at bringing down inflation (it ticked up again following the economic boom of the late 80s—though in The Downing Street Years, she places practically all of the blame on Chancellor Nigel Lawson’s decision to abandon monetarism for shadowing the exchange rate with the Deutschmark). Second, unemployment grew much worse during the years of recession, exceeding 3 million in 1983. It experienced a gradual decline with the boom years beginning in 1987, but never recovered to pre-1980 levels. Third, GDP plummeted into negative territory during 1980-81 while the Government pursued its stringent anti-inflationary policies. Recovery began in 1982 and growth picked up in earnest in 1985 as the boom accelerated. Fourth, interest rates were raised to an absurd high of 17% in 1979, but the desired effect was accomplished: inflation began a steady decline.

Moral of the story: Mrs. Thatcher’s budgetary measures brought down inflation to a steady 4-5% throughout most of her premiership. Those same policies led to a fairly consistent unemployment rate that hovered around 10%. If you were middle class, upwardly mobile, and primarily concerned with matters of consumerism, then her policies dramatically improved your quality of life over the socialist codes of Labour. If you were lower class, employed in state-supported industries, or historically dependent upon the welfare system, Thatcherism was a harsh pill to swallow.

The aggressive record of privatization is almost mind-boggling, and on this score it’s difficult to find fault. Consider that all of these firms were at least partially owned by the state until Mrs. Thatcher sold them off: British Aerospace, British Petroleum, British Steel, Britoil, British Gas, British Leyland (which included Jaguar and Land Rover), and Rolls Royce. Many of these companies have gone on to become very successful private firms. Moreover, their privatization freed up government money that otherwise would have necessitated further spending cuts or taxation, and taxpayers were no longer burdened with propping up failing or wasteful industries.

Mrs. Thatcher’s great battle with the National Union of Mineworkers (NUM) in 1984 is now the stuff of legend. She writes that “history intertwined with myth seemed to have made coal mining in Britain a special case: it had become an industry where reason simply did not apply.” She was unwilling to relent on coal pit closures on economic grounds, as indeed the Labour government had closed 32 pits between 1974 and 1979. Suffice it to say that a similar strike had toppled the Conservative government of Ted Heath some 10 years earlier. By contrast, the miners gradually returned to work in 1985 having won no concessions from the Thatcher government. The NUM was permanently hobbled, uneconomic pits were closed, and Mrs. Thatcher won over the greater part of public opinion. As the years progressed (due largely to the continued efforts of Norman Tebbit), the TUC was no longer in a position to cripple industry or public services with strikes. And, despite the rhetoric which sought to cast Labour and the unions as noble defenders of the working man, many of the Thatcher government’s reforms were undeniably beneficial to workers’ rights—e.g., state-subsidized mail ballots, which prevented union leaders from intimidating workers into supporting strikes via public votes.

Mrs. Thatcher’s great gamble—and the one time when her thinking was not reflective of the majority of the British public—was her attempt to dispense with property taxes used to fund local government and replace them with a so-called “community charge” (which in the popular vernacular quickly became known as the “poll tax”). This was a fairly shameful act of hubris, whereby the Thatcher government called on every adult to pay a flat-rate per capita tax, irrespective of property value or personal wealth. Mrs. Thatcher laments in her memoirs that she did not take adequate measures to prevent Labour councils from hiking up the charge in order to damage the government, and she blames these councils for the resulting fiasco. Indeed, here she concedes to a perverse undermining of her own guiding principle: stronger powers should have been allotted to central government. In reality, the policy itself was patently absurd and strongly suggests that by the later years of her premiership Mrs. Thatcher’s ego was turning her into a real version of the caricature her enemies had always painted. In her memoirs she claims to regret that the policy meant that “the very people who had always looked to me for protection from exploitation by the socialist state were those who were suffering most.” At the time, however, she refused to relent until it was already too late. The public outcry bridged political ideologies, and the fears of Conservative backbenchers and her cabinet members alike that a public rebuke at the polls was brewing proved the underlying force behind Mrs. Thatcher’s ouster in late 1990. Her successor, John Major, announced the abolition of the community charge the following year and the Conservatives won their fourth consecutive general election in 1992.

Mrs. Thatcher vs. Communism

All my reading and experience has taught me that once the state plays fast and loose with economic freedom, political freedom risks being the next casualty.

The Soviet aim, only thinly disguised, right up until the time when a united Germany remained in NATO, was to drive a wedge between America and her European allies. I always regarded it as one of Britain’s most important roles to see that such a strategy failed.

By the time Mrs. Thatcher came to power, the Cold War was more than 30 years old and Europe had experienced a shift in public opinion. In the decades of peace which followed the end of the Second World War, Western Europe had grown prosperous and complacent. American-financed security had allowed for wide-scale investment of public monies in the development of generous welfare states, which naturally entrenched the reflexive bent toward socialism and the Leftist worldview. Though heads of government recognized the need for (and in many cases covertly supported the preservation of) American military patronage, the publics were under the sway of a programmatic anti-Americanism, and the zeitgeist called for the disentangling of Western European affairs from American foreign policy and perceived warmongering.

Mrs. Thatcher attributed much of this rhetoric to the success of Soviet propaganda. As an unabashed anti-communist, she was by the 1980s an odd woman out. While the rest of Western Europe distanced itself from the United States, she sought to forge closer ties in the trans-Atlantic partnership. To that end, she was aided by her much-lauded friendship with Ronald Reagan. At a time when even many American conservatives were calling for unilateral disarmament, Mrs. Thatcher was a staunch proponent of the Reagan Administration’s aggressive military posture against the Soviet Union, and she welcomed a larger, state-of-the-art American nuclear arsenal stationed in Europe.

[This section under construction: More to come soon…]   

Mrs. Thatcher vs. the European Community

I had said at the beginning of the government ‘give me six strong men and true, and I will get through.’ Very rarely did I have as many as six.

As an evangelist for free markets and trade, Mrs. Thatcher was a natural proponent of Britain’s entry into the European Economic Community (EEC) in 1973. It was only when federalists like Jacques Delors seized control and began pushing the Community toward EMU that Mrs. Thatcher became the most outspoken of Euroskeptics. Of course, she had gotten off to a rough start in European relations: at her first EEC conference in 1979 she had obstinately demanded (and eventually won) a rebate for Britain’s excessive contributions to the Community’s budget. But she was nevertheless committed to the project on economic terms. It was not until she discerned the early trappings of the European Union that she turned outright hostile, and by then it was too late.

Mrs. Thatcher’s vociferous opposition to EMU and her Cassandra-like warnings that Germany was effectively conquering Europe via economic Anschluss might be dismissed as the ravings of a xenophobe whose anti-German prejudices were forged in the fires of her WWII-era girlhood.* But this misses the point entirely. She was just as conscious that EMU would be ruinous for the Germans. What Mrs. Thatcher instinctively perceived (and amazingly she was one of the few who did) was the bipolar makeup of Europe: the debt-prone countries of the South (she prophetically homes in on Greece in particular) would inevitably find themselves debtors to the wealthy North—and, due to its large population and industry (what Mrs. Thatcher calls its “preponderance”), Germany in particular. Likewise, with all of Europe shackled to a single currency, profligate governments could not be allowed to fail. Thus it would fall again and again to German taxpayers, whose prosperity had been won through the diligence of their work-oriented culture, to squander their hard-earned wealth on propping up degenerate socialist economies. Surely not even the most fervent Europhiles or anti-Thatcherites would now deny the soundness of this analysis, or indeed that so much of what she predicted has come to pass.

And yet it was the members of her own Parliamentary Party—and her cabinet especially—who pressed for greater European integration. Why was this the case? Mrs. Thatcher lays much of the blame on the Foreign Office. Across Western Europe, foreign ministers relished the prospects of greater power and prestige for their own offices should EMU come to pass. An integrated European economy would lead to increased political clout, allowing European politicians to rival their American counterparts in global influence. According to Mrs. Thatcher, as Foreign Secretary, “Geoffrey [Howe] harbored an almost romantic longing for Britain to become part of some grandiose European consensus.” If this meant surrendering some of the sovereignty of national parliaments, none of these men seemed to care very much.

Mrs. Thatcher recalls her battles with (and the double dealings of) Howe and Lawson in particular—each of whom was clamorously calling for Britain’s entry into the European Exchange Rate Mechanism (ERM). She admits that her use of vague language (always maintaining in public statements and cabinet meetings that Britain would join “when the time was ripe”) was a stalling tactic. By the late 1980s, she had become convinced that Britain should never join, but she knew that hers was a minority position within her own government, and that she couldn’t hold off the pressure forever. Tellingly, Mrs. Thatcher writes about these events in military terms—e.g., “the assault at Madrid.” Isolated and under fire, she capitulated and the pound sterling entered the ERM in October 1990, one month before Mrs. Thatcher’s own downfall.

Mrs. Thatcher’s political career was mortally wounded when both Lawson and Howe resigned, ostensibly over the ERM debate. She suggests that Lawson, who favored the ERM but was equally opposed to EMU, quit the Treasury because he knew he’d wrecked the economy with soaring inflation by subverting her own traditional economic strategy, which called for interest rate manipulation and monetarism. He left the mess for someone else to clean up. Howe, by contrast, was a zealous Europhile, but his resignation was just as much predicated upon the personal enmity that had developed between the two of them over 11 years of close partnership. Howe addressed his resignation in a speech before the House that was a carefully crafted exercise in character assassination. If he had hoped to bolster his own image, history has not been kind. As Mrs. Thatcher sums up the spectacle:

…Geoffrey Howe from this point on would be remembered not for his staunchness as Chancellor, nor for his skillful diplomacy as Foreign Secretary, but for this final act of bile and treachery. The very brilliance with which he wielded the dagger ensured that the character he assassinated was in the end his own.

As her troubles mounted, Mrs. Thatcher was challenged for leadership of the Conservative Party by an old foe: Michael Heseltine. She won the first ballot but fell just four votes short of the required margin of 15% over her challenger. She announced her intention to stand for the second ballot, but one by one her cabinet ministers turned against her. The last chapter of the book has the quality of a Greek tragedy as Mrs. Thatcher and her lone band of supporters (foremost among them Tebbit and Michael Portillo) frantically beseech old friends and colleagues for support, while all the while it becomes increasingly clear that the PM has been abandoned and her days are numbered. John Wakeham was incapable even of putting together a campaign team to support her in the second ballot. Mrs. Thatcher knew it was the end, writing that in the immediacy of the moment it was not so much her fall from power as the duplicity of the men whom she had lifted up with her that stung the most:

I was sick at heart. I could have resisted the opposition of opponents and potential rivals and even respected them for it; but what grieved me was the desertion of those I had always considered friends and allies and the weasel words whereby they had transmuted their betrayal into frank advice and concern for my fate.   

Mrs. Thatcher resigned before the second ballot and Major was elected Leader, thus becoming her successor as Prime Minister. He secured a fourth consecutive general election victory for the Conservatives in April 1992, albeit with a reduced majority. Five months later, unable to keep the pound from depreciating below its agreed limit, he was forced to withdraw sterling from the ERM. The Treasury estimated the cost at more than £3 billion, while trading losses were estimated at £800 million. The economy entered recession and the housing market crashed. However, now free of the ERM, the Major government returned to the old standards of Thatcherism and, in large measure, righted the ship via inflation targeting. The economy was much improved by 1997 when Tony Blair and his “New” Labour Party swept into No. 10 in a landslide.

*However, Mrs. Thatcher’s opposition to German reunification following the fall of the Berlin Wall does betray a fundamental (if historically practical) Germanophobia.

In Conclusion

In all this, my problem was simple. There was a revolution still to be made, but too few revolutionaries.

To return to the question that opened my review: Was Mrs. Thatcher a conservative? Here is the definition of “conservative,” courtesy of Merriam-Webster:

a : tending or disposed to maintaining existing views, conditions, or institutions

b : marked by moderation or caution

Mrs. Thatcher was neither of these things. In fact, she personified the antonym of conservative: radical. Specifically, a radical liberal of the classical, Adam Smith variety. She writes as much when she states that “the one thing you never get from parties which deliberately seek the middle way between left and right is new ideas and radical initiatives. We were the mould-breakers, they the mould.”  

In All Must Have Prizes, Ms. Phillips writes: “The Conservative government under Margaret Thatcher…institutionalized through its political program the no blame, no shame, no pain society. And in the process, it helped the disintegration of British culture itself.” It’s hard to argue against the notion that Mrs. Thatcher possessed a very limited idea of how the world worked: there was right and wrong, good and evil. This simplicity of focus could be a quality of great leadership—e.g., emboldening her decisiveness in times of war. But in more immediate terms it meant that she saw practically everything in terms of economics. She ran the economy of a G-7 nation on the same logic as her father’s grocery, ever conscious that good business meant turning a small profit and keeping the books balanced. But even her staunchest defenders must concede that her rise to power coincided with widespread cultural unrest and growing discord, and at the root were problems which had festered for generations and could not be corrected by market forces. As the more socially conservative Ms. Phillips would have it, Mrs. Thatcher’s premiership further eroded the already fraying bonds of trust and fellowship in British culture, and did nothing to stem the advance of secularism. Another conservative intellectual, Peter Hitchens, has targeted Mrs. Thatcher’s record on similar grounds, calling her a “noble failure” and criticizing specifically the carelessness with which she unleashed the markets, blinded in her fanaticism to their destructive effects on communities, culture, and other such intangibles.

If one accepts these critiques, then Mrs. Thatcher’s policies brought about a fearful symmetry between Right and Left: advancing the preeminence of the Individual over historically transmitted obligations to community and shared values. If either the market or the state is to prevail, then all forms of inherited tradition—family, education, morality—and external authority must be done away with. In both systems, it is a selfish, institutionalized individualism that reigns supreme. In Mrs. Thatcher’s society the shibboleth is “I want”; in that of her socialist adversaries: “I feel.” Both are empires of the first person singular.   

[This section under construction: More to come soon…]

Epilogue: The Feminine Mystique

My experience is that a number of the men I have dealt with in politics demonstrate precisely those characteristics which they attribute to women — vanity and an inability to make tough decisions. There are also certain kinds of men who simply cannot abide working for a woman. They are quite prepared to make every allowance for ‘the weaker sex’: but if a woman asks no special privileges and expects to be judged solely by what she is and does, this is found gravely and unforgivably disorienting. Of course, in the eyes of the ‘wet’ Tory establishment I was not only a woman, but ‘that woman’, someone not just of a different sex, but of a different class, a person with an alarming conviction that the values and virtues of middle England should be brought to bear on the problems which the establishment consensus had created. I offended on many counts.

[This section under construction: More to come soon…]

Review: Fates and Furies (2015) by Lauren Groff

Despite the considerable critical praise, I approached Lauren Groff’s third novel with some trepidation. I felt exhausted from the outset by the prospect of another study of the vicissitudes of upper-middle-class Manhattanite matrimony. I was mistaken. Rather than wallowing in the banal realism that would characterize a lesser work of fiction (characters moving from optimism to set-backs, parenthood to infidelity and divorce, somber old age), Fates and Furies is a far darker narrative, made especially ominous by parenthetical voices that interrupt the narrative in spasms to offer brusque commentary, shed light on interior motives, and cast fortunes for our two protagonists, Lancelot (“Lotto”) and Mathilde Satterwhite.

These cold interpolations belong to the ancient narrators invoked in the novel’s title: the Moirai (Fates) and the Erinyes (Furies). They recite chilly morals: “Grief is for the strong, who use it as fuel for burning.” They are callous, as when reflecting on fireworks at a Fourth of July party: “Doomed people celebrate peace with sky bombs.” Their omniscience means that nothing is off limits to the reader, in one instance extending even into the mind of a cat. Most brilliant is a scene in which Lotto, Mathilde, and a chorus of family and friends sing carols round a Christmas tree, catching the attention through their window of a passerby from the street, who will hold onto the image for the rest of his life: “All those years, the singers in the soft light in the basement apartment crystallized in his mind, became the very idea of what happiness should look like.”

The novel is structured as two books. Fates tells the story of the husband Lotto, his childhood, teen years, and his married life with Mathilde. Furies resets the plot and reexamines many of the same events from Mathilde’s perspective, while also filling in the gaps of her own childhood. Lotto is an innocent narcissist, self-defeating but favored by the Fates and everyone who meets him. In her own narrative, Mathilde, whose life was shaped by a fatal event in early childhood, unsheathes the fury at her core and shocks the reader.

For all of its stylistic tricks, Fates and Furies never feels gimmicky. The prose is florid but not self-indulgent. The abundant references to canonical Western literature are employed meaningfully and not as pretentious accouterments. The writing does not strain for eloquence or profound ontological insight; these come naturally. While the novel is grounded in traditions dating back to antiquity, it still manages to say something new and revelatory about the meaning of human existence. In so doing, Groff performs two subversive feats. First, the narrative challenges the primacy of free will in the Judeo-Christian worldview. Agency is axiomatic in modern thinking, but the Ancient Greek cosmology was ordered by a strong theological determinism—after all, even the gods of Olympus could not unspool the weaving of the Fates. Second, Mathilde is emblematic of what might be considered a new trend in fiction: the literature of female rage. Elena and Lila, heroines of Elena Ferrante’s Neapolitan Novels, belong to this same class. In these worlds (almost always acutely domestic), women suffer many indignities and are rarely recognized for their abilities, but their yielding exteriors mask an almost animalistic will to survive. Self-immolation is not in the cards. Mathilde would sooner poison Monsieur Bovary or push Karenin in front of the train.

Review: How to Survive a Plague (2012), Dir. David France

 

In November 2016, Knopf published David France’s How to Survive a Plague: The Inside Story of How Citizens and Science Tamed AIDS. While I certainly intend to read the book, my nightstand is currently overflowing with volumes. In the meantime, I opted to watch France’s documentary by the same title, which preceded the book by four years and inspired its publication.

The documentary covers the period from 1987 to 1995. As we progress in time, a counter records the number of worldwide AIDS-related deaths from year to year:

1987: 500,000
1988: 800,000
1989: 1.2 million
1990: 1.7 million
1991: 2.4 million
1992: 3.3 million
1993: 4.7 million
1994: 6.2 million
1995: 8.2 million

It was, as Larry Kramer shouts during an argument with fellow activists, a modern plague. AIDS remains the deadliest pandemic since the 1918 outbreak of the H1N1 “Spanish” flu, which killed up to 5 percent of the world’s population.

How to Survive a Plague tells the story of the trials and triumphs of ACT UP (AIDS Coalition to Unleash Power). In 1987, NYC Mayor Ed Koch decried the group’s “fascist” tactics (e.g., sit-ins, blocking traffic, barricading themselves at government agencies). By 1989, the group’s disruptive methods had yielded institutional recognition: ACT UP was given a speaking position at the International AIDS Conference in San Francisco and subsequently allowed to sit on NIH research committees.

The film, which is composed primarily of archival VHS footage with no voice-overs and only occasional contemporary interviews, immerses the viewer in the events on screen. We feel the same outrage as activists when in 1989 the Catholic Church takes a doctrinaire position against the use of condoms. We feel the same sense of hopelessness when the two most common drugs for treating AIDS are proved ineffectual in 1993. A prescription for one of the bunk drugs, AZT* (a highly toxic medication that had been abandoned as a cancer treatment due to high fatality rates), had cost as much as $10,000 per year—the most expensive drug in history. Burroughs Wellcome, the manufacturer of AZT, was pulling in profits of $230 million in the late 80s.

As events proceed, a regrettably common storyline for minority civil rights organizations begins to take shape: buckling under mounting frustrations over setbacks and failures, the group turns its anger inward. In the case of ACT UP, some members felt that the group’s Treatment and Data Committee (T&D) had become too cozy with the FDA, NIH, and pharmaceutical manufacturers. They called for a six-month moratorium on meetings with drug companies, which, to the members of T&D, was tantamount to quickening death. Inevitably, the group split, with T&D becoming TAG (Treatment Action Group).

The film tells a complicated story about how best to bring about change in the face of overwhelming structural opposition. While the civil disobedience that marked the early years of ACT UP brought widespread public recognition to the plight of AIDS victims, it was when the group traded in their characteristic black t-shirts and vests for suits and parleys with those in power that demands were translated into action. ACT UP’s activism brought about accelerated FDA approval of new medications, and by giving those infected with the virus a voice in clinical trials and on research committees, it helped researchers produce drugs that would halt the replication of HIV by 1996. HIV thus became a chronic condition akin to diabetes, one that could not be cured but could be effectively managed. That such a treatment could be developed in the span of 15 years stands as one of the major victories of modern medical science.

And yet, one can only wonder how many lives might have been saved had organizations like ACT UP not had to force the U.S. government to take action in the first place. Given that Ronald Reagan continues to be lionized with increasing fervor by conservatives with each passing election cycle, it seems unlikely that widespread public opinion will ever pass judgment on the inhumanity he showed to AIDS victims. In the minds of many among the “Moral Majority,” all the right people were dying.

*To read the shameful history of AZT, I recommend Celia Farber’s 1989 exposé for SPIN Magazine.

Review: Black Earth: The Holocaust as History and Warning (2015) by Timothy Snyder

Timothy Snyder’s history of the Holocaust is situated within the framework of his academic speciality: (post)modern Eastern Europe. Snyder’s previous work of popular history, Bloodlands: Europe Between Hitler and Stalin (2010), focused on the hellish reality of Eastern European countries caught between two mass murdering empires in WWII. Black Earth can be read as an extension of that project, only from the perspective of the 5.5 million Jews murdered by Germany in the stateless zones of Eastern Europe. Perhaps unavoidably when one considers the tangled history of the states in question, the reader risks getting lost in the minutiae of rival political parties, internal schisms, disputed borders, and ancient ethnic prejudices. Yet despite the expository density, the central tenets of the book are repeated frequently and clearly enough to be condensed easily.

The overwhelming majority of Jews who perished in the Holocaust were murdered in Eastern Europe. Black Earth is a retelling and interpretation of those events, seeking to answer why and how the “heartland of world Jewry” became the staging ground for its extermination. While typical historical analysis has attributed this phenomenon to the fanatical anti-Semitism of Eastern Europeans, Snyder argues that prejudice alone cannot account for why so many people began murdering their neighbors in 1941. Rather, he argues, the true culprit walked away from the war largely unscathed in public opinion and conventional historical narratives: the Soviet Union.

Building his thesis, Snyder first draws a distinction between states that experienced consecutive vs. double occupation. Poland, for example, was subjected to consecutive occupation: invaded by Germany and the USSR from both directions in 1939, and subsequently split between the two aggressors. By contrast, Baltic and Eastern European countries experienced double occupation: first occupied and annexed by the Soviet Union following the Molotov-Ribbentrop Pact of 1939, and then by Germany after Hitler betrayed Stalin and initiated Operation Barbarossa in 1941.

By the time the Germans began their occupation of Eastern Europe, the region had already been devastated by a Soviet campaign of annihilation that included mass murders by the NKVD and large-scale deportations to gulag camps. Indeed, the Soviet atrocities of WWII are much more difficult to fathom or scrutinize, given the capricious nature by which the state murdered and imprisoned millions of civilians. When the Germans arrived and repelled the Red Army, they could convincingly present themselves as liberators. The governments of these ravaged states having already been toppled by the Soviets, the conditions were ideal for Hitler to enact his Final Solution. It was only in such an expansive stateless zone that the Nazis could translate ideology (the Judeo-Bolshevik Myth) into politicized action (the extermination of all Jews—men, women, and children).

Eastern Europeans, anxious for a scapegoat that would obscure their own complicity in the Soviet horrors, gladly bought into the Nazi credo: all Jews are communists and all communists are Jews. This created an arena for performative Nazism, with Eastern Europeans murdering Jews in order to prove themselves to the new regime. Imprisoned Soviet collaborators were told that they could buy their freedom and redeem themselves by killing a single Jew. Greed was another common motivation, as many of the murderers were able to plunder Jewish capital already confiscated by the Soviets. Somewhat less convincingly, Snyder promotes a psychoanalytic interpretation, suggesting that many Eastern Europeans participated in the mass killing of Jews so as to absolve their own conscience of the sins of Soviet collaboration.

Snyder should consider releasing the concluding chapter of Black Earth as a long-form essay. It is in these final pages that he issues the warning of his subtitle, parsing out similarities between our own time and the late 1930s. The modern Right has been radicalized, deriving from its anti-government extremism an erroneous belief that freedom comes from the dismantling of state authority. By way of example, Snyder cites the American invasion and toppling of a sovereign government in Iraq—a march of folly marketed as a campaign to bring freedom to the Iraqi people, but in actuality unleashing total chaos and laying the foundation for the rise of Islamic fascism. Across the ideological spectrum, the Left has been caught in a rising tide of anarchism, exemplified by the various “Occupy” movements. Both factions now crave the destruction of order. As Snyder eloquently observes: “A common ideological reflex has been postmodernity: a preference for the small over the large, the fragment over the structure, the glimpse over the view, the feeling over the fact.” Snyder writes most ominously about climate change, ecological disasters, and imminent conflicts over land and food. However, in the wake of the 2016 U.S. presidential election, his warnings about Vladimir Putin’s ongoing assaults on the postwar order (then isolated to Western Europe and Crimea) feel the most immediately dire. We are undoubtedly entering a new cycle of extremism and global unrest, with challenges that in many ways exceed those of the interwar period. It remains to be seen whether or not we have learned the lessons of history.

Review: Alexis (1929) by Marguerite Yourcenar (Tr. Walter Kaiser)

Suffering turns us into egotists, for it absorbs us completely: it is later, in the form of memory, that it teaches us compassion.

Alexis, or the Treatise of Vain Struggle, published in 1929, was Marguerite Yourcenar’s first novel. The book was not published in English translation until 1984, well after the success of Memoirs of Hadrian. It seems safe to assume this work is now largely forgotten, with copies being produced by FSG on demand (the last page indicates that my copy was printed on 29 November 2016—the date I ordered it).

In more accurate terms, the book is an epistolary novella, taking the form of an extended letter written by the title character to his estranged wife, Monique. The letter is a confession in which Alexis recounts his lifelong struggle to accept his homosexuality. The internal world that Alexis describes is one of fundamental binary oppositions (instinct vs intellect, pleasure vs. suffering, body vs. soul), making the novel an optimal text for structuralist analysis.

One fundamental question that preoccupies Alexis continues to fuel most modern debates on homosexuality: are his urges the result of nature or “external influences”? In support of the latter, he notes an unhealthy attachment to his mother and sisters in adolescence, which led to a “veneration” of women that made it impossible to love them. These conditions, he writes, certainly influenced his temptations, “yet I see clearly that one should always go back to much more intimate explanations, much more obscure ones, which we understand imperfectly because they are hidden within us.” Thus Alexis interprets his instincts as conditioned behavior, compounded by subconscious forces. In her 1963 preface, Yourcenar writes that she no longer remembers whether or not she subscribed to Alexis’s environmental view of homosexuality at the time of writing, but that she has since rejected any interpretation that situates sexual orientation within the “psychological methodology of our age.”

Any reader shocked that a book of such forthright (albeit abstract) inquiry into the nature of same-sex attraction could be published in 1929 should recall that André Gide published the last of his Corydon dialogues in 1920 (the subtitle of Alexis alludes to an earlier lyrical work of Gide, La tentative amoureuse, ou le traité du vain désir [The Attempt at Love, or the Treatise of Vain Desire]). By the interwar period, homosexuality was a topic of open conversation and debate, and Yourcenar was writing in the freest atmosphere for exploring such questions since the capitulation of the Greco-Roman world to monotheism. Yourcenar writes that not only was the subject “in the air at the moment,” but that it was also in the “fabric of a life”—that is, her own life. Here one draws a comparison with the British historical novelist Mary Renault. Not only did both women feel the allure of antiquity, but each shared her life with a female partner, and each wrote a modernist novel (in Renault’s case, The Charioteer) exploring homosexuality through the eyes of a male subject.

Yourcenar herself described the prose of Alexis as “decanted language.” This austere delivery, which so perfectly channeled the voice of a fading Emperor Hadrian, feels slightly overdone in the case of Alexis. That said, the asceticism of the novel is in keeping with the essence of Yourcenar’s oeuvre: a world of melancholy stoicism in which characters express themselves not through dialog, but intense introspection. With subsequent readings, this characteristic formality does not impede emotional response, but rather enhances it. We find in Yourcenar’s philosophic prose reverence for the durability of corporeal man in the face of spiritual despair. Alexis, for his part, embraces the lessons of suffering, and praises his body, “which cured me from having a soul.”

Review: The Glass Menagerie (1945) by Tennessee Williams

The play is memory. Being a memory play, it is dimly lighted, it is sentimental, it is not realistic. In memory everything seems to happen to music. That explains the fiddle in the wings.

In a couple of months, I’ll be spending the night of my 30th birthday at a preview performance of Broadway’s seventh revival of Tennessee Williams’s The Glass Menagerie. Sam Gold will direct Sally Field as Amanda Wingfield. This production opens just three years after the most recent revival closed on the afternoon of my birthday in 2014. In anticipation of the upcoming performance, I decided to revisit the reading text for the play that made Tennessee Williams a household name.

The Glass Menagerie is, in Williams’s own terminology, a “memory play.” Each scene constitutes a murky flashback conjured from the mind of our narrator, Tom Wingfield, as he recalls the final weeks before he abandoned his dependent mother and sister in their St. Louis tenement in 1937. The mother, Amanda, is an aging southern belle who, like her son, is caught in a vise of memories and regrets. She obsessively rehashes the glory days of her debutante youth when she could attract seventeen gentlemen callers in a single day, and hopes to redeem her own degradation by securing the future of her daughter, Laura. In contrast to the two Wingfield pugilists who rage at life’s inequities (and one another), Laura is a quiet girl, painfully shy and overcome with insecurities due to a slight limp, the result of one leg being shorter than the other. She leads a hermetic life, devoting all her attention to a cherished collection of glass figurines. The aspiring poet Tom supports the family with a dead-end job at a shoe warehouse, scribbling verses on box lids by day and escaping every night into a world of adventure at the movies. A large photograph of the Wingfield patriarch, “a telephone man who fell in love with long distances” and skipped town, observes their misery from above the mantel.

After an explosive argument between mother and son in Scene 3, Amanda makes a plea to Tom: find a nice, “clean-living” coworker to introduce to Laura. Tom relents and brings home Jim O’Connor, the fabled Gentleman Caller. Jim’s appearance in Scene 6 shocks the reader out of the bottled up, self-fabricated world of the Wingfields—characters whose lives form what Robert Bray calls a “triangle of quiet desperation.” Jim is an agent of insurgent banality. The reader is recalled to reality by the plainness of his speech and unaffected manner. Jim’s enthusiasm for life stands in stark contrast to the fatalism of the Wingfields, and his faith in technology and progress contradicts the family’s self-imposed displacement from time. Tom’s opening monologue confirms Jim’s disruptive normality in the hallucinatory universe of the play’s triad: “He is the most realistic character in the play, being an emissary from a world of reality that we were somehow set apart from.”

What follows in Scene 7, as Laura, left alone with Jim by candlelight, is drawn out of her anxious mistrust of the world and blossoms before our eyes, only to have her hopes crushed, is among the most tragic scenes in theater. What reader, regardless of his familiarity with the plot, can help but feel some spontaneous elation at the breaking of the glass unicorn—only to realize its symbolism is not liberation, but final defeat? Moreover, if the play is Tom’s memory, and given that Tom is not present during the intimate meeting between Laura and Jim, then the substance of the encounter is suspect. Perhaps in reality Laura passed the evening alone, too afraid to join the dinner party and never experiencing even a momentary glimpse of happiness. I’m not sure which is more heartbreaking.

Tennessee Williams once said that his only influences were Chekhov, D. H. Lawrence, and his own life. The Glass Menagerie, more than any other of his works, draws extensively on autobiographical elements. Williams’s real name was Thomas, thus Tom (whose fiery confrontation with Amanda in Scene 3 is precipitated by her disposing of his copy of Lawrence, which she deems filth). Williams’s schizophrenic sister Rose was the inspiration for Laura, whose high school nickname was “Blue Roses.” In 1943, the year before the play premiered, Rose was lobotomized and spent the rest of her life in an institution. In light of this personal tragedy, one better understands the 1947 essay that Williams wrote for the New York Times on “The Catastrophe of Success,” which serves as the epilogue to the New Directions reading text. While the wild success of The Glass Menagerie brought him fame and money, Williams found the security of wealth “a kind of death,” and determined that privation led to “compassion and moral conviction,” and was thus the foundation of authentic artistic expression. Appropriately, Williams turned the success of a play about wasted time into a warning: “…time is short and it doesn’t return again. It is slipping away while I write this and while you read it, and the monosyllable of the clock is Loss, loss, loss, unless you devote your heart to its opposition.”

Review: Achieving Our Country: Leftist Thought in Twentieth-Century America (1999) by Richard Rorty

Richard Rorty’s William E. Massey Sr. Lectures in the History of American Civilization have been rediscovered, almost two decades after they were delivered, in light of the election of Donald Trump as the 45th President of the United States. And with good reason. For those of us who watched the 2016 election night results materialize in utter stupefaction and horror, Rorty would counter that the Left has been flippantly ignoring the potential for a Trump presidency since the collapse of the leftist reform movement in the mid-1960s. This will seem counter-intuitive to many, as the 1960s are now ingrained in the cultural consciousness as the apotheosis of the American Left. Rorty, however, draws a distinction between earlier 20th century iterations of leftist activism (which were bonded with workers’ unions and won political victories that furthered social progress and fair standards of labor) and the cultural politics that became the driving force of the Counter-Culture Revolution, and which dominates leftist discourse to this day. The crucial distinction, in Rorty’s words, is between “agents and spectators.”

In the first lecture, “American National Pride: Whitman and Dewey,” Rorty uses those two eminent American philosophers as counterweights to the pervading modern influence of fatalist Europeans such as Foucault, Heidegger, Lacan, and Derrida. What Rorty sees as a source of inspiration in Whitman and Dewey is their ability to reconcile the potential for American national pride with their secular liberal humanism. Rorty makes what I found to be a provocative argument: that leftists who view the most shameful events of American history, such as the American Indian genocide or the killing of over a million Vietnamese, as irrevocable stains on the United States, events so grave that atonement is impossible, fall into the trap of a metaphysical sinner complex better suited to the followers of St. Augustine. Dewey, by contrast, “wanted Americans to share a civic religion that substituted utopian strivings for claims to theological knowledge” (38). Instead, on the wings of Foucault and Lacan, American leftists have become apocalyptic martyrs and the rationalizers of hopelessness.

The second lecture, “The Eclipse of the Reformist Left,” is a retelling through Rorty’s eyes of how the cultural Left, fueled by university unrest, usurped the reformist Left in the mid-60s. He opens with a bold proclamation: Marxism at the end of the 20th century is as morally bankrupt as Roman Catholicism at the end of the 17th century. Rorty proudly proclaims himself an anti-Marxist, and levels language as strong as any right-winger’s against Stalin’s “evil empire” and its insidious global influence. He offers a defense of the Congress for Cultural Freedom (of which his father was a member), which the New York Times exposed as a CIA outfit in 1966. The CCF’s covert funding is still a point of contention for cultural leftists who were molded intellectually in the mid-20th century (I most recently came across an ominous allusion in a review of Matthew Spender’s 2015 memoir, in reference to his father the poet Stephen Spender, editor of the CCF-funded literary magazine Encounter). Rorty agrees with Todd Gitlin that the watershed moment for the splintering of the Left occurred in August 1964, when the Mississippi Freedom Democratic Party was denied seats at the DNC in Atlantic City, and Congress passed the Tonkin Gulf Resolution. Young leftists were left with an impression of their country as inherently stained, corrupt, and irreparable, thus bringing an end to the leftist reformism that defined the so-called Progressive Era. In place of reforms, the New Left called for revolution. This type of thinking is perhaps best illustrated in Rorty’s critique of Christopher Lasch’s 1969 polemic, The Agony of the American Left, a book which “made it easy to stop thinking of oneself as a member of a community, as a citizen with civil responsibilities. For if you turn out to be living in an evil empire (rather than, as you had been told, a democracy fighting an evil empire), then you have no responsibility to your country; you are accountable only to humanity” (66).

For those readers who are most curious about Rorty’s anticipation of Trump, skip to the third and final lecture, “A Cultural Left.” This lecture will be a bitter pill for many young academics to swallow, ensconced as they are in the rhetoric of identity politics (I write this as a recent humanities Ph.D. dropout). Rorty’s diagnosis of the current malaise requires something the cultural Left deplores: nuance. While he praises the advances the New Left has made in curbing the dominance of stigma and sadism (racism, sexism, homophobia, etc.) in American culture, he draws attention to the glaring fact that, while we have made much headway in social equality since the 1960s, economic inequality has increased in tandem. It’s as if the Left lacks the focus to address dual initiatives. While recent achievements in socio-cultural progress have been nothing short of heroic, by taking our eyes off the fight for socialist economic reforms, we were effectively asleep at the wheel in the 1980s as the Reaganite neo-liberals waged their campaign of annihilation on the meager welfare state built by the reformist Left. By the end of the century, we had also lost the minds of the voters, and the only Democrats who stood a chance at national election were the advocates of lukewarm centrism like Bill Clinton.

So, the reader must ask, if the Left has proven historically unable to pursue multiple objectives simultaneously, then which is more important: cultural or economic reforms? Rorty, while not dismissing the fight against all forms of cultural prejudice, warns that a globalized economy run by an entrepreneurial elite is far more likely to upend the American political system and augur an authoritarian future. For, as he rightly points out, the Progressive Era champions of socialist economic reforms—labor union members, farmers, unskilled workers—undoubtedly included many racists, sexists, and homophobes. However, by coalescing those groups around the banner of shared economic interests, leftists were able to achieve upward social mobility for all Americans. By contrast, in the absence of a strong, politicized movement for workers’ rights and fair wages, the white working class, prone as we all are to tribalism, will inevitably fall under the spell of populism. Enter Trump:

…members of labor unions, and unorganized unskilled workers, will sooner or later realize that their government is not even trying to prevent wages from sinking or to prevent jobs from being exported. Around the same time, they will realize that suburban white-collar workers—themselves desperately afraid of being downsized—are not going to let themselves be taxed to provide social benefits for anyone else.

At that point, something will crack. The nonsuburban electorate will decide that the system has failed and start looking around for a strongman to vote for—someone willing to assure them that, once he is elected, the smug bureaucrats, tricky lawyers, overpaid bond salesmen, and postmodernist professors will no longer be calling the shots. A scenario like that of Sinclair Lewis’ novel It Can’t Happen Here may then be played out. For once such a strongman takes office, nobody can predict what will happen. In 1932, most of the predictions made about what would happen if Hindenburg named Hitler chancellor were wildly optimistic. (90)

The election of Donald Trump is a travesty, though one we should have seen coming. My initial reaction (disgust, hopelessness, defeat) is emblematic of the dangerous sense of resignation that made it possible for the far Right to consume every branch of the American government in the first place. Those of us on the Left now have two options: give in to our anguish and become spectators to the dismantling of a century’s worth of progress, or once again become agents for change. We should never forget the more sinister chapters of America’s past, but in recognizing the potential for even darker days ahead, our energy is best spent not in doleful reflection, but in working to achieve a country worthier of our pride. As Rorty writes, “we should not let the abstractly described best be the enemy of the better” (105).

Perhaps mercifully, Richard Rorty did not live to see the events he predicted transpire. He finished his career at Stanford University, and died in 2007 at the age of 75.