Humanities Hub

Jan. 6, 2021: “Mad King George,” or Charles the First?

Anonymous White House insider reports claim that Donald Trump is behaving like “Mad King George.” In private, maybe he is; in public, though, on Wednesday, January 6, 2021, he acted like Charles the First. Less familiar to Americans today, Charles’s actions in 1640s England inform the eighteenth-century Constitution of the United States.

Infuriated by Puritans in the House of Commons, Charles the First told the Lord Mayor of London not to call up any troops, and, in the absence of any armed opposition, led a march across the capital city, from Whitehall to Westminster. There, Charles entered Parliament with a group of 80 armed soldiers, and attempted to arrest five prominent members of Parliament who were opposed to him. A marching progress across London was no secret to Parliament, so the Five Members, as they became known, had already evacuated to a secure location, a barge in the Thames, famously undisclosed by the Speaker of the House, William Lenthall, who told the King, “I have neither eyes to see nor tongue to speak in this place but as this House is pleased to direct me.”*

The king had entered Parliament, attempted to arrest legislators, and failed. It was a stupendous overreach, and disastrous. Within months, England was at war with itself; within years, Charles had been executed by Parliament, and England became a republic. No reigning monarch has entered the House of Commons since that day in 1642. Remarkably, that day was January 4th, technically still 1641 on the old-style Julian calendar the United Kingdom (and its colonies) used until 1752, under which the new year did not begin until March 25. That is, Charles the First laid siege to Parliament 379 years, almost to the day, before Donald Trump told a “Save America March” rally crowd that was chanting “Fight for Trump! Fight for Trump!,” “after this, we’re going to walk down and I’ll be there with you. We’re going to walk down–We’re going to walk down. Anyone you want, but I think right here, we’re going to walk down to the Capitol.”

Of course, there are differences between the two events. For one thing, as it turned out, Donald Trump was not there with them; he was back in the White House, watching it all play out on TV. Charles, by contrast, actually led troops into Parliament, and kept them out of the Commons when he went in. In 1642, it was King Charles who sat in the Speaker’s Chair, though only after asking to do so; in 2021, it was members of the pro-Trump mob who, having chased Congress out of its chambers, sat for selfies in the Speaker’s Chair. Charles and his troops did not damage Parliament; if anything, they damaged the monarchy. Those who stormed Congress, by contrast, smashed windows, tore down signs, and, reportedly, relieved—one might say “expressed”—themselves in the hallways and offices of elected representatives. And, in 1642, no one died.

We are all still sorting out what else Trump and his supporters damaged when they laid siege to the Capitol—possibilities include art works, carpeting, desks, democracy, international reputation, and the future itself. What happened between the White House and the Capitol on the afternoon of January 6 is not unprecedented; it is instead loaded with precedents, which is what makes it so powerfully significant. Every event is unique, and developing a completely accurate picture—every angle of every participant before, during, and after, plus all the angles not available to participants—is beyond us. The events of 1642 vex historians to this day, at a four-century remove. What used to be called the Puritan Revolution became known as the English Revolution, then the English Civil War, then the English Civil Wars, and, most recently, the Wars of the Three Kingdoms. In any case, by definition, historical analogies cannot be exact, and, therefore, in that sense, history cannot be a guide. However, the examination of similarity and difference is one of the advantages of analogies; they offer a means of cumulative measurement.

The most important disadvantage of historical analogy in particular is the contextual narrative in which the historical example is embedded, and which it thus carries with it. In the familiar narrative, for example, George the Third precedes a Revolution. As we saw on Wednesday the 6th, the siege takers saw themselves in George the Third terms: taking on the narrative implied by the analogy, they were re-doing the American Revolution, complete with “Don’t Tread on Me,” and with Mike Pence and Nancy Pelosi as the malevolent tyrants, apparently. (And like Charles, they arrived ready to arrest their five, albeit with plastic zip ties this time.) Here, too, the glaring inequity in the treatment of Black Lives Matter protests and of the Capitol siege involves the same racist logic seen in the analogical eighteenth-century ‘revolution’ against the Mad King George. As Samuel Johnson noted at the time, in his 1775 essay Taxation No Tyranny, “we hear the loudest yelps for liberty among the drivers of negroes.” Such iniquitous treatment in the defense of freedom is not unprecedented; it is built into the analogy. Nor, indeed, is racialized mob violence unprecedented. What happened to Congress on Wednesday, January 6, 2021, has happened at countless county courthouses in the Jim Crow South, i.e., in America. It just hadn’t happened at Congress before. Fortunately, the stormtroopers were not able to find the people they apparently wanted to place in the nooses and on the scaffold they set up around the building. This time.

Despite all the disadvantages of the extraordinarily complicated contextual narrative of Charles the First (two civil wars, execution, Irish conquest, republic, Restoration, and thus interregnum), that analogy does highlight a central constitutional issue, then and now. The January 6 siege not only succeeded in delaying, if only for a few hours, Congress’s certifying the election results, and thus in breaking the peaceful transfer of power in the US at the federal level. By doing so, it also obscured the fact that the executive branch supported—seemingly even incited—an attack on the legislative branch, thus breaking the separation of powers at the heart of the US Constitution. Presumably permanently: if not formally rebuked, all future presidents can now hold rallies outside Congress, demanding their followers go there, “fight,” and be “strong,” if there’s a vote whose results they don’t like.

Since 1642, no reigning monarch has entered the House of Commons, so clearly was Charles’s arrival understood to be a violation of the separation of powers. The framers, seeing the English experience with inherited monarchy, tweaked the separation of powers implicit in the English system: courts, a bicameral legislature, and an executive. It was, so to speak, an analogy, a salvage effort. As with all analogies, differences were also highlighted. Instead of a king, a President is elected, which is to say removable, without the execution that ended Charles’s reign. Unfortunately, the Constitutional Convention created, and we have retained, an indirect and thus analogically aristocratic form of electing the head of the executive branch, and it was that very feature of the eighteenth-century Constitution (in its late nineteenth-century codification) which was in process at the time of the January 6 attack. The role of the electoral college will continue to be reexamined as a result. Through it all, though, the president of the United States has always required an invitation before addressing Congress (even for the State of the Union address mandated by the Constitution), a separation-of-powers legacy of the English system, and of Charles’s shocking disregard for it. On Wednesday, a president’s supporters took the building over, after he told them he would be there with them when they got there. It turns out that last bit was fake news, straight from the head of the executive branch, but it sure looks like they went in to represent him. The case of Charles the First means that both the integrity of US democratic elections and the constitutional separation of powers are at stake in the aftermath of the invasion of the Capitol from the other end of Pennsylvania Avenue.

 

*Schama, Simon. A History of Britain, Volume II: The Wars of the British, 1603–1776 (New York: Hyperion, 2001), 123.

January 6, 2021, Epiphany.

[Author’s note: Lee Morrissey, Founding Director of the Humanities Hub, and English Professor at Clemson, on media, civil war, and democracy in seventeenth-century England, and early 2021. This is Clemson Humanities Now.]

In 1995, when I was still a graduate student, I was lucky enough to get a job as an assistant professor of English, and I moved to Clemson from Manhattan, where I had been studying. Over the years, people have understandably inquired about that transition, sometimes out of curiosity, sometimes with a presumptuous knowing chuckle. Of course, there were many differences between my experiences in the nation’s largest city and in a town of 11,000 year-round residents, and some of them were improvements (if only because living on a salary in Clemson is better than living on a $10,000 stipend in upper Manhattan).

One difference, though, that no one has been able to anticipate has weighed on me for decades now—the utopic excitement about the World Wide Web that I encountered when I joined Clemson’s Department of English. The internet was twenty years old when I started at Clemson, although the World Wide Web was nearly brand new. A swath of my new colleagues were convinced not only that it represented the future, which it did, but more importantly that it boded well for democracy.

My own research in English literature spanned the seventeenth and eighteenth centuries, a period that runs from Civil Wars (culminating in the execution of a monarch) to the Enlightenment. In other words, my ‘field’ covered the protracted emergence of representative democracy from the ashes of a civil war and interregnum. As a result, my colleagues’ faith in this new technology struck me as naïve at best, even as I shared their hopes for democracy.

Nothing about the democratization of print in the 1640s suggested that the World Wide Web was going to be an unvarnished good for democracy as my colleagues understood it. The experience with my colleagues changed my career: I wrote a book—The Constitution of Literature (Stanford UP, 2008)–on the topic, telling the story across nearly a century and a half, exploring different ideas of democracy as they unfolded in that period, and arguing in the end that “today’s discussion of the World Wide Web needs to consider each of these facets of the development of literary criticism. For the potential political instability, the possible emergence of critical protocols, and debates over models of democracy will all unfold for digital media, and the constitution of literature can provide a powerful, instructive analogue” (197).

Legacy broadcast media, post-Fairness Doctrine, played their part, but the impact and tenor of the Trump presidency have been facilitated by the openness and anonymity of the World Wide Web, especially what was once called Web 2.0, the web that ‘you’ could create, which was, it was said, the most democratic form (non-linear and non-hierarchical, they said) of this most democratic technology. At the time, pointing out that there might be a problem with such an understanding of democracy was mistakenly seen as conservative, even though one could already see, first, that the technology had system requirements that priced many people out of participation; second, that the architecture was shaping the so-called choices; and third, that the algorithms were replicating real-world biases (even as anonymous ‘users’ were enforcing them).

As those of us–not so many of us, perhaps–who study seventeenth-century English history and literature know, the English Civil Wars coincided with, and were accelerated by, a new openness and accessibility of print. Eventually, the English political system stabilized, and a series of legal conventions (copyright, free speech protections balanced by defamation accountability, etc.) developed to stabilize the political effects of print, too. As I wrote in The Constitution of Literature, “Eighteenth-century literary critics may have prioritized stability over openness. But early English literary critics describe their visions of reading against the backdrop of the profound instability created, they believe, by other modes of reading. It must also be said that stability is likely preferable to protracted Civil War; even the most libertarian digerati is probably concerned about the democratizing effect the internet seems to have offered global Jihadists. Moreover, within that eighteenth-century preference for stability there is a wide range of differing opinions on how to organize the constitution of literature, on how to understand literacy and political philosophy, and even on how to understand stability itself” (196-7).

This analogue, from print, has been on my mind throughout the Trump presidency, increasingly during 2020, but never more so than on January 6, 2021, Epiphany, when Twitter, Facebook, and YouTube banned, albeit temporarily, the President of the United States from participating in the Web 2.0. In order to preserve one form of democracy—the deliberative, representative, and established one—these companies needed to suspend their form of anti-procedural, participatory, rules-free democracy. Or, put another way: in the end, the democracy offered by Web 2.0 had been revealed as a threat to democracy IRL. Of course, we had long suspected that it was a threat, truthiness and whataboutism having replaced reflection, even before the 2016 election hack by Cambridge Analytica and Russia’s IRA. But this is what made January 6, 2021 an epiphany: a new world is struggling to be born, and the supposedly democratizing technologies were enlisted in preventing it (in a seemingly non-linear and non-hierarchical way). Imagine if the three wise men had been sent to Bethlehem by Herod, Twitter, and GPS to trample through the inn, scare away the family, and take selfies.

The question, unfortunately, is whether yesterday represents an end of such behavior or its beginning, the culmination or the starter’s gun. Which epiphany or revelation it turns out to be will depend not only on whether we treat the president as if he were merely the country’s ‘cranky uncle.’ It will also depend on a reexamination of technologies of representation. The good news is that the seventeenth- and eighteenth-century example shows us we have been here before. Regrettably for us, it took England a century before the final Jacobite Rebellion was quelled, and nearly as long for early copyright laws to be created. Fortunately, though, we have the lessons of their example to draw on; there are vaccines, already developed for print, offering readily adaptable treatments today.

Earlier, I mentioned how arriving at Clemson from Manhattan and finding colleagues in an English Department who held techno-utopic hopes for the democratic possibilities of digitized representation changed my career. I mean, for the better. If you had asked me in my earliest post-doctoral years what I expected to be researching, I would have told you about a continuation of my dissertation, on authors who were also architects, topics in which I remain interested. It would be several years before The Constitution of Literature formed as a project for me. It is today my most-cited monograph, and it would not exist had I not been so surprised by my colleagues’ conviction that we were about to enter a golden age of the democratic word. Two decades later, the storming of the Capitol yesterday, by people egged on by the president, and by QAnon, shows how much we all need to sift these new media. My own experience, studying civil war and interregnum England, also shows how much these media were not new.

“The Hidden Pandemic”

Kapreece Dorrah, a junior English major from Greenville, South Carolina, is a member of the We CU Video Campaign, for which he made this video poem. This is Clemson Humanities Now.

How do we do research in the humanities now?

(Director’s note: Gabriel Hankins teaches English literature and digital humanities in the Department of English at Clemson University. His research interests include modernism and digital literary studies. His book, Interwar Modernism and Liberal World Order, has been published by Cambridge University Press. He is co-editor of a new series on Digital Literary Studies, also published by Cambridge University Press. As he and I corresponded this summer about online teaching tools suitable for close reading and group discussions, I asked him if he might contribute to this series. He said yes, and this is the result. This is Clemson Humanities Now.)

How do we do research in the humanities now?

Once, this might have been the title of a quietly confident survey of changing research practices in the humanities, some gradual, some sudden, with the confidence that familiar pathways and byways of humanities research remain largely unchanged. That confidence in the stability and continuity of the present has ended. Now the question has a slight tone of desperation: how are we, in the conditions imposed by a viral pandemic, going to do our research? Particularly in historical fields built around books, libraries, and archives, not to mention conferences, the conditions of our work seem to have changed utterly.

My response to the titular question is contradictory, like life in the present crisis, but I hope we can learn something by thinking through the contradictions.

We are all digital researchers now.

Within the digital humanities community, this is a familiar claim, made by Lincoln Mullen among others. I have refined a version of the claim for modernist studies in particular, in which I argue that literary fields are now contiguous with “digital humanities” methods and approaches along all their fractal boundaries. But, as I conclude in that essay, if we are all digital humanists now, we don’t yet know it. In the time of coronavirus, that knowledge is inescapable. None of our teaching, research, or service can escape a confrontation with digital mediation and its consequences. How we confront the digital mediation of our fields cannot be predetermined: in some modes, the dialectic of the digital and the humanist plays out in terms of conflict, resistance, or refusal.

The middle matters: digital mediation is a problem of proximity, distance, and transmission.

Since T. S. Eliot at least, literary studies has seen its problem as that of positioning the individual experience in relation to inherited traditions, cultures, and texts, even if those texts and cultures are “inherited” from our contemporaries. Our method of constituting our traditions has everything to do with the historical moment we are in: Eliot’s particular mode of historical thinking arose out of a crisis in Western political order, a chaotic war, and a viral pandemic that recalls our own. At first the pandemic might seem to separate us from the archives, libraries, and conferences that formed the structure of research in the humanities, as it separates us from a livable past and a thinkable present. Digital mediation of those resources is experienced as distance and remove from the original. But in several senses digital archives and methods bring us unbearably close to our objects and interests, even as they always transmit, mediate, and thereby omit something of the original. All of Shakespeare is now available to the panoptic queries of the researcher; even more unbearably, the entire corpus of eighteenth-century Anglophone fiction, or nineteenth-century women novelists, or detective fictions of the 1930s. Certain archives, too, are available in digital form as they never were before – I am thinking here of queer fanfictions, but also the letters of Robert Southey, uniquely captured in digital form. The problem of researching these materials opens onto the more general problem of a self-reflective research practice, in which the modes and materials by which we do our work do not simply disappear in the background.

Digital research has to begin and end with the questions, objects, and interests of the humanities.

This may seem obvious, but it has not been clear for some generations of digital researchers. Our research questions and interests must rise out of the vibrant cultures, languages, traditions, and materials that inspire our work. We can and must think through the implications of our current digital condition, and the ways in which digital mediations of all kinds shape the questions we can ask and the canons we consider. But our questions should remain framed by our attachment to the primary questions of the humanities: what are the ways in which we make meaning, how is that meaning grounded in a history, a time and a place, a specific culture? How does a material history, a technological moment, a set of canonical texts, determine and enliven our response to a given textual and historical situation? What do we want our always-already digital research to transmit from the past to a more human future? In answering these questions, we move from the anxiety of a crisis-ridden present to our grounding in a meaningful past and our work towards a livable future.

Living in History

(Director’s note: Walt Hunter, author of Forms of a World: Contemporary Poetry and the Making of Globalization (Fordham UP, 2019) and co-translator, with Lindsay Turner, of Frédéric Neyrat’s Atopias: Manifesto for a Radical Existentialism (Fordham UP, 2017), is an associate professor of World Literature and associate chair of Clemson English.  Current events have him thinking about this moment in history, our moment in history, and our relationship to it.)

Social scientists and economists have singled out 1973 as a pivotal year in the history of global capitalism—the end of the post-war, Bretton Woods model of “embedded liberalism” and the beginning of the “long downturn”—but could the humanities add something new to this narrative? I’m interested in the cultural shifts in literature and art in the early 1970s that coincide with the end of what the French call the trente glorieuses, the years before the current period of globalization and of dramatically increased inequality.

One of the relatively minor, and yet, for me, ultimately illuminating events of 1973 is the publication of a book of sonnets by Robert Lowell called History. In 1969 Lowell had published many of these sonnets in a collection called Notebook that, four years and two editions later, became History. “I am learning to live in history,” Lowell writes, in the second edition, in the fifth poem of a twelve-part sonnet sequence called “Mexico.” “What is history? What you cannot touch.”

Lowell’s reflection in this particular poem is prompted by the passing of the end of the day: the “full sun,” the “silhouetting sunset,” then the “undimmed lights of the passing car.” Typical of Lowell’s sonnets, the intimacy of personal experience occasions a reflection on large-scale processes: though he’s often received as a confessional poet, Lowell’s style in fact reflects his globalized present. The question “what is history?” comes to life as part of a larger American cultural, historical, political, and economic predicament in the early 70s.

Admittedly, the sonnet is probably not the first cultural form that springs to mind when we think about the late 1960s and early 1970s. 1973 was not a marquee year for the sonnet—though it was for Gravity’s Rainbow, Dark Side of the Moon, “Killing Me Softly,” The Exorcist, “Love Is a Hurtin’ Thing,” Badlands, Linda Ronstadt, and “Goodbye Yellow Brick Road.” This list has very little in common except a vague sense that something was over and it was now, as Joan Didion writes, the “morning after the 60s.” Americans in the early 1970s suspect that time is out of joint, that their lives fit into their country and their country fits in the world in a different way.

I take Lowell’s comment seriously: what is it to “learn to live in history” and how can a sonnet, of all things, help do that? Whose history is he talking about, this white scion of a patrician New England family? Why is history “what you cannot touch”?

One of the sonnet’s oldest motifs is noli me tangere—literally, “don’t touch me”—which appears in Thomas Wyatt’s mid-sixteenth-century sonnet “Whoso List to Hunt.” The sonnet’s noli me tangere, a phrase Wyatt borrows from the Latin Vulgate, is figured by Wyatt and later sonneteers as the gendered body. Its perspective is one of suspended or thwarted desire—usually, though not always, rooted in a male gaze. The sonnet man is probably a close cousin to the “longing man,” as Merve Emre describes this enduringly odious figure.

But in Notebook, history becomes the object of the sonnet instead. In this way, Lowell speaks quite clearly to the present, his and ours. Living with the “I” in history remains, perhaps, one of the most intractable questions for the United States in 2020. Later, Claudia Rankine will write this through in a different way, while nodding at Lowell: “drag that first person out of the social death of history,” she writes in Citizen: An American Lyric, “then we’re kin.”

The sonnet is not a form in which history lives comfortably. It’s a pretty room or a narrow cell, the place where poets argue with themselves. The schematic grace of the sonnet allows for two positions to be reconciled, or to be held in tension. And the sonnet’s closure, if it comes, is not the closure of an event, but the closure of logic or rhyme. When something happens in the sonnet, it is often a product of the energy created by the poem’s own forces, married to the poet’s thoughts and feelings. While the sonnet sequence might elaborate the emotional conditions in which an “I” is forged, we leave it to Tasso’s ottava rima stanza or Milton’s blank verse to provide the historical sweep.

So when Lowell writes a sonnet, he already knows that it’s not going to accommodate historical narratives very easily—which makes the sonnet’s aversion to history perfectly compatible with its history of adversion. But there’s also something about history itself which, for Lowell, is untouchable, even though he must learn to live in it.

Writing a sonnet can require some grappling with the long and cumbrous European history of the poetic form itself. Traveling to English from Italian, the sonnet loosens up dramatically in English, which lacks the availability of Italian rhymes. Sonnets by Wyatt, Wroth, Wordsworth, Meredith, Frost, McKay, and Brooks are closer to the language as it’s spoken than sonnets by Dante or Petrarch. The sonnet always arrives, in other words, with a lot of baggage—not the least of which is its co-emergence with early modern capitalism and colonialism in Europe.

Lowell struggles with the sonnet’s grandiosity, flattening its music out and grounding its images in material from “the abundance of reality.” He puts it this way at the end of Notebook (1970):

My meter, fourteen-line unrhymed blank verse sections, is fairly strict at first and elsewhere, but often corrupts in single lines to the freedom of prose. Even with this license, I fear I have failed to avoid the themes and gigantism of the sonnet.

I have been thinking so far about the sonnet as a way of dramatizing Lowell’s discomfort with making history while learning to live in it. This is a predicament that, I think, is only more pressing for US artists today. Given the US influence in remaking the world within its own global framework, how can one shape that history from a position that is effectively inside it, complicit with it, part of its continuation? For Lowell, the sonnet, with its own elaborate history in poetry, becomes a convenient vehicle on which the poet can displace his anxiety about his position and privilege in American history.

For the humanities, grappling with the history of poetry can be an effective way of studying history writ large. So one question that came to me as I was writing this piece was not what the true facts are, but what the right forms might be. This could be the question that animates Rankine’s Citizen, too, as it situates everyday moments of racism within its “American lyric,” and Terrance Hayes’s American Sonnets for My Past and Future Assassin, a sequence of 70 sonnets written immediately after the election of Donald Trump in 2016. To “touch” history might be to possess it, as a new national myth; it might be to touch it up, as an old national lie. But the process of learning to live in history, to deal with what happened honestly, is to embrace one’s own role—not in remaking the past, but in continuing to make a future.

 

 

“One of the best living American writers,” and a Clemson English alum

Director’s note: A few years ago, Clemson English MA alum Ron Rash generously spoke at a celebration for the 50th anniversary of the Clemson MA in English program. I remember him saying something about how, when he was a graduate student at Clemson, “people thought we were part of a cult, because we were always carrying books and reading.” Well, reading is central to the Humanities, and it seems to have stood Ron in good stead. In this review of his new book, Janet Maslin of the New York Times–the paper of record–describes Ron as “one of the best living American writers.” This is Clemson Humanities Now.

Still Made in China

(Director’s note: Maria Bose is assistant professor of media and cultural studies in the Clemson English department. Her current project is “Cinema’s Hegemony,” a cultural-materialist account of cinema’s primacy to an unfolding phase of Asia-led global political economy.) 

Midway through Age of Extinction (Paramount, 2014), Michael Bay’s fourth installment for the Transformers franchise’s live-action film series, robotics magnate Joshua Joyce (Stanley Tucci) mistakes amateur inventor Cade Yeager (Mark Wahlberg) and race car driver Shane Dyson (Jack Reynor) for “grease monkeys” in his Chicago facility. There, Joyce has recently authorized the Chinese manufacture of a fleet of Galvatrons: sentient, modular automobiles-cum-super-soldiers reverse engineered from the “living metal” out of which the Transformer robots are composed. Irked to discover two low-level technicians in his pristine showroom, Joyce challenges Yeager and Dyson to define the products on which, he believes, they are all at work. “What do you think it is that we make here?” the CEO asks. “We make poetry. We’re poets.” Later, Joyce learns that his outsourced Galvatrons lack “souls,” their corruption and insurgency implicated in Chinese assembly. Having modeled the robots on benevolent, democratic Autobot leader Optimus Prime, Joyce is baffled by their similarity to evil, authoritarian Decepticon leader Megatron. Faced with this discrepancy—between soulful poetry and corrupt machinery, American design and Chinese execution—Joyce’s metaphors begin to harden. “Simple coding. Algorithms! Math! Why can’t we make what we want to make, the way we want to make it?”

Age of Extinction’s lurking resentment at shifts in the global balance of power away from the US and toward China is nearly lost in the film’s explicit narrative of US-China collaboration. The American Joyce is volubly enamored of his Chinese associate Su Yueming (Li Bingbing), who defends him valiantly in the film’s final Hong Kong sequence. But read as a reflexive parable, Age of Extinction’s preoccupation with production’s shifting centers appears immanently self-theorizing and complexly historical, offering both an index of Bay’s frustrated “poetic” autonomy in co-producing the film with Jiaflix Enterprises and the China Movie Channel, and also a broad systemic critique of US hegemony’s decline in conjunction with a “rising China.” Age of Extinction opens with Yeager purchasing analog film projectors from a shuttered Texas movie theater where Optimus Prime lies in hiding, disguised as a semi-truck manufactured by the now-defunct American brand, Marmon. Walking past a poster for Howard Hawks’ El Dorado (Paramount, 1966), the theater owner laments the current state of cinema. “The movies nowadays, that’s the trouble,” the old man sighs. “Sequels and remakes, bunch of crap.” If Joyce articulates Bay’s anxiety that his “poetry” will, by dint of global media conglomeration, runaway production, and foreign-market pandering, be reduced to “crap,” then Yeager simultaneously locates the source of that anxiety in late-twentieth-century US deindustrialization and binds it to the film’s nostalgia for a contemporaneous era of American studio auteurism putatively immune to conglomerate oversight, new-media competition, and foreign-market pressure. Surprisingly, then, given the film’s outpouring of nationalist bombast, Age of Extinction is less neoliberal-imperial jingo than neo-protectionist post-imperial swansong, less a refutation of US hegemony’s unraveling than its strategic diagnosis. Age of Extinction’s moral and geopolitical lesson is that global industrialists like Joyce are ultimately responsible for disenfranchising hardworking Americans like Yeager—corporate hardliners who offshore jobs in the automobile sector to China and speed blue-collar American workers’ deskilling and redundancy in the advent of automation. The film is willing to risk these admissions—driven, in fact, to solicit a rigorous account of American declension and the breakdown of the liberal-democratic project—because they serve its corporate-existential stakes, which are, on the one hand, to align the fates of state and cinema by synchronizing its account of American decline with that of Hollywood’s progressive evanescence in an increasingly diversified global media complex and, on the other, to motivate that synchronization toward Bay’s and Paramount’s auteurist self-promotion.

Since Age of Extinction’s release, the US-China tension it registers has only escalated, fueled by a hawkish Trump administration bent on hobbling Chinese firms’ forays into European markets and on portraying Chinese authoritarianism as the single greatest threat to US democracy. Amid a prolonged trade war and an unprecedented global downturn triggered by the coronavirus pandemic, American CEOs like Joyce and Hollywood filmmakers like Bay have been castigated for ongoing collaboration with Beijing. The film’s underlying diagnosis of the “Chinese threat” and its nostalgic appeal to the competitive, autonomous US industrial sector foreclosed by deindustrialization are thus entirely familiar. Across party lines, presidential candidates have called for US manufacturers to maximize native production. “We’re bringing it back,” Trump said at a Ford plant in Michigan earlier this year. “The global pandemic has proven once and for all that to be a strong nation, America must be a manufacturing nation.” That statement captures both the promise of closing America’s gaping trade deficit with China and the threat of global supply chains’ overexposure in Chinese markets increasingly susceptible to rolling shutdowns and political oversight.

But despite Trump’s longing for a new “golden age” of industrialization and his apparent willingness to confront China on matters of trade—in less than two years Trump has imposed sanctions and tariffs on Chinese imports, revoked Hong Kong’s special trade status, pressured European allies to ostracize Chinese firms, and offered inducements to lure investors away from China—manufacturing job-growth in America has barely kept pace with the total workforce. At the same time, and in the recent course of the pandemic especially, China’s export shares have risen, a testament to the indispensability, competitiveness, and innovation of the nation’s industrial base. And while there is evidence that many firms plan to relocate some or all of their manufacturing facilities away from China, there is also widespread concern that manufacturing itself is swiftly becoming less labor intensive, as low-tech industries transition to the automated and robotic systems for which China is the largest market.

At a moment of reckoning for American manufacturing, and the reassessment of just-in-time global supply chains, then, calls to divest from “Factory Asia” unfold ideological debates about the fate of the global political order. But these debates carry significance less in terms of the overblown contest between “American democracy” and “Chinese authoritarianism” (pace Trump, or Bay). Rather, their significance lies in the exfoliation of far subtler processes of hegemonic rebalancing involving America’s wary retreat from, and China’s equally wary advance upon, a vertiginous neoliberal system, as both nations confront the potential boons and perils of a return to—or the continuation of—an anachronistic state-led capitalism. More likely than serious geo-economic overhaul is the emergence of regional hegemonies ordered to suit the state’s immediate desire for market, political, and military supremacy (China seeks such supremacy in the Indo-Pacific; America plans to carve it out by further industrializing hubs in Mexico and Canada). Perhaps, then, it’s only a matter of time before we can “make what we want to make, the way we want to make it.” For now, it’s all still made in China.

The Humanities as meaning-making

(Director’s note: David Sweeney Coombs is associate professor and Director of Undergraduate Studies in the Clemson English Department. He is the author of Reading with the Senses in Victorian Literature and Science, published in 2019 by University of Virginia Press.  This is Clemson Humanities Now.)

“A dreadful Plague in London was,
In the Year Sixty Five
Which swept an Hundred Thousand Souls
Away; yet I alive!”

These lines appear as the conclusion of Daniel Defoe’s Journal of the Plague Year, appended as a kind of epigram on the record of the fictitious author H. F.’s impressions of the bubonic plague epidemic in London in 1665. “Yet I alive!”: at once minimized in comparison to the preceding lines and the scale of the disaster they invoke and yet granted the importance of serving as the quatrain’s parting words, this concluding expression of baffled triumph and gratitude distills the confused sense of H. F. throughout the rest of the narrative that his survival is thanks to an inextricable combination of prudence, providence, and sheer dumb luck. Even though this is the end of the line for the novel, and this epigram might also be justly described as, in a sense, an epitaph, a question hangs over H. F.’s affirmation of having outlived the plague: I survived. Why, and what do I do now?

It is, of course, too early in the Covid-19 pandemic for us to ask ourselves this question in all earnestness. As I write, case-counts in South Carolina have been rising rapidly for several weeks and Clemson has just announced that it will delay bringing students back to campus until September. But if we can’t yet congratulate ourselves on being survivors, reading Defoe’s novel with the pandemic ongoing is nonetheless an occasion to ask what it means to survive and what survival might look like. Covid-19 has given that question a new urgency for us all as individuals, but it has also given a sharper ring to a version of that question that had already begun being sounded in American universities in the last few years even before the upheavals created by the novel coronavirus: can the academic humanities survive?

Enrollments in the academic humanities have fallen dramatically since the 2008 Financial Crisis. Nationally, the declines have been particularly sharp in the core disciplines that together represent the Humanities at Clemson: English, Languages, Philosophy and Religion, and History. The number of Bachelor’s Degrees awarded in those fields at U.S. universities declined by 17 percent from 2012 to 2015 (according to the American Academy of Arts and Sciences), with the downward trend continuing afterwards. I am now the Director of Undergraduate Studies for the English Department at Clemson, where the number of English majors declined by approximately 25 percent between 2011 and 2018.

The proximate cause of these drops is what Christopher Newfield in The Great Mistake has analyzed as the devolutionary cycle of privatization in American public universities, intensified by the long recession and fiscal austerity after 2008. As Clemson, like other public universities, has become locked into a cycle of slashed state funding, tuition hikes, and attempts to raise revenues and cut costs through partnering with private-sector sponsors, more and more students have entered the university staggering under the pressure to monetize their studies so as to compete for the well-paid jobs requiring a bachelor’s degree. Under that burden, a major in the humanities looks like a risk, or worse, a luxury. While there were some signs that enrollments in the humanities were ticking back up in the last two years, the economic fallout from the pandemic might wipe out those small gains and then some.

One task for humanities scholars at this moment is to take every opportunity to reiterate that the retreat from the humanities in search of supposedly more marketable majors rests on a popular misconception that humanities degrees don’t translate into good jobs. The economic value of Humanities BAs is in fact broadly similar to that of degrees in Business and STEM fields like Biology. A 2018 study found that an average 25-29 year-old Business major earns about $50,000 annually compared to the $45,000 of an average English or History major of the same age, but that Business major is slightly more likely to be unemployed than the History major. When students in one of Clemson colleague Tharon Howard’s courses conducted a survey last year, they found that 90 percent of graduates from Clemson English had found jobs within a year of graduation (and more than 60 percent of them did so within the first three months of graduating). While the coronavirus pandemic is currently destabilizing labor markets in new ways, a humanities degree is likely to continue serving our students’ career prospects well once the recovery is underway. If they were judged by their actual return on investment, as opposed to the caricature of the unemployed Philosophy major, the academic humanities could survive.

But do the humanities deserve to survive? Do they have value that can’t be reduced to the market? Considered intellectually and ethically, what elements of prudence or providence justify their survival as areas of study within the university? The humanities are difficult to discuss, let alone define, in a unified way, but I accept Helen Small’s working definition from The Value of the Humanities:

The humanities study the meaning-making practices of human culture, past and present, focusing on interpretation and critical evaluation, primarily in terms of the individual response and with an ineliminable element of subjectivity.

This definition is perhaps recognizably the work of an English professor, and some historians might not necessarily identify “the meaning-making practices of human culture” as their sole or primary object of study. At the risk of disciplinary chauvinism, though, Small’s placing practices of meaning-making at the center of what we do in the humanities strikes me as capturing something important about them.

When Coluccio Salutati articulated the first modern version of the studia humanitatis in the fourteenth century, his basic criterion for each mode of inquiry’s inclusion was a commitment to studying languages in order to understand the human. The modern humanities, that is, study meaning-making with the aim of self-understanding. That aim might be more or less implicit in practice but it is the humanities’ first and most important justification. While Small’s analysis is too cool to veer into anything with the potential for such embarrassing earnestness, it can’t but hint at one of the tacit promises we make our students: by teaching you how meaning is made, the humanities will also help you to cultivate the interpretive and critical skills that can make your own life more meaningful.

Seen along these lines, the humanities are vulnerable to the degree that their ineliminable element of subjectivity can metastasize, or be perceived to have metastasized. The charge that the humanities have turned away from the world and become a mere solipsistic exercise in personal enrichment is a recurring feature of sharp-elbowed public debates about their relationship with the sciences in particular. Some version of that claim is central to the Matthew Arnold-T.H. Huxley debates about education in the nineteenth century, the C. P. Snow-F. R. Leavis debate about the so-called two cultures of the sciences and the humanities in the 1950s and 1960s, and the Sokal Hoax and the debates it occasioned about post-structuralism and social construction in the 1990s. What tends to be obscured by these debates is that self-understanding, pursued doggedly enough, leads inevitably back to the world of which that self is a part. If the humanities without the sciences (or the social sciences) risk solipsism, their study of meaning-making practices can teach the sciences about the terms they use to study the world. An explicit critical engagement with those terms is in fact the very reason-for-being for multiple humanities sub-disciplines: intellectual history, literature and medicine, the history and philosophy of science, and so on.

The spectacular failures of the U.S. health system in the face of Covid-19 underscore the necessity of attending to that humanities scholarship in understanding what has gone so very wrong here and how we might fix it. The pandemic has given us a searchingly thorough diagnosis of our society: we live in a country riven by stark economic and racial inequality, its state capacity for anything but administering violence hollowed out by decades of disinvestment. In this respect, our world is not so very different from the London of 1665 presented in Defoe’s narrative. Published in 1722 amid the real possibility that another plague epidemic might spread from Marseilles to London, Journal of the Plague Year is an attempt by Defoe to persuade the public to support some of the controversial quarantine measures adopted by the Walpole government (which seems to have been paying him at the time). To accomplish this aim, Defoe put the extensive research into the 1665 plague that he, the real author, had done into the mouth of a fictional author imagined as living through the plague. The result is a strange hybrid of public health reporting, history, and fiction, one that, as Margaret Healy puts it, yields “a complex texture of interwoven relationships between the subjective and the social, the private and the collective—the kernel of any novel.” We might also identify this particular kind of interweaving as something like the kernel of the humanities. Let’s hope they survive.

Remembering Chris Dickey

(Director’s note: In the summer of 2008, I had just arrived at my destination in Istanbul, connected to wifi, checked my email, and saw that I had received a message from Will Cathcart, Clemson English, Class of 2005. In it, Will rightly predicted that Russia was about to invade the country of Georgia, and announced that he was boarding a plane to go interview the then President of Georgia, Mikheil Saakashvili. The invasion announced the arrival of a multipolar world, and established a pattern Russia would repeat in Crimea, Ukraine, and Syria. Will, who has lived in and reported from Georgia ever since, was inducted into the Clemson Alumni Council’s Roaring 10 in 2014. His reporting can be seen in CNN, Foreign Policy, and, most frequently, it seems to me, The Daily Beast. There, his editor was the “renowned foreign correspondent” Christopher Dickey, whose father, James Dickey, author of Deliverance, was a student at Clemson for his first year of college, before serving in World War II (and later graduating from USC). Christopher Dickey, a regular visitor to and speaker at Clemson over the years, passed away on July 16th. In a new post on Lit Hub, Will reflects on the loss of a friend, an advocate, and an editor. This is Clemson Humanities Now.)

The second plague outbreak in my life

(Director’s Note: Lee Morrissey, Founding Director of the Humanities Hub, joined the Clemson English faculty in 1995, moving to Clemson from Manhattan, where, while a graduate student, he had volunteered to create the Archives at an arts center, The Kitchen. From 1995 to 2000, he brought artists–including Ben Neill (who performed with David Wojnarowicz in the 1989 event described below), Carl Stone, Pamela Z, Margaret Leng Tan, and Trimpin–to Clemson in a program he called “Technology and the Contemporary Arts.” This is Clemson Humanities Now.)

This summer, I will turn 56, which means, this summer in particular, I am well into what has been considered an elevated risk group for Covid-19. But it also means, in my case, that I am now living through the second plague outbreak of my lifetime. There has been so much focus nationally on the Spanish flu of 1918, and on other, earlier plagues, including the Black Death in Florence (see Boccaccio’s Decameron) and the bubonic plague in London (see Defoe’s Journal of the Plague Year), but much less attention to the more recent experience of HIV/AIDS. A recent article in the Washington Post reviewed how knowledge gained from HIV/AIDS research is helping with Covid-19 research, but considering that Anthony Fauci first came to national prominence in the 1980s AIDS crisis (making a frenemy of Larry Kramer in the process), you would think that there would be more attention paid to the last time around.

I’ve been wondering recently why AIDS isn’t more in the news, and I am starting to think it’s because it affected minority communities, and, unless you were traveling in those minoritized circles, well, you just did not really know. I grew up in Boston, shared a radio show on WOMR-FM in Provincetown, MA in the summer of 1984, started graduate school in NYC in 1987, and volunteered, from 1987 to 1995, at The Kitchen Center for Video, Music, Dance, Performance, and Literature, creating and organizing their archives. I traveled in those circles. AIDS affected people I knew, early in the disease. By the time I reached Manhattan, in a New York City that does not exist anymore (or, if enough people now move out, yet), AIDS was a defining feature of one of the things that used to attract people to New York: the arts. The AIDS Coalition to Unleash Power (ACT UP) was already active, with intense meetings at the Lesbian and Gay Men’s Community Services Center. Gran Fury, an artists’ collective formed the year after I moved to the city, created one of The Kitchen’s monthly posters, with the words “art is not enough” taking up the center—most—of the page. I remember David Wojnarowicz at The Kitchen, all wiry energy backstage (well, upstairs) before his 1989 performance there, In the Shadow of Forward Motion, and, then, in the performance, realizing that it had the same wiry energy—David offstage had been, that night anyway, David onstage, which was part of his point: silence equaled death, and he was going to speak volumes, producing all the time, constantly. His self-portrait with flames coming out of his left side, the viewer’s right (pointedly highlighting the political mirror), turned out to be realism. I was back home in Boston when news of Ethyl Eichelberger’s death reached me via the New York Times, and I won’t forget the frustration of trying to explain to my parents over breakfast why the loss of someone reimagining the Classics in drag and garish makeup, with self-consciously schlocky accordion songs, might matter to someone else studying literature in the same city. I remember inviting everyone I knew well enough to see John Kelly’s 1991 event, Maybe it’s cold outside, because it nearly wordlessly told a life and death story of learning that one is HIV positive, with a soundtrack by Arvo Pärt. Later that year, on World AIDS Day, A Day Without Art, The Kitchen assembled a broadcast, in part from The Kitchen, and in part from other arts organizations, mixing frank safe sex PSAs with artistic reflection on loss. The standout, for me, was dancer Bill T. Jones, HIV-positive, and having outlived his partner, Arnie Zane, dancing toward his mother, Estelle, while she sang “Walk with me Lord.”

I left New York City, and moved to Clemson, SC, where I have lived practically ever since.  The metaphorical city I had left had, in many ways, died.  Actually died.  C. Carr wrote its eulogy in her book, On Edge: Performance at the End of the Twentieth Century, whose last chapter is titled, “The Bohemian Diaspora.”  In Rent, which has a song called “La Vie Boheme,” Jonathan Larson memorialized it, and the gentrification pressures, which ended it (and on the morning of its first performance he passed away, too, from an undiagnosed aortic tear, not noticed in the several trips he made to his local hospitals, one of which has since been shut down).

There is another, much less well-known, and more abstract, long-form music and theatre piece from the era, Meredith Monk’s Book of Days (1989).  It opens in then-contemporary NYC, in color, as a construction crew blasts a hole in a wall, through which the camera travels, into a largely medieval village, complete with separate quarters for Christians and Jews, filmed largely in black and white.  As a plague breaks out.  There’s a documentary dimension, and contrasting musical and performance styles–martial and spectacular in the Christian white-wearing section, mystical and participatory in the Jewish section of town.   When, in this section, the plague arrives, in the figure of a lone dancer gesturing maniacally, the Christians amass and blame the Jews, despite the intervention of one man who tells them that “everyone is dying.”  A map of the town turns red as the deaths accumulate across its segregated quarters, and the sounds of the singers get more and more menacing.  As can be heard in this rendition of the song, we are listening to the plague itself: about two minutes in, out of those vocalizations, the chorus huffs “we know who you are; we know who you are; we know who you are; ha ha ha; ha ha ha.”  It is a chilling effect to have the plague emerge from the cacophony and claim us.  But that’s what happens.

In 1990, The Kitchen got caught up in the NEA defundings. A decade earlier, as Bill T. Jones recounts in an essay he contributed to The Kitchen Turns 20, a book I edited on the occasion of the first two decades of The Kitchen, the US Information Agency had asked The Kitchen to select US artists to send to a cultural exchange in Bucharest, Romania. But now, after the 1989 fall of the Berlin Wall, the Cold War had ended and the so-called Culture War had broken out, at home. The US is still living with the profound consequences of this redirection. At the time of the NEA defundings, my father called, and said “I thought you were volunteering at a soup kitchen!” Or, in other words, I guess, he was asking “what are you doing there?” That’s a fine question, and I think the answer has to do with being interested in making meaning; the people, the artists, who performed at The Kitchen and other venues in New York wanted to do something meaningful. Like me, they came to New York from somewhere else with that hope. They came from and spoke to overlapping communities, and they wanted to create something, usually ephemeral, filled with allusive, polyvalent, and, I think it’s important to note, embodied meaning. Some of it, a really small portion, left the performers vulnerable to charges of obscenity, but, ironically, usually in works which meant to raise questions about what else we live with, without considering it obscene. All of this took place before the internet, before Ken Starr’s detailed report on President Clinton, and so might strike people today as quite tame. They might also be struck by the convivial friendliness of the experiences in the audience at the time. In the process, or processes, of creating their works, these performers created new communities, sometimes as small as the seating at the original P.S. 122 (a former public school), or at the original Dixon Place (an apartment living room), but sometimes, as with Bill T. Jones, as big as everyone who’s seen any of his several Broadway choreographies.

Comparing then and now, a few things stand out. First, I have not seen much emphasis recently on how fatal HIV was for the first couple of decades of the disease. I think of it all the time now, in part because I saw so much loss at the time, but also because Covid-19 is so much more transmissible than HIV. Imagine if the novel coronavirus were as fatal as AIDS. Actually, don’t. It’s too much to contemplate. Second, in the early days with Covid-19, as doctors started to see patterns in who was most vulnerable to the virus, age, occupation, and then race were the initial factors. Eventually, it became clear that other factors involved the status of one’s immune system, or “underlying health conditions,” and then it was reported that only 12% of the US population was metabolically healthy, meaning not obese, not diabetic, without high blood pressure, etc. It is as if 88% of the population has a different kind of acquired immune deficiency, related to diet (and access to healthy foods), environmental and living conditions (and access to places in which to exercise), and other cultural, meaning economic, meaning political, factors. Third, HIV/AIDS is still with us. Last year, 690,000 people died from AIDS, and 1.7 million people were infected, worldwide. And, while there is PrEP if you think you are at high risk, there is still no vaccine, and no cure, yet. In the US alone, according to the CDC, there were about 38,000 new HIV diagnoses in 2018, most of them in the South.

Fourth, thirty-five years ago, people who were affected by HIV/AIDS were frustrated by a President who would not mention the disease.  At the time, his political party was focused on family values and personal responsibility, which meant a disease cast as gay and preventable with celibacy (and safe sex) needn’t be a national issue, they thought.  Today, the president, of the same party, certainly talks about the current pandemic, but he has spent a lot of time saying that it will go away (it hasn’t yet, almost six months later), and that it is getting better (when it is getting worse).  And personal responsibility has devolved to NOT wearing protection, because, it is argued, nobody can tell another U.S. American how to do anything.  Four decades ago, conservatives blamed the sexual activities of gay people for spreading AIDS, mistakenly thinking it was a gay disease.  Today, as soon as the lockdowns are lifted, apparently heterosexual young people mingle in close proximity, close enough for this pandemic’s transmission–without wearing any protection.  In terms of disease transmission and prevention, they are acting as irresponsibly as Republicans used to allege gay people were, except this time, all they need to do for transmission is stand near each other, unmasked, in, say, a pool, while a beach ball bounces off their heads.

Finally, I think about how much changed in the US because of AIDS activism, a range of activisms. By becoming experts in the disease—no one else would initially, so they did—activists changed healthcare. By highlighting their exclusion from their partners’ hospital beds, activists pressed for recognizing lifelong commitments among lesbian and gay people, known today as gay marriage, and legal nationwide. Most of all, though, because so many of those affected were artists, they could create verbal, visual, musical, and, in the end, emotional responses to living with or being stalked by the disease. There was some hope, then, that there might be angels in America. I can imagine the current Covid-19 pandemic–which is also creating the second great recession, or worse, in just 12 years–is going to have even bigger social consequences, and, I say hopefully, just as positive. If so, though, we are going to have to do the imaginative work of the artists lost to AIDS to create affecting representations of living with and being stalked by both a disease and an economic collapse. Gran Fury said “Art is Not Enough,” but, actually, it is a start.