Humanities Hub

Jan. 6, 2021: “Mad King George,” or Charles the First?

Anonymous White House insider reports claim that Donald Trump is behaving like “Mad King George.” In private, maybe he is; in public, though, on Wednesday, January 6, 2021, he acted like Charles the First. Less familiar to Americans today, Charles’s actions in 1640s England inform the eighteenth-century Constitution of the United States.

Infuriated by Puritans in the House of Commons, Charles the First told the Lord Mayor of London not to call up any troops and, in the absence of any armed opposition, led a march across the capital, from Whitehall to Westminster. There, Charles entered Parliament with a group of 80 armed soldiers and attempted to arrest five prominent members of Parliament who were opposed to him. A marching progress across London was no secret to Parliament, however, so the Five Members, as they became known, had already been evacuated to a secure location, a barge on the Thames, famously left undisclosed by the Speaker of the House, William Lenthall, who told the King, “I have neither eyes to see nor tongue to speak in this place but as this House is pleased to direct me.”*

The king had entered Parliament, attempted to arrest legislators, and failed. It was a stupendous overreach, and a disastrous one. Within months, England was at war with itself; within years, Charles had been executed by Parliament, and England had become a republic. No reigning monarch has entered the House of Commons since that day in 1642. Remarkably, that day was January 4th, technically in 1641 under the Julian calendar the United Kingdom (and its colonies) used until 1752. That is, Charles the First laid siege to Parliament nearly 380 years to the day before Donald Trump told a “Save America March” rally crowd chanting “Fight for Trump! Fight for Trump!,” “after this, we’re going to walk down and I’ll be there with you. We’re going to walk down–We’re going to walk down. Anyone you want, but I think right here, we’re going to walk down to the Capitol.”

Of course, there are differences between the two events. For one thing, as it turned out, Donald Trump was not there with them; he was back in the White House watching it all play out on TV. Charles, by contrast, actually led troops into Parliament, though he kept them out of the Commons chamber when he went in. In 1642, it was King Charles who sat in the Speaker’s chair, but only after asking to do so; in 2021, it was members of the pro-Trump mob who, having chased Congress out of its chambers, sat for selfies in the Speaker’s chair. Charles and his troops did not damage Parliament; if anything, they damaged the monarchy. Those who stormed Congress, on the other hand, smashed windows, tore down signs, and, reportedly, relieved—one might say “expressed”—themselves in the hallways and offices of elected representatives. And, in 1642, no one died.

We are all still sorting out what else Trump and his supporters damaged when they laid siege to the Capitol—possibilities include artworks, carpeting, desks, democracy, international reputation, and the future itself. What happened between the White House and the Capitol on the afternoon of January 6 is not unprecedented; it is instead loaded with precedents, which is what makes it so powerfully significant. Every event is unique, and developing a completely accurate picture—every angle of every participant before, during, and after, plus all the angles not available to participants—is beyond us. The events of 1642 vex historians to this day, at a four-century remove. What used to be called the Puritan Revolution became known as the English Revolution, then the English Civil War, then the English Civil Wars, and, most recently, the Wars of the Three Kingdoms. In any case, by definition, historical analogies cannot be exact, and, therefore, in that sense, history cannot be a guide. The examination of similarity and difference, however, is one of the advantages of analogies; they constitute a means of cumulative measurement.

The most important disadvantage of historical analogy in particular is the contextual narrative in which the historical example is embedded, and which it therefore brings with it. In the familiar narrative, for example, George the Third precedes a Revolution. As we saw on Wednesday the 6th, the besiegers saw themselves in George-the-Third terms: taking on the narrative implied by the analogy, they were re-doing the American Revolution, complete with “Don’t Tread on Me,” and with Mike Pence and Nancy Pelosi, apparently, as the malevolent tyrants. (And like Charles, they arrived ready to arrest their five, albeit with plastic zip ties this time.) Here, too, the glaring inequity in the treatment of Black Lives Matter protests and the Capitol siege involves the same racist logic seen in the analogical eighteenth-century ‘revolution’ against the Mad King George. As Samuel Johnson noted at the time, in his 1775 essay Taxation No Tyranny, “we hear the loudest yelps for liberty among the drivers of negroes.” Such inequitable treatment in the defense of freedom is not unprecedented; it is built into the analogy. Nor, indeed, is racialized mob violence unprecedented. What happened to Congress on Wednesday, January 6, 2021, has happened at countless county courthouses in the Jim Crow South, i.e., in America. It just hadn’t happened at Congress before. Fortunately, the stormtroopers were not able to find the people they apparently wanted to hang from the nooses and scaffold they set up around the building. This time.

Despite all the disadvantages of the extraordinarily complicated contextual narrative of Charles the First (two civil wars, execution, Irish conquest, republic, Restoration, and thus interregnum), that analogy does highlight a central constitutional issue, then and now. The January 6 siege not only succeeded in delaying, if only for a few hours, Congress’s certifying of the election results, and thus in breaking the peaceful transfer of power in the US at the federal level. By doing so, it also obscured the fact that the executive branch supported—seemingly even incited—an attack on the legislative branch, thus breaking the separation of powers at the heart of the US Constitution. Presumably permanently: if the precedent is not formally rebuked, all future presidents can now hold rallies outside Congress, demanding that their followers go there, “fight,” and be “strong,” whenever there is a vote whose results they don’t like.

Since 1642, no reigning monarch has entered the House of Commons: so grave a violation of the separation of powers was Charles’s arrival understood to be. The framers, seeing the English experience with inherited monarchy, tweaked the separation of powers implicit in the English system: courts, a bicameral legislature, and an executive. It was, so to speak, an analogy, a salvage effort. As with all analogies, differences were also highlighted. Instead of a king, a President is elected, which is to say removable, without the execution that ended Charles’s reign. Unfortunately, the Constitutional Convention created, and we have retained, an indirect and thus analogically aristocratic form of electing the head of the executive branch, and it was that very feature of the eighteenth-century Constitution (in its late nineteenth-century codification) which was in process at the time of the January 6 attack. The role of the electoral college will continue to be reexamined as a result. Through it all, though, the president of the United States has always required an invitation before addressing Congress (even for the State of the Union address mandated by the Constitution), a separation-of-powers legacy of the English system, and of Charles’s shocking disregard for it. On Wednesday, a president’s supporters took the building over, after he told them he would be there with them when they got there. It turns out that last bit was fake news, straight from the head of the executive branch, but it sure looks like they went in to represent him. The case of Charles the First means that both the integrity of US democratic elections and the constitutional separation of powers are at stake in the aftermath of the invasion of the Capitol from the other end of Pennsylvania Avenue.


* Simon Schama, A History of Britain, Volume II: The Wars of the British, 1603–1776 (New York: Hyperion, 2001), 123.

“The Hidden Pandemic”

Kapreece Dorrah, a junior English major from Greenville, South Carolina, is a member of the We CU Video Campaign, for which he made this video poem.  This is Clemson Humanities Now.

How do we do research in the humanities now?

(Director’s note: Gabriel Hankins teaches English literature and digital humanities in the Department of English at Clemson University.  His research interests include modernism and digital literary studies.  His book, Interwar Modernism and Liberal World Order, has been published by Cambridge University Press.  He is co-editor of a new series on Digital Literary Studies, also published by Cambridge University Press.  As he and I corresponded this summer about online teaching tools suitable for close reading and group discussions, I asked him if he might contribute to this series.  He said yes, and this is the result.  This is Clemson Humanities Now.)

How do we do research in the humanities now?

Once, this might have been the title of a quietly confident survey of changing research practices in the humanities, some gradual, some sudden, with the confidence that familiar pathways and byways of humanities research remain largely unchanged. That confidence in the stability and continuity of the present has ended. Now the question has a slight tone of desperation: how are we, in the conditions imposed by a viral pandemic, going to do our research? Particularly in historical fields built around books, libraries, and archives, not to mention conferences, the conditions of our work seem to have changed utterly.

My response to the titular question is contradictory, like life in the present crisis, but I hope we can learn something by thinking through the contradictions.

We are all digital researchers now.

Within the digital humanities community, this is a familiar claim, made by Lincoln Mullen among others. I have refined a version of the claim for modernist studies in particular, in which I argue that literary fields are now contiguous with “digital humanities” methods and approaches along all their fractal boundaries. But, as I conclude in that essay, if we are all digital humanists now, we don’t yet know it. In the time of coronavirus, that knowledge is inescapable. None of our teaching, research, or service can escape a confrontation with digital mediation and its consequences. How we confront the digital mediation of our fields cannot be predetermined: in some modes, the dialectic of the digital and the humanist plays out in terms of conflict, resistance, or refusal.

The middle matters: digital mediation is a problem of proximity, distance, and transmission.

Since T. S. Eliot at least, literary studies has seen its problem as that of positioning the individual experience in relation to inherited traditions, cultures, and texts, even if those texts and cultures are “inherited” from our contemporaries. Our method of constituting our traditions has everything to do with the historical moment we are in: Eliot’s particular mode of historical thinking arose out of a crisis in Western political order, a chaotic war, and a viral pandemic that recalls our own. At first the pandemic might seem to separate us from the archives, libraries, and conferences that formed the structure of research in the humanities, as it separates us from a livable past and a thinkable present. Digital mediation of those resources is experienced as distance and remove from the original. But in several senses digital archives and methods bring us unbearably close to our objects and interests, even as they always transmit, mediate, and thereby omit something of the original. All of Shakespeare is now available to the panoptic queries of the researcher; even more unbearably, the entire corpus of eighteenth-century Anglophone fiction, or nineteenth-century women novelists, or detective fictions of the 1930s. Certain archives, too, are available in digital form as they never were before – I am thinking here of queer fanfictions, but also the letters of Robert Southey, uniquely captured in digital form. The problem of researching these materials opens onto the more general problem of a self-reflective research practice, in which the modes and materials by which we do our work do not simply disappear into the background.

Digital research has to begin and end with the questions, objects, and interests of the humanities.

This may seem obvious, but it has not been clear for some generations of digital researchers. Our research questions and interests must rise out of the vibrant cultures, languages, traditions, and materials that inspire our work. We can and must think through the implications of our current digital condition, and the ways in which digital mediations of all kinds shape the questions we can ask and the canons we consider. But our questions should remain framed by our attachment to the primary questions of the humanities: what are the ways in which we make meaning, how is that meaning grounded in a history, a time and a place, a specific culture? How does a material history, a technological moment, a set of canonical texts, determine and enliven our response to a given textual and historical situation? What do we want our always-already digital research to transmit from the past to a more human future? In answering these questions, we move from the anxiety of a crisis-ridden present to our grounding in a meaningful past and our work towards a livable future.

Living in History

(Director’s note: Walt Hunter, author of Forms of a World: Contemporary Poetry and the Making of Globalization (Fordham UP, 2019) and co-translator, with Lindsay Turner, of Frédéric Neyrat’s Atopias: Manifesto for a Radical Existentialism (Fordham UP, 2017), is an associate professor of World Literature and associate chair of Clemson English.  Current events have him thinking about this moment in history, our moment in history, and our relationship to it.)

Social scientists and economists have singled out 1973 as a pivotal year in the history of global capitalism—the end of the post-war, Bretton Woods model of “embedded liberalism” and the beginning of the “long downturn”—but could the humanities add something new to this narrative? I’m interested in the cultural shifts in literature and art in the early 1970s that coincide with the end of what the French call the trente glorieuses, the years before the current period of globalization and of dramatically increased inequality.

One of the relatively minor, and yet, for me, ultimately illuminating events of 1973 is the publication of a book of sonnets by Robert Lowell called History. In 1969 Lowell had published many of these sonnets in a collection called Notebook that, four years and two editions later, became History. “I am learning to live in history,” Lowell writes in the second edition, in the fifth poem of a twelve-part sonnet sequence called “Mexico.” “What is history? What you cannot touch.”

Lowell’s reflection in this particular poem is prompted by the day’s end: the “full sun,” the “silhouetting sunset,” then the “undimmed lights of the passing car.” As is typical of Lowell’s sonnets, the intimacy of personal experience occasions a reflection on large-scale processes: though he’s often received as a confessional poet, Lowell’s style in fact reflects his globalized present. The question “what is history?” comes to life as part of a larger American cultural, historical, political, and economic predicament in the early 70s.

Admittedly, the sonnet is probably not the first cultural form that springs to mind when we think about the late 1960s and early 1970s. 1973 was not a marquee year for the sonnet—though it was for Gravity’s Rainbow, Dark Side of the Moon, “Killing Me Softly,” The Exorcist, “Love Is a Hurtin’ Thing,” Badlands, Linda Ronstadt, and “Goodbye Yellow Brick Road.” The items on this list have very little in common except a vague sense that something was over and it was now, as Joan Didion writes, the “morning after the 60s.” Americans in the early 1970s suspect that time is out of joint, that their lives fit into their country, and their country into the world, in a different way.

I take Lowell’s comment seriously: what is it to “learn to live in history” and how can a sonnet, of all things, help do that? Whose history is he talking about, this white scion of a patrician New England family? Why is history “what you cannot touch”?

One of the sonnet’s oldest motifs is noli me tangere—literally, “don’t touch me”—which appears in Thomas Wyatt’s sixteenth-century sonnet “Whoso List to Hunt.” The sonnet’s noli me tangere, a phrase Wyatt borrows from the Latin Vulgate, is figured by Wyatt and later sonneteers as the gendered body. Its perspective is one of suspended or thwarted desire—usually, though not always, rooted in a male gaze. The sonnet man is probably a close cousin to the “longing man,” as Merve Emre describes this enduringly odious figure.

But in Notebook, history becomes the object of the sonnet instead. In this way, Lowell speaks quite clearly to the present, his and ours. Living with the “I” in history remains, perhaps, one of the most intractable questions for the United States in 2020. Later, Claudia Rankine will write this through in a different way, while nodding at Lowell: “drag that first person out of the social death of history,” she writes in Citizen: An American Lyric, “then we’re kin.”

The sonnet is not a form in which history lives comfortably. It’s a pretty room or a narrow cell, the place where poets argue with themselves. The schematic grace of the sonnet allows for two positions to be reconciled, or to be held in tension. And the sonnet’s closure, if it comes, is not the closure of an event, but the closure of logic or rhyme. When something happens in the sonnet, it is often a product of the energy created by the poem’s own forces, married to the poet’s thoughts and feelings. While the sonnet sequence might elaborate the emotional conditions in which an “I” is forged, we leave it to Tasso’s ottava rima stanza or Milton’s blank verse to provide the historical sweep.

So when Lowell writes a sonnet, he already knows that it’s not going to accommodate historical narratives very easily—which makes the sonnet’s aversion to history perfectly compatible with its history of adversion. But there’s also something about history itself which, for Lowell, is untouchable, even though he must learn to live in it.

Writing a sonnet can require some grappling with the long and cumbrous European history of the poetic form itself. Traveling to English from Italian, the sonnet loosens up dramatically in English, which lacks the availability of Italian rhymes. Sonnets by Wyatt, Wroth, Wordsworth, Meredith, Frost, McKay, and Brooks are closer to the language as it’s spoken than sonnets by Dante or Petrarch. The sonnet always arrives, in other words, with a lot of baggage—not the least of which is its co-emergence with early modern capitalism and colonialism in Europe.

Lowell struggles with the sonnet’s grandiosity, flattening its music out and grounding its images in material from “the abundance of reality.” He puts it this way at the end of Notebook (1970):

My meter, fourteen-line unrhymed blank verse sections, is fairly strict at first and elsewhere, but often corrupts in single lines to the freedom of prose. Even with this license, I fear I have failed to avoid the themes and gigantism of the sonnet.

I have been thinking so far about the sonnet as a way of dramatizing Lowell’s discomfort with making history while learning to live in it. This is a predicament that, I think, is only more pressing for US artists today. Given the US influence in remaking the world within its own global framework, how can one shape that history from a position that is effectively inside it, complicit with it, part of its continuation? For Lowell, the sonnet, with its own elaborate history in poetry, becomes a convenient vehicle on which the poet can displace his anxiety about his position and privilege in American history.

For the humanities, grappling with the history of poetry can be an effective way of studying history writ large. So one question that came to me as I was writing this piece was not what the true facts are, but what the right forms might be. This could be the question that animates Rankine’s Citizen, too, as it situates everyday moments of racism within its “American lyric,” and Terrance Hayes’s American Sonnets for My Past and Future Assassin, a sequence of 70 sonnets written immediately after the election of Donald Trump in 2016. To “touch” history might be to possess it, as a new national myth; it might be to touch it up, as an old national lie. But the process of learning to live in history, to deal with what happened honestly, is to embrace one’s own role—not in remaking the past, but in continuing to make a future.


“One of the best living American writers,” and a Clemson English alum

(Director’s note: A few years ago, Clemson English MA alum Ron Rash generously spoke at a celebration for the 50th anniversary of the Clemson MA in English program.  I remember him saying something about how, when he was a graduate student at Clemson, “people thought we were part of a cult, because we were always carrying books and reading.”  Well, reading is central to the Humanities, and it seems to have stood Ron in good stead.  In this review of his new book, Janet Maslin of the New York Times, the paper of record, describes Ron as “one of the best living American writers.”  This is Clemson Humanities Now.)

Still Made in China

(Director’s note: Maria Bose is assistant professor of media and cultural studies in the Clemson English department. Her current project is “Cinema’s Hegemony,” a cultural-materialist account of cinema’s primacy to an unfolding phase of Asia-led global political economy.) 

Midway through Age of Extinction (Paramount, 2014), Michael Bay’s fourth installment in the Transformers franchise’s live-action film series, robotics magnate Joshua Joyce (Stanley Tucci) mistakes amateur inventor Cade Yeager (Mark Wahlberg) and race car driver Shane Dyson (Jack Reynor) for “grease monkeys” in his Chicago facility. There, Joyce has recently authorized the Chinese manufacture of a fleet of Galvatrons: sentient, modular automobiles-cum-super-soldiers reverse-engineered from the “living metal” of which the Transformer robots are composed. Irked to discover two low-level technicians in his pristine showroom, Joyce challenges Yeager and Dyson to define the products on which, he believes, they are all at work. “What do you think it is that we make here?” the CEO asks. “We make poetry. We’re poets.” Later, Joyce learns that his outsourced Galvatrons lack “souls,” their corruption and insurgency implicated in Chinese assembly. Having modeled the robots on benevolent, democratic Autobot leader Optimus Prime, Joyce is baffled by their similarity to evil, authoritarian Decepticon leader Megatron. Faced with this discrepancy—between soulful poetry and corrupt machinery, American design and Chinese execution—Joyce’s metaphors begin to harden. “Simple coding. Algorithms! Math! Why can’t we make what we want to make, the way we want to make it?”

Age of Extinction’s lurking resentment at shifts in the global balance of power away from the US and toward China is nearly lost in the film’s explicit narrative of US-China collaboration. The American Joyce is volubly enamored of his Chinese associate Su Yueming (Li Bingbing), who defends him valiantly in the film’s final Hong Kong sequence. But read as a reflexive parable, Age of Extinction’s preoccupation with production’s shifting centers appears immanently self-theorizing and complexly historical, offering both an index of Bay’s frustrated “poetic” autonomy in co-producing the film with Jiaflix Enterprises and the China Movie Channel, and also a broad systemic critique of US hegemony’s decline in conjunction with a “rising China.” Age of Extinction opens with Yeager purchasing analog film projectors from a shuttered Texas movie theater where Optimus Prime lies in hiding, disguised as a semi-truck manufactured by the now-defunct American brand, Marmon. Walking past a poster for Howard Hawks’ El Dorado (Paramount, 1966), the theater owner laments the current state of cinema. “The movies nowadays, that’s the trouble,” the old man sighs. “Sequels and remakes, bunch of crap.” If Joyce articulates Bay’s anxiety that his “poetry” will, by dint of global media conglomeration, runaway production, and foreign-market pandering, be reduced to “crap,” then Yeager simultaneously locates the source of that anxiety in late-twentieth-century US deindustrialization and binds it to the film’s nostalgia for a contemporaneous era of American studio auteurism putatively immune to conglomerate oversight, new-media competition, and foreign-market pressure. Surprisingly, then, given the film’s outpouring of nationalist bombast, Age of Extinction is less neoliberal-imperial jingo than neo-protectionist post-imperial swansong, less a refutation of US hegemony’s unraveling than its strategic diagnosis. Age of Extinction’s moral and geopolitical lesson is that global industrialists like Joyce are ultimately responsible for disenfranchising hardworking Americans like Yeager—corporate hardliners who offshore jobs in the automobile sector to China and speed blue-collar American workers’ deskilling and redundancy in the advent of automation. The film is willing to risk these admissions—driven, in fact, to solicit a rigorous account of American declension and the breakdown of the liberal-democratic project—because they serve its corporate-existential stakes, which are, on the one hand, to align the fates of state and cinema by synchronizing its account of American decline with that of Hollywood’s progressive evanescence in an increasingly diversified global media complex and, on the other, to motivate that synchronization toward Bay’s and Paramount’s auteurist self-promotion.

Since Age of Extinction’s release, the US-China tension it registers has only escalated, fueled by a hawkish Trump administration bent on hobbling Chinese firms’ forays into European markets and on portraying Chinese authoritarianism as the single greatest threat to US democracy. Amid a prolonged trade war and an unprecedented global downturn triggered by the coronavirus pandemic, American CEOs like Joyce and Hollywood filmmakers like Bay have been castigated for ongoing collaboration with Beijing. The film’s underlying diagnosis of the “Chinese threat” and its nostalgic appeal to the competitive and autonomous US industrial sector foreclosed by deindustrialization are thus entirely familiar. Across party lines, presidential candidates have called for US manufacturers to maximize native production. “We’re bringing it back,” Trump said at a Ford plant in Michigan earlier this year. “The global pandemic has proven once and for all that to be a strong nation, America must be a manufacturing nation.” That statement captures both the promise of closing America’s gaping trade deficit with China and the threat of global supply chains’ overexposure in Chinese markets increasingly susceptible to rolling shutdowns and political oversight.

But despite Trump’s longing for a new “golden age” of industrialization and his apparent willingness to confront China on matters of trade—in less than two years Trump has imposed sanctions and tariffs on Chinese imports, revoked Hong Kong’s special trade status, pressured European allies to ostracize Chinese firms, and offered inducements to lure investors away from China—manufacturing job growth in America has barely kept pace with growth in the total workforce. At the same time, and in the recent course of the pandemic especially, China’s export shares have risen, a testament to the indispensability, competitiveness, and innovation of the nation’s industrial base. And while there is evidence that many firms plan to relocate some or all of their manufacturing facilities away from China, there is also widespread concern that manufacturing itself is swiftly becoming less labor intensive, as low-tech industries transition to the automated and robotic systems for which China is the largest market.

At a moment of reckoning for American manufacturing, and of reassessment of just-in-time global supply chains, calls to divest from “Factory Asia” open onto ideological debates about the fate of the global political order. But these debates carry significance less in terms of the overblown contest between “American democracy” and “Chinese authoritarianism” (pace Trump, or Bay) than in the exfoliation of far subtler processes of hegemonic rebalancing, involving America’s wary retreat from, and China’s equally wary advance upon, a vertiginous neoliberal system, as both nations confront the potential boons and perils of a return to—or the continuation of—an anachronistic state-led capitalism. More likely than serious geo-economic overhaul is the emergence of regional hegemonies ordered to suit the state’s immediate desire for market, political, and military supremacy (China seeks such supremacy in the Indo-Pacific; America plans to carve it out by further industrializing hubs in Mexico and Canada). Perhaps, then, it’s only a matter of time before we can “make what we want to make, the way we want to make it.” For now, it’s all still made in China.

The Humanities as meaning-making

(Director’s note: David Sweeney Coombs is associate professor and Director of Undergraduate Studies in the Clemson English Department. He is the author of Reading with the Senses in Victorian Literature and Science, published in 2019 by University of Virginia Press.  This is Clemson Humanities Now.)

“A dreadful Plague in London was,
In the Year Sixty Five,
Which swept an Hundred Thousand Souls
Away; yet I alive!”

These lines appear as the conclusion of Daniel Defoe’s Journal of the Plague Year, appended as a kind of epigram on the record of the fictitious author H. F.’s impressions of the bubonic plague epidemic in London in 1665. “Yet I alive!”: at once minimized in comparison to the preceding lines and the scale of the disaster they invoke and yet granted the importance of serving as the quatrain’s parting words, this concluding expression of baffled triumph and gratitude distills the confused sense of H. F. throughout the rest of the narrative that his survival is thanks to an inextricable combination of prudence, providence, and sheer dumb luck. Even though this is the end of the line for the novel, and this epigram might also be justly described as, in a sense, an epitaph, a question hangs over H. F.’s affirmation of having outlived the plague: I survived. Why, and what do I do now?

It is, of course, too early in the Covid-19 pandemic for us to ask ourselves this question in all earnestness. As I write, case-counts in South Carolina have been rising rapidly for several weeks and Clemson has just announced that it will delay bringing students back to campus until September. But if we can’t yet congratulate ourselves on being survivors, reading Defoe’s novel with the pandemic ongoing is nonetheless an occasion to ask what it means to survive and what survival might look like. Covid-19 has given that question a new urgency for us all as individuals, but it has also given a sharper ring to a version of that question that had already begun being sounded in American universities in the last few years even before the upheavals created by the novel coronavirus: can the academic humanities survive?

Enrollments in the academic humanities have fallen dramatically since the 2008 Financial Crisis. Nationally, the declines have been particularly sharp in the core disciplines that together represent the Humanities at Clemson: English, Languages, Philosophy and Religion, and History. The number of Bachelor’s Degrees awarded in those fields at U.S. universities declined by 17 percent from 2012 to 2015 (according to the American Academy of Arts and Sciences), with the downward trend continuing afterwards. I am now the Director of Undergraduate Studies for the English Department at Clemson, where the number of English majors declined by approximately 25 percent between 2011 and 2018.

The proximate cause of these drops is what Christopher Newfield in The Great Mistake has analyzed as the devolutionary cycle of privatization in American public universities, intensified by the long recession and fiscal austerity after 2008. As Clemson, like other public universities, has become locked into a cycle of slashed state funding, tuition hikes, and attempts to raise revenues and cut costs through partnering with private-sector sponsors, more and more students have entered the university staggering under the pressure to monetize their studies so as to compete for the well-paid jobs requiring a bachelor’s degree. Under that burden, a major in the humanities looks like a risk, or worse, a luxury. While there were some signs that enrollments in the humanities were ticking back up in the last two years, the economic fallout from the pandemic might wipe out those small gains and then some.

One task for humanities scholars at this moment is to take every opportunity to reiterate that the retreat from the humanities in search of supposedly more marketable majors rests on a popular misconception that humanities degrees don’t translate into good jobs. The economic value of Humanities BAs is in fact broadly similar to that of degrees in Business and STEM fields like Biology. A 2018 study found that an average 25-29 year-old Business major earns about $50,000 annually compared to the $45,000 of an average English or History major of the same age, but that Business major is slightly more likely to be unemployed than the History major. When students in one of Clemson colleague Tharon Howard’s courses conducted a survey last year, they found that 90 percent of graduates from Clemson English had found jobs within a year of graduation (and more than 60 percent of them did so within the first three months of graduating). While the coronavirus pandemic is currently destabilizing labor markets in new ways, a humanities degree is likely to continue serving our students’ career prospects well once the recovery is underway. If they were judged by their actual return on investment, as opposed to the caricature of the unemployed Philosophy major, the academic humanities could survive.

But do the humanities deserve to survive? Do they have value that can’t be reduced to the market? Considered intellectually and ethically, what elements of prudence or providence justify their survival as areas of study within the university? The humanities are difficult to discuss, let alone define, in a unified way, but I accept Helen Small’s working definition from The Value of the Humanities:

The humanities study the meaning-making practices of human culture, past and present, focusing on interpretation and critical evaluation, primarily in terms of the individual response and with an ineliminable element of subjectivity.

This definition is perhaps recognizably the work of an English professor, and some historians might not necessarily identify “the meaning-making practices of human culture” as their sole or primary object of study. At the risk of disciplinary chauvinism, though, Small’s placing practices of meaning-making at the center of what we do in the humanities strikes me as capturing something important about them.

When Coluccio Salutati articulated the first modern version of the studia humanitatis in the fourteenth century, his basic criterion for each mode of inquiry’s inclusion was a commitment to studying languages in order to understand the human. The modern humanities, that is, study meaning-making with the aim of self-understanding. That aim might be more or less implicit in practice, but it is the humanities’ first and most important justification. While Small’s analysis is too cool to veer into anything with the potential for such embarrassing earnestness, it cannot but hint at one of the tacit promises we make our students: by teaching you how meaning is made, the humanities will also help you to cultivate the interpretive and critical skills that can make your own life more meaningful.

Seen along these lines, the humanities are vulnerable to the degree that their ineliminable element of subjectivity can metastasize, or be perceived to have metastasized. The charge that the humanities have turned away from the world and become a mere solipsistic exercise in personal enrichment is a recurring feature of sharp-elbowed public debates about their relationship with the sciences in particular. Some version of that claim is central to the Matthew Arnold-T.H. Huxley debates about education in the nineteenth century, the C. P. Snow-F. R. Leavis debate about the so-called two cultures of the sciences and the humanities in the 1950s and 1960s, and the Sokal Hoax and the debates it occasioned about post-structuralism and social construction in the 1990s. What tends to be obscured by these debates is that self-understanding, pursued doggedly enough, leads inevitably back to the world of which that self is a part. If the humanities without the sciences (or the social sciences) risk solipsism, their study of meaning-making practices can teach the sciences about the terms they use to study the world. An explicit critical engagement with those terms is in fact the very reason-for-being for multiple humanities sub-disciplines: intellectual history, literature and medicine, the history and philosophy of science, and so on.

The spectacular failures of the U.S. health system in the face of Covid-19 underscore the necessity of attending to that humanities scholarship in understanding what has gone so very wrong here and how we might fix it. The pandemic has given us a searchingly thorough diagnosis of our society: we live in a country riven by stark economic and racial inequality, its state capacity for anything but administering violence hollowed out by decades of disinvestment. In this respect, our world is not so very different from the London of 1665 presented in Defoe’s narrative. Published in 1722, amid the real possibility that another plague epidemic might spread from Marseilles to London, Journal of the Plague Year is an attempt by Defoe to persuade the public to support some of the controversial quarantine measures adopted by the Walpole government (which seems to have been paying him at the time). To accomplish this aim, Defoe put the extensive research into the 1665 plague that he, the real author, had done into the mouth of a fictional author imagined as living through the plague. The result is a strange hybrid of public health reporting, history, and fiction, one that, as Margaret Healy puts it, yields “a complex texture of interwoven relationships between the subjective and the social, the private and the collective—the kernel of any novel.” We might also identify this particular kind of interweaving as something like the kernel of the humanities. Let’s hope they survive.

Remembering Chris Dickey

(Director’s note: In the summer of 2008, I had just arrived at my destination in Istanbul, connected to wifi, checked my email, and saw that I had received a message from Will Cathcart, Clemson English, Class of 2005.  In it, Will rightly predicted that Russia was about to invade the country of Georgia, and announced that he was boarding a plane to go interview the then President of Georgia, Mikheil Saakashvili.  The invasion announced the arrival of a multipolar world, and established a pattern Russia would repeat in Crimea, Ukraine, and Syria.  Will, who has lived in and reported from Georgia ever since, was inducted into the Clemson Alumni Council’s Roaring 10 in 2014.  His reporting can be seen in CNN, Foreign Policy, and, most frequently it seems to me, The Daily Beast.  There, his editor was “renowned foreign correspondent” Christopher Dickey, whose father, James Dickey, author of Deliverance, was a student at Clemson for his first year of college, before serving in World War II (and later graduating from USC).  Christopher Dickey, a regular visitor to and speaker at Clemson over the years, passed away on July 16th.  In a new post on Lit Hub, Will reflects on the loss of a friend, an advocate, and an editor.  This is Clemson Humanities Now.)

The Earthquake of 1755 — An “extraordinary world event” (Goethe 1810)

(Director’s note: Johannes Schmidt teaches German in the Clemson Languages Department, where he will offer a course on “Tolerance in the Eighteenth Century” (in German) in the fall of 2020 and one on Hannah Arendt and Totalitarianism (in English) in the spring of 2021.  This is Clemson Humanities Now.)

On November 1, 1755, an earthquake off the coast of Lisbon struck and destroyed a large portion of the city; aided by subsequent fires and a massive tsunami, it claimed between 30,000 and 100,000 lives (of about 275,000 residents).

It disturbed all aspects of life. Economically, the destruction of one of the most important European port cities was felt in all places of trade and commerce. The news of the earthquake reached London within a day; newspapers in Hamburg, my own hometown, reported it only a few days later. Tensions in colonial politics were magnified; the outbreak of the Seven Years War (the French and Indian War) may have been hastened by an increased desire for control of trade with the New World.

Art and literature took note, and so did the Enlightenment in general, reconsidering the Leibnizian philosophy of optimism. In his Candide, ou l’Optimisme, Voltaire famously makes fun of “the best of all worlds.” Johann Gottfried Herder—who together with Goethe coined the term zeitgeist (spirit of the age)—reprimanded the “philosophes” for hubris; they had falsely proclaimed a golden age that advanced beyond every other historical era and was above every other culture: “‘In Europe now, there is supposed to be more virtue than ever, [than] everywhere in the world has been?’ And why? because there is more enlightenment in the world – I believe that just for that reason it has to be less” (1774).

Faith in Enlightenment made philosophy more proud of the knowledge it had gained than aware of that which we cannot control in nature. Unsurprisingly, neither knowledge nor reason predicted this catastrophe. To this day, earthquake prediction is plagued by uncertainty—much to the vexation of the Italian government over the botched warning before the devastating earthquake in the Abruzzo region in 2009. The unpreparedness and our unenlightened–irrational, at times even heartless–response to the COVID-19 pandemic seem to be grounded in the same misconceptions about nature and knowledge that many European thinkers held before 1755.

Goethe’s observation about the earthquake stands out to me today: “Perhaps, the daemon of terror had never before spread as speedily and powerfully over the world.” As a six-year-old, he was terrified, despite being more than 1,500 miles away. In Georg Philipp Telemann’s “Ode of Thunder” (TWV 6:3a-b, 1756 [link to the entire piece: https://www.youtube.com/watch?v=CDW19YqDq64]), we can hear the horrific feeling the existence of an “angry God” must have evoked; we can perceive the young Goethe’s fear of the Hebrew Bible’s “wrathful deity” that could not be conciliated. Listen today to the frantic fifth and sixth movements of the Donnerode, entitled “The voice of God shatters the cedars!” [https://youtu.be/7ZUUBmol_HE] and “[The voice of God] makes the proud mountains collapse!” [https://youtu.be/4D-u6VslLlg]. Our anxieties connect us to the desolation people felt in the aftermath of 1755; one might hear the aftershocks in the fifth movement, or people fleeing from proud collapsing palaces in the sixth.

It was Kant who—having already studied the physics of the earth’s crust—reminded us of the need for rational-scientific investigation into disastrous occurrences:

“Great events that affect the fate of all mankind rightly arouse that commendable curiosity, which is stimulated by all that is extraordinary and typically looks into the causes of such events. In such cases, the natural philosopher’s obligation to the public is to give an account of the insights yielded by observation and investigation”, he writes in one of his three studies on earthquakes (1756).

Much needed was his proclamation that the event dictated a clear distinction between moral evil and natural disaster: since all natural occurrences are subject to physical laws, a supernatural cause (divine punishment) for natural disasters must be rejected.

Epidemiologically, the coronavirus does not discriminate. Neither did the earthquake spare the righteous more than the corrupt. Yet both events show the drastic inequalities brought forth when calamity strikes. While the earthquake affected the rich and the poor alike, the pious and the heretic equally, the nobility and the outlawed in similar ways, the fates of those who survived were very different. The king of Portugal—who spent the holiday away from Lisbon—never set foot in the city again; the entire royal family relocated to palaces elsewhere. A king’s luxury is mirrored today: the affluent fled the cities hit hard by the pandemic and retreated to their summer homes. In the end, viruses and earthquakes do discriminate.

One of the core ideas of Kant’s philosophy of ethics is the importance of human dignity: the incalculable worth of a human being’s life, which is beyond any form of measurement. Free human existence is fundamental. Kant makes perfectly clear that the categorical autonomy of all human beings is absolute and independent of any phenomenological or physiological distinction. This imperative is as valid today as tomorrow. (Kant’s Eurocentric thinking is certainly disturbing and his belief in Western civilization’s supremacy contradicts the inclusiveness that may be suggested in this kind of universal call for autonomy of all peoples and races.) Still, I read this today as an affirmation that there cannot be any material or any other justification for the loss of even a single life.

Protest against racism and inequality, as well as solidarity with all whose human dignity has been and continues to be violated, is not only legitimate but also necessary at all times, especially during natural and manmade disasters. Even more so, for us in privileged positions, it is our responsibility.

Italy, COVID-19

(Director’s Note: Seeing Covid-19 hit Italy early, and hard, I wrote to a colleague in the Department of Languages, Roberto Risso, for his sense of the situation, below.  Dr. Risso was born and raised in Turin, Italy, where he lived until eleven years ago.  After relocating to the US, he has lived in Wisconsin, Maine, and South Carolina.  He now lives in the Clemson area, where he loves gardening and reading.  He is currently an Assistant Professor of Italian Studies at Clemson University.  This is Clemson Humanities Now.)

The planetary dimension of the catastrophe became clear to the Belpaese quite early, when a European citizen was said to have brought COVID-19 to Italy, a virus that had originated in China and was spreading fast. Italy has a special bond with China, a mutual fascination that dates back to Marco Polo’s travels and to his famous book. The so-called new Silk Road has a relevance that should never be underestimated in today’s world. Billions of euros’ worth of commerce and almost half a million Chinese people living in Italy reinforce this bond. And yet anti-Chinese xenophobia hit Italy at the beginning of the pandemic in February 2020: hard-to-watch videos surfaced of Chinese people being insulted and beaten on the streets. Italians were scared, angry, unhappy, very unprepared, but mostly had the decency to admit it from day one.

The virus hit Italy hard from the very beginning. In a densely populated territory with areas that are among the most polluted in Europe, by March and April the North of Italy was a hotspot, and hundreds of victims died daily, especially in the Lombardy region, the economic heart of the country. The harrowing images of military trucks transporting coffins out of the Lombard city of Bergamo became iconic of the disaster that was taking place under the Tuscan (and Lombard) sun. As a result, new emergency laws were introduced and reinforced: total lockdown for more than a month, and after that compulsory masks everywhere. Italians obeyed, and thank God they did. And it worked, indeed. By the end of April and into May the situation was better: from hundreds of deaths daily to a few dozen, then fewer still. But as attention relaxed, a new wave came in mid-June, and the threat of a new nationwide lockdown convinced people to be more careful.

Is it possible to conceive of Italy with no tourists? No, of course not. But without planes, and with borders sealed, no tourists came, and the only visitors were the boatloads of refugees who came by sea. Singing from balconies, memes, online feuds, political chaos were nothing but distractions; from the beginning of the Second Republic on, politicians have been clowns. Since Italy has always been a divided and diverse country, the pandemic did not unite it; Italians did not become any better.  Contrary to what was said during the lockdown—“Ne usciremo migliori e più forti,” or ‘we will come out of this better and stronger’—that did not really happen, unfortunately. What really happened is that, owing to the internal migration of the past, at the beginning of the pandemic many southerners living in the North fled the closing cities and brought the virus with them. Italy also has the oldest population in Europe, the second oldest in the world after Japan: the “nonni,” grandpas and grandmas, died like flies, especially in the nursing homes, where, the suspicion is, nothing was done to avoid contagion. On the contrary, people say (and please remember: vox populi, vox Dei) that the virus was used to get rid of many of them. It is so atrocious that it could be true, in a country no stranger to atrocities.

As of July 4th, 2020, the contagion index is anything but uniform: it has increased since June, but many central and southern regions are free of Covid-19, and not for lack of testing. Lombardy, where the virus has claimed so many lives in the past, has an index of 0.89, which is good, although the nation’s case total as of the beginning of July 2020 stands at 241,419, which is scary. In four months, twenty thousand jobs have been lost for good.

As of July 1st, citizens from the USA, Brazil, South Africa, Russia, Turkey, India, Saudi Arabia, and Mexico have been blocked from entering the country. Chinese nationals have not been blocked yet. Under these conditions it is very hard to foresee the damage to the tourist industry. We will see what the European Union will do about it, since its priorities usually seem to be the well-being of German banks and not the hardships of citizens. What I know for sure, despite all the terrible news and the scary data, is that Italians will move forward, despite the ten thousand babies who will not be born in 2021, in a country whose birthrate is already at level zero.

Moving forward and rebuilding after catastrophes is what Italians do best.