Humanities Hub

How do we do research in the humanities now?

(Director’s note: Gabriel Hankins teaches English literature and digital humanities in the Department of English at Clemson University. His research interests include modernism and digital literary studies. His book, Interwar Modernism and Liberal World Order, has been published by Cambridge University Press. He is co-editor of a new series on Digital Literary Studies, also published by Cambridge University Press. As he and I corresponded this summer about online teaching tools suitable for close reading and group discussions, I asked him if he might contribute to this series. He said yes, and this is the result. This is Clemson Humanities Now.)

How do we do research in the humanities now?

Once, this might have been the title of a quietly confident survey of changing research practices in the humanities, some gradual, some sudden, written with the assurance that the familiar pathways and byways of humanities research remained largely unchanged. That confidence in the stability and continuity of the present has ended. Now the question has a slight tone of desperation: how are we, in the conditions imposed by a viral pandemic, going to do our research? Particularly in historical fields built around books, libraries, and archives, not to mention conferences, the conditions of our work seem to have changed utterly.

My response to the titular question is contradictory, like life in the present crisis, but I hope we can learn something by thinking through the contradictions.

We are all digital researchers now.

Within the digital humanities community, this is a familiar claim made by Lincoln Mullen, among others. I have refined a version of the claim for modernist studies in particular, in which I argue that literary fields are now contiguous with “digital humanities” methods and approaches along all their fractal boundaries. But, as I conclude in that essay, if we are all digital humanists now, we don’t yet know it. In the time of coronavirus, that knowledge is inescapable. None of our teaching, research, or service can escape a confrontation with digital mediation and its consequences. How we confront the digital mediation of our fields cannot be predetermined: in some modes, the dialectic of the digital and humanist plays out in terms of conflict, resistance, or refusal.

The middle matters: digital mediation is a problem of proximity, distance, and transmission.

Since T. S. Eliot at least, literary studies has seen its problem as that of positioning the individual experience in relation to inherited traditions, cultures, and texts, even if those texts and cultures are “inherited” from our contemporaries. Our method of constituting our traditions has everything to do with the historical moment we are in: Eliot’s particular mode of historical thinking arose out of a crisis in Western political order, a chaotic war, and a viral pandemic that recalls our own. At first the pandemic might seem to separate us from the archives, libraries, and conferences that formed the structure of research in the humanities, as it separates us from a livable past and a thinkable present. Digital mediation of those resources is experienced as distance and remove from the original. But in several senses digital archives and methods bring us unbearably close to our objects and interests, even as they always transmit, mediate, and thereby omit something of the original. All of Shakespeare is now available to the panoptic queries of the researcher; even more unbearably, the entire corpus of eighteenth-century Anglophone fiction, or nineteenth-century women novelists, or detective fictions of the 1930s. Certain archives, too, are available in digital form as they never were before – I am thinking here of queer fanfictions, but also the letters of Robert Southey, uniquely captured in digital form. The problem of researching these materials opens onto the more general problem of a self-reflective research practice, in which the modes and materials by which we do our work do not simply disappear into the background.

Digital research has to begin and end with the questions, objects, and interests of the humanities.

This may seem obvious, but it has not been clear for some generations of digital researchers. Our research questions and interests must rise out of the vibrant cultures, languages, traditions, and materials that inspire our work. We can and must think through the implications of our current digital condition, and the ways in which digital mediations of all kinds shape the questions we can ask and the canons we consider. But our questions should remain framed by our attachment to the primary questions of the humanities: what are the ways in which we make meaning, how is that meaning grounded in a history, a time and a place, a specific culture? How does a material history, a technological moment, a set of canonical texts, determine and enliven our response to a given textual and historical situation? What do we want our always-already digital research to transmit from the past to a more human future? In answering these questions, we move from the anxiety of a crisis-ridden present to our grounding in a meaningful past and our work towards a livable future.

Living in History

(Director’s note: Walt Hunter, author of Forms of a World: Contemporary Poetry and the Making of Globalization (Fordham UP, 2019) and co-translator, with Lindsay Turner, of Frédéric Neyrat’s Atopias: Manifesto for a Radical Existentialism (Fordham UP, 2017), is an associate professor of World Literature and associate chair of Clemson English.  Current events have him thinking about this moment in history, our moment in history, and our relationship to it.)

Social scientists and economists have singled out 1973 as a pivotal year in the history of global capitalism—the end of the post-war, Bretton Woods model of “embedded liberalism” and the beginning of the “long downturn”—but could the humanities add something new to this narrative? I’m interested in the cultural shifts in literature and art in the early 1970s that coincide with the end of what the French call the trente glorieuses, the years before the current period of globalization and of dramatically increased inequality.

One of the relatively minor, and yet, for me, ultimately illuminating events of 1973 is the publication of a book of sonnets by Robert Lowell called History (1973). In 1969 Lowell had published many of these sonnets in a collection called Notebook (1969) that, four years and two editions later, became History. “I am learning to live in history,” Lowell writes in the second edition of the fifth poem of a twelve-part sonnet sequence called “Mexico.” “What is history? What you cannot touch.”

Lowell’s reflection in this particular poem is prompted by the passing of the end of the day: the “full sun,” the “silhouetting sunset,” then the “undimmed lights of the passing car.” Typical of Lowell’s sonnets, the intimacy of personal experience occasions a reflection on large-scale processes: though he’s often received as a confessional poet, Lowell’s style in fact reflects his globalized present. The question “what is history?” comes to life as part of a larger American cultural, historical, political, and economic predicament in the early 70s.

Admittedly, the sonnet is probably not the first cultural form that springs to mind when we think about the late 1960s and early 1970s. 1973 was not a marquee year for the sonnet—though it was for Gravity’s Rainbow, Dark Side of the Moon, “Killing Me Softly,” The Exorcist, “Love Is a Hurtin’ Thing,” Badlands, Linda Ronstadt, and “Goodbye Yellow Brick Road.” This list has very little in common except a vague sense that something was over and it was now, as Joan Didion writes, the “morning after the 60s.” Americans in the early 1970s suspect that time is out of joint, that their lives fit into their country and their country fits in the world in a different way.

I take Lowell’s comment seriously: what is it to “learn to live in history” and how can a sonnet, of all things, help do that? Whose history is he talking about, this white scion of a patrician New England family? Why is history “what you cannot touch”?

One of the sonnet’s oldest motifs is noli me tangere—literally, “don’t touch me”—which appears in Thomas Wyatt’s mid-sixteenth-century sonnet “Whoso List to Hunt.” The sonnet’s noli me tangere, a phrase Wyatt borrows from the Latin Vulgate, is figured by Wyatt and later sonneteers as the gendered body. Its perspective is one of suspended or thwarted desire—usually, though not always, rooted in a male gaze. The sonnet man is probably a close cousin to the “longing man,” as Merve Emre describes this enduringly odious figure.

But in Notebook, history becomes the object of the sonnet instead. In this way, Lowell speaks quite clearly to the present, his and ours. Living with the “I” in history remains, perhaps, one of the most intractable questions for the United States in 2020. Later, Claudia Rankine will write this through in a different way, while nodding at Lowell: “drag that first person out of the social death of history,” she writes in Citizen: An American Lyric, “then we’re kin.”

The sonnet is not a form in which history lives comfortably. It’s a pretty room or a narrow cell, the place where poets argue with themselves. The schematic grace of the sonnet allows for two positions to be reconciled, or to be held in tension. And the sonnet’s closure, if it comes, is not the closure of an event, but the closure of logic or rhyme. When something happens in the sonnet, it is often a product of the energy created by the poem’s own forces, married to the poet’s thoughts and feelings. While the sonnet sequence might elaborate the emotional conditions in which an “I” is forged, we leave it to Tasso’s ottava rima stanza or Milton’s blank verse to provide the historical sweep.

So when Lowell writes a sonnet, he already knows that it’s not going to accommodate historical narratives very easily—which makes the sonnet’s aversion to history perfectly compatible with its history of adversion. But there’s also something about history itself which, for Lowell, is untouchable, even though he must learn to live in it.

Writing a sonnet can require some grappling with the long and cumbrous European history of the poetic form itself. Traveling from Italian to English, the sonnet loosens up dramatically, since English lacks Italian’s abundance of rhymes. Sonnets by Wyatt, Wroth, Wordsworth, Meredith, Frost, McKay, and Brooks are closer to the language as it’s spoken than sonnets by Dante or Petrarch. The sonnet always arrives, in other words, with a lot of baggage—not the least of which is its co-emergence with early modern capitalism and colonialism in Europe.

Lowell struggles with the sonnet’s grandiosity, flattening its music out and grounding its images in material from “the abundance of reality.” He puts it this way at the end of Notebook (1970):

My meter, fourteen-line unrhymed blank verse sections, is fairly strict at first and elsewhere, but often corrupts in single lines to the freedom of prose. Even with this license, I fear I have failed to avoid the themes and gigantism of the sonnet.

I have been thinking so far about the sonnet as a way of dramatizing Lowell’s discomfort with making history while learning to live in it. This is a predicament that, I think, is only more pressing for US artists today. Given the US influence in remaking the world within its own global framework, how can one shape that history from a position that is effectively inside it, complicit with it, part of its continuation? For Lowell, the sonnet, with its own elaborate history in poetry, becomes a convenient vehicle on which the poet can displace his anxiety about his position and privilege in American history.

For the humanities, grappling with the history of poetry can be an effective way of studying history writ large. So one question that came to me as I was writing this piece was not what the true facts are, but what the right forms might be. This could be the question that animates Rankine’s Citizen, too, as it situates everyday moments of racism within its “American lyric,” and Terrance Hayes’s American Sonnets for My Past and Future Assassin, a sequence of 70 sonnets written immediately after the election of Donald Trump in 2016. To “touch” history might be to possess it, as a new national myth; it might be to touch it up, as an old national lie. But the process of learning to live in history, to deal with what happened honestly, is to embrace one’s own role—not in remaking the past, but in continuing to make a future.

“One of the best living American writers,” and a Clemson English alum

(Director’s note: A few years ago, Clemson English MA alum Ron Rash generously spoke at a celebration for the 50th anniversary of the Clemson MA in English program. I remember him saying something about how when he was a graduate student at Clemson, “people thought we were part of a cult, because we were always carrying books and reading.” Well, reading is central to the Humanities, and it seems to have stood Ron in good stead. In this review of his new book, Janet Maslin of the New York Times, the paper of record, describes Ron as “one of the best living American writers.” This is Clemson Humanities Now.)

Still Made in China

(Director’s note: Maria Bose is assistant professor of media and cultural studies in the Clemson English department. Her current project is “Cinema’s Hegemony,” a cultural-materialist account of cinema’s primacy to an unfolding phase of Asia-led global political economy.) 

Midway through Age of Extinction (Paramount, 2014), Michael Bay’s fourth installment in the Transformers franchise’s live-action film series, robotics magnate Joshua Joyce (Stanley Tucci) mistakes amateur inventor Cade Yeager (Mark Wahlberg) and race car driver Shane Dyson (Jack Reynor) for “grease monkeys” in his Chicago facility. There, Joyce has recently authorized the Chinese manufacture of a fleet of Galvatrons: sentient, modular automobiles-cum-super-soldiers reverse engineered from the “living metal” of which the Transformer robots are composed. Irked to discover two low-level technicians in his pristine showroom, Joyce challenges Yeager and Dyson to define the products on which, he believes, they are all at work. “What do you think it is that we make here?” the CEO asks. “We make poetry. We’re poets.” Later, Joyce learns that his outsourced Galvatrons lack “souls,” their corruption and insurgency implicated in Chinese assembly. Having modeled the robots on benevolent, democratic Autobot leader Optimus Prime, Joyce is baffled by their similarity to evil, authoritarian Decepticon leader Megatron. Faced with this discrepancy—between soulful poetry and corrupt machinery, American design and Chinese execution—Joyce’s metaphors begin to harden. “Simple coding. Algorithms! Math! Why can’t we make what we want to make, the way we want to make it?”

Age of Extinction’s lurking resentment at shifts in the global balance of power away from the US and toward China is nearly lost in the film’s explicit narrative of US-China collaboration. The American Joyce is volubly enamored of his Chinese associate Su Yueming (Li Bingbing), who defends him valiantly in the film’s final Hong Kong sequence. But read as a reflexive parable, Age of Extinction’s preoccupation with production’s shifting centers appears immanently self-theorizing and complexly historical, offering both an index of Bay’s frustrated “poetic” autonomy in co-producing the film with Jiaflix Enterprises and the China Movie Channel, and also a broad systemic critique of US hegemony’s decline in conjunction with a “rising China.” Age of Extinction opens with Yeager purchasing analog film projectors from a shuttered Texas movie theater where Optimus Prime lies in hiding, disguised as a semi-truck manufactured by the now-defunct American brand, Marmon. Walking past a poster for Howard Hawks’ El Dorado (Paramount, 1966), the theater owner laments the current state of cinema. “The movies nowadays, that’s the trouble,” the old man sighs. “Sequels and remakes, bunch of crap.” If Joyce articulates Bay’s anxiety that his “poetry” will, by dint of global media conglomeration, runaway production, and foreign-market pandering, be reduced to “crap,” then Yeager simultaneously locates the source of that anxiety in late-twentieth-century US deindustrialization and binds it to the film’s nostalgia for a contemporaneous era of American studio auteurism putatively immune to conglomerate oversight, new-media competition, and foreign-market pressure. Surprisingly, then, given the film’s outpouring of nationalist bombast, Age of Extinction is less neoliberal-imperial jingo than neo-protectionist post-imperial swansong, less a refutation of US hegemony’s unraveling than its strategic diagnosis. 
Age of Extinction’s moral and geopolitical lesson is that global industrialists like Joyce are ultimately responsible for disenfranchising hardworking Americans like Yeager—corporate hardliners who offshore jobs in the automobile sector to China and speed blue-collar American workers’ deskilling and redundancy in the advent of automation. The film is willing to risk these admissions—driven, in fact, to solicit a rigorous account of American declension and the breakdown of the liberal-democratic project—because they serve its corporate-existential stakes, which are, on the one hand, to align the fates of state and cinema by synchronizing its account of American decline with that of Hollywood’s progressive evanescence in an increasingly diversified global media complex and, on the other, to motivate that synchronization toward Bay’s and Paramount’s auteurist self-promotion.

Since Age of Extinction’s release, the US-China tension it registers has only escalated, fueled by a hawkish Trump administration bent on hobbling Chinese firms’ forays into European markets and on portraying Chinese authoritarianism as the single greatest threat to US democracy. Amid a prolonged trade war and an unprecedented global downturn triggered by the coronavirus pandemic, American CEOs like Joyce and Hollywood filmmakers like Bay have been castigated for ongoing collaboration with Beijing. The film’s underlying diagnosis of the “Chinese threat” and its nostalgic appeal to the competitive and autonomous US industrial sector foreclosed by deindustrialization are thus entirely familiar. Across party lines, presidential candidates have called for US manufacturers to maximize native production. “We’re bringing it back,” Trump said at a Ford plant in Michigan earlier this year. “The global pandemic has proven once and for all that to be a strong nation, America must be a manufacturing nation.” That statement captures both the promise of closing America’s gaping trade deficit with China and the threat of global supply chains’ overexposure in Chinese markets increasingly susceptible to rolling shutdowns and political oversight.

But despite Trump’s longing for a new “golden age” of industrialization and his apparent willingness to confront China on matters of trade—in less than two years Trump has imposed sanctions and tariffs on Chinese imports, revoked Hong Kong’s special trade status, pressured European allies to ostracize Chinese firms, and offered inducements to lure investors away from China—manufacturing job-growth in America has barely kept pace with the total workforce. At the same time, and in the recent course of the pandemic especially, China’s export shares have risen, a testament to the indispensability, competitiveness, and innovation of the nation’s industrial base. And while there is evidence that many firms plan to relocate some or all of their manufacturing facilities away from China, there is also widespread concern that manufacturing itself is swiftly becoming less labor intensive, as low-tech industries transition to the automated and robotic systems for which China is the largest market.

At a moment of reckoning for American manufacturing, and the reassessment of just-in-time global supply chains, then, calls to divest from “Factory Asia” open onto ideological debates about the fate of the global political order. But these debates carry significance less in terms of the overblown contest between “American democracy” and “Chinese authoritarianism” (pace Trump, or Bay). Rather, their significance lies in the exfoliation of far subtler processes of hegemonic rebalancing involving America’s wary retreat from, and China’s equally wary advance upon, a vertiginous neoliberal system, as both nations confront the potential boons and perils of a return to—or the continuation of—an anachronistic state-led capitalism. More likely than serious geo-economic overhaul is the emergence of regional hegemonies ordered to suit the state’s immediate desire for market, political, and military supremacy (China seeks such supremacy in the Indo-Pacific; America plans to carve it out by further industrializing hubs in Mexico and Canada). Perhaps, then, it’s only a matter of time before we can “make what we want to make, the way we want to make it.” For now, it’s all still made in China.

The Humanities as meaning-making

(Director’s note: David Sweeney Coombs is associate professor and Director of Undergraduate Studies in the Clemson English Department. He is the author of Reading with the Senses in Victorian Literature and Science, published in 2019 by University of Virginia Press.  This is Clemson Humanities Now.)

“A dreadful Plague in London was,

In the Year Sixty Five

Which swept an Hundred Thousand Souls

Away; yet I alive!”

These lines appear as the conclusion of Daniel Defoe’s Journal of the Plague Year, appended as a kind of epigram on the record of the fictitious author H. F.’s impressions of the bubonic plague epidemic in London in 1665. “Yet I alive!”: at once minimized in comparison to the preceding lines and the scale of the disaster they invoke and yet granted the importance of serving as the quatrain’s parting words, this concluding expression of baffled triumph and gratitude distills the confused sense of H. F. throughout the rest of the narrative that his survival is thanks to an inextricable combination of prudence, providence, and sheer dumb luck. Even though this is the end of the line for the novel, and this epigram might also be justly described as, in a sense, an epitaph, a question hangs over H. F.’s affirmation of having outlived the plague: I survived. Why, and what do I do now?

It is, of course, too early in the Covid-19 pandemic for us to ask ourselves this question in all earnestness. As I write, case-counts in South Carolina have been rising rapidly for several weeks and Clemson has just announced that it will delay bringing students back to campus until September. But if we can’t yet congratulate ourselves on being survivors, reading Defoe’s novel with the pandemic ongoing is nonetheless an occasion to ask what it means to survive and what survival might look like. Covid-19 has given that question a new urgency for us all as individuals, but it has also given a sharper ring to a version of that question that had already begun being sounded in American universities in the last few years even before the upheavals created by the novel coronavirus: can the academic humanities survive?

Enrollments in the academic humanities have fallen dramatically since the 2008 Financial Crisis. Nationally, the declines have been particularly sharp in the core disciplines that together represent the Humanities at Clemson: English, Languages, Philosophy and Religion, and History. The number of Bachelor’s Degrees awarded in those fields at U.S. universities declined by 17 percent between 2012 and 2015 (according to the American Academy of Arts and Sciences), with the downward trend continuing afterwards. I am now the Director of Undergraduate Studies for the English Department at Clemson, where the number of English majors declined by approximately 25 percent between 2011 and 2018.

The proximate cause of these drops is what Christopher Newfield in The Great Mistake has analyzed as the devolutionary cycle of privatization in American public universities, intensified by the long recession and fiscal austerity after 2008. As Clemson, like other public universities, has become locked into a cycle of slashed state funding, tuition hikes, and attempts to raise revenues and cut costs through partnering with private sector sponsors, more and more students have entered the university staggering under the pressure to monetize their studies so as to compete for the well-paid jobs requiring a bachelor’s degree. Under that burden, a major in the humanities looks like a risk, or worse, a luxury. While there were some signs that enrollments in the humanities were ticking back up in the last two years, the economic fallout from the pandemic might wipe out those small gains and then some.

One task for humanities scholars at this moment is to take every opportunity to reiterate that the retreat from the humanities in search of supposedly more marketable majors rests on a popular misconception that humanities degrees don’t translate into good jobs. The economic value of Humanities BAs is in fact broadly similar to that of degrees in Business and STEM fields like Biology. A 2018 study found that an average 25-29 year-old Business major earns about $50,000 annually compared to the $45,000 of an average English or History major of the same age, but that Business major is slightly more likely to be unemployed than the History major. When students in one of Clemson colleague Tharon Howard’s courses conducted a survey last year, they found that 90 percent of graduates from Clemson English had found jobs within a year of graduation (and more than 60 percent of them did so within the first three months of graduating). While the coronavirus pandemic is currently destabilizing labor markets in new ways, a humanities degree is likely to continue serving our students’ career prospects well once the recovery is underway. If they were judged by their actual return on investment, as opposed to the caricature of the unemployed Philosophy major, the academic humanities could survive.

But do the humanities deserve to survive? Do they have value that can’t be reduced to the market? Considered intellectually and ethically, what elements of prudence or providence justify their survival as areas of study within the university? The humanities are difficult to discuss, let alone define, in a unified way, but I accept Helen Small’s working definition from The Value of the Humanities:

The humanities study the meaning-making practices of human culture, past and present, focusing on interpretation and critical evaluation, primarily in terms of the individual response and with an ineliminable element of subjectivity.

This definition is perhaps recognizably the work of an English professor, and some historians might not necessarily identify “the meaning-making practices of human culture” as their sole or primary object of study. At the risk of disciplinary chauvinism, though, Small’s placing practices of meaning-making at the center of what we do in the humanities strikes me as capturing something important about them.

When Coluccio Salutati articulated the first modern version of the studia humanitatis in the fourteenth century, his basic criterion for each mode of inquiry’s inclusion was a commitment to studying languages in order to understand the human. The modern humanities, that is, study meaning-making with the aim of self-understanding. That aim might be more or less implicit in practice but it is the humanities’ first and most important justification. While Small’s analysis is too cool to veer into anything with the potential for such embarrassing earnestness, it can’t but hint at one of the tacit promises we make our students: by teaching you how meaning is made, the humanities will also help you to cultivate the interpretive and critical skills that can make your own life more meaningful.

Seen along these lines, the humanities are vulnerable to the degree that their ineliminable element of subjectivity can metastasize, or be perceived to have metastasized. The charge that the humanities have turned away from the world and become a mere solipsistic exercise in personal enrichment is a recurring feature of sharp-elbowed public debates about their relationship with the sciences in particular. Some version of that claim is central to the Matthew Arnold-T.H. Huxley debates about education in the nineteenth century, the C. P. Snow-F. R. Leavis debate about the so-called two cultures of the sciences and the humanities in the 1950s and 1960s, and the Sokal Hoax and the debates it occasioned about post-structuralism and social construction in the 1990s. What tends to be obscured by these debates is that self-understanding, pursued doggedly enough, leads inevitably back to the world of which that self is a part. If the humanities without the sciences (or the social sciences) risk solipsism, their study of meaning-making practices can teach the sciences about the terms they use to study the world. An explicit critical engagement with those terms is in fact the very reason-for-being for multiple humanities sub-disciplines: intellectual history, literature and medicine, the history and philosophy of science, and so on.

The spectacular failures of the U.S. health system in the face of Covid-19 underscore the need for attention to that humanities scholarship in understanding what has gone so very wrong here and how we might fix it. The pandemic has given us a searchingly thorough diagnosis of our society: we live in a country riven by stark economic and racial inequality, its state capacity for anything but administering violence hollowed out by decades of disinvestment. In this respect, our world is not so very different from the London of 1665 presented in Defoe’s narrative. Published in 1722 amid the real possibility that another plague epidemic might spread from Marseilles to London, Journal of the Plague Year is an attempt by Defoe to persuade the public to support some of the controversial quarantine measures adopted by the Walpole government (which seems to have been paying him at the time). To accomplish this aim, Defoe put the extensive research into the 1665 plague that he, the real author, had made into the mouth of a fictional author imagined as living through the plague. The result is a strange hybrid of public health reporting, history, and fiction, one that, as Margaret Healy puts it, yields “a complex texture of interwoven relationships between the subjective and the social, the private and the collective—the kernel of any novel.” We might also identify this particular kind of interweaving as something like the kernel of the humanities. Let’s hope they survive.

Remembering Chris Dickey

(Director’s note: In the summer of 2008, I had just arrived at my destination in Istanbul, connected to wifi, checked my email, and saw that I had received a message from Will Cathcart, Clemson English, Class of 2005. In it, Will rightly predicted that Russia was about to invade the country of Georgia, and announced that he was boarding a plane to go interview the then President of Georgia, Mikheil Saakashvili. The invasion announced the arrival of a multipolar world, and established a pattern Russia would repeat in Crimea, Ukraine, and Syria. Will, who has lived in and reported from Georgia ever since, was inducted into the Clemson Alumni Council’s Roaring 10 in 2014. His reporting has appeared on CNN, in Foreign Policy, and, most frequently it seems to me, in The Daily Beast. There, his editor was “renowned foreign correspondent” Christopher Dickey, whose father, James Dickey, author of Deliverance, was a student at Clemson for his first year of college, before serving in World War II (and later graduating from USC). Christopher Dickey, a regular visitor to and speaker at Clemson over the years, passed away on July 16th. In a new post on Lit Hub, Will reflects on the loss of a friend, an advocate, and an editor. This is Clemson Humanities Now.)

The Earthquake of 1755 — An “extraordinary world event” (Goethe 1810)

(Director’s note: Johannes Schmidt teaches German in the Clemson Languages Department, where he will offer a course on “Tolerance in the Eighteenth Century” (in German) in the fall of 2020 and one on Hannah Arendt and Totalitarianism (in English) in the spring of 2021.  This is Clemson Humanities Now.)

On November 1, 1755, an earthquake off the coast of Lisbon struck and destroyed a large portion of the city; aided by subsequent fires and a massive tsunami, it claimed between 30,000 and 100,000 lives (of about 275,000 residents).

It disturbed all aspects of life. Economically, the destruction of one of the most important European port cities was felt in all places of trade and commerce. The news of the earthquake reached London within a day; newspapers in Hamburg, my own hometown, reported it only a few days later. Colonial politics became magnified; the outbreak of the Seven Years War (the French and Indian War) may have been hastened by rising tensions over control of trade with the New World.

Art and literature took note, and so did the Enlightenment in general, reconsidering the Leibnizian philosophy of optimism. In his Candide, ou l’Optimisme, Voltaire famously makes fun of “the best of all worlds.” Johann Gottfried Herder—who together with Goethe coined the term zeitgeist (spirit of the age)—reprimanded the “philosophes” for hubris; they had falsely proclaimed a golden age that advanced beyond every other historical era and was above every other culture: “‘In Europe now, there is supposed to be more virtue than ever, [than] everywhere in the world has been?’ And why? because there is more enlightenment in the world – I believe that just for that reason it has to be less” (1774).

Faith in Enlightenment made philosophy more proud of the knowledge it had gained than aware of that which we cannot control in nature. Unsurprisingly, neither knowledge nor reason predicted this catastrophe. To this day, earthquakes defy prediction—much to the vexation of the Italian government over the botched warning before the devastating earthquake in the Abruzzo region in 2009. The unpreparedness and our unenlightened–irrational, at times even heartless–response to the COVID-19 pandemic seem to be grounded in the same misgivings about nature and knowledge that many European thinkers had before 1755.

Goethe’s observation about the earthquake stands out to me today: “Perhaps, the daemon of terror had never before spread as speedily and powerfully over the world.” As a six-year-old, he was terrified, despite being more than 1500 miles away. In Georg Philipp Telemann’s “Ode of Thunder” (TWV 6:3a-b, 1756 [link to the entire piece: https://www.youtube.com/watch?v=CDW19YqDq64]), we can hear the horrific feeling the existence of an “angry God” must have evoked; we can perceive young Goethe’s fear of the Hebrew Bible’s “wrathful deity” that could not be conciliated. Listen today to the frantic fifth and sixth movements from the Donnerode, entitled “The voice of God shatters the cedars!” [https://youtu.be/7ZUUBmol_HE] and “[The voice of God] makes the proud mountains collapse!” [https://youtu.be/4D-u6VslLlg]. Our anxieties connect us to the desolation people felt in the aftermath of 1755; one might hear the aftershocks in the fifth movement or people fleeing from proud palaces collapsing in the sixth.

It was Kant who—having already studied the physics of the earth’s crust—reminded us of the need for rational-scientific investigation into disastrous occurrences:

“Great events that affect the fate of all mankind rightly arouse that commendable curiosity, which is stimulated by all that is extraordinary and typically looks into the causes of such events. In such cases, the natural philosopher’s obligation to the public is to give an account of the insights yielded by observation and investigation”, he writes in one of his three studies on earthquakes (1756).

Much-needed insight came from his proclamation that the event dictated a clear distinction between moral evil and natural disaster. Since all natural occurrences are subject to physical laws, a supernatural cause (divine punishment) for natural disasters must be rejected.

Epidemiologically, the coronavirus does not discriminate. Neither did the earthquake spare the righteous more than the corrupt. Yet both events show the drastic inequalities brought forth when calamity strikes. While the earthquake affected the rich and the poor alike, the pious and the heretic equally, the nobility and the outlawed in similar ways, the fates of those who survived were very different. The king of Portugal—who spent the holiday away from Lisbon—never set foot in the city again; the entire royal family relocated to palaces elsewhere. The king’s luxury is mirrored today: the affluent fled the cities hit hard by the pandemic and retreated to their summer homes. In the end, viruses and earthquakes do discriminate.

One of the core ideas of Kant’s philosophy of ethics is the importance of human dignity: the incalculable worth of a human being’s life, which is beyond any form of measurement. Free human existence is fundamental. Kant makes perfectly clear that the categorical autonomy of all human beings is absolute and independent of any phenomenological or physiological distinction. This imperative is as valid today as tomorrow. (Kant’s Eurocentric thinking is certainly disturbing and his belief in Western civilization’s supremacy contradicts the inclusiveness that may be suggested in this kind of universal call for autonomy of all peoples and races.) Still, I read this today as an affirmation that there cannot be any material or any other justification for the loss of even a single life.

Protest against racism and inequality, as well as solidarity with all whose human dignity has been and continues to be violated, is not only legitimate but also necessary at all times, especially during natural and manmade disasters. Even more so, for us in privileged positions, it is our responsibility.

Italy and COVID-19

(Director’s Note: Seeing Covid-19 hit Italy early, and hard, I wrote to a colleague in the Department of Languages, Roberto Risso, for his sense of the situation, below.  Dr. Risso was born and raised in Turin, Italy, where he lived until eleven years ago.  After relocating to the US, he has lived in Wisconsin, Maine, and South Carolina.  He now lives in the Clemson area, where he loves gardening and reading.  He is currently an Assistant Professor of Italian Studies at Clemson University.  This is Clemson Humanities Now.)

The planetary dimension of the catastrophe became clear to the Belpaese quite early, when a European citizen was said to have brought COVID-19 to Italy, a virus that originated in China and was spreading fast. Italy has a special bond with China, a mutual fascination that dates back to Marco Polo’s travels and to his famous book. The so-called new Silk Road has a relevance that should never be underestimated in today’s world. Billions of euros’ worth of commerce and almost half a million Chinese people living in Italy reinforce this bond. And yet anti-Chinese xenophobia hit Italy at the beginning of the pandemic in February 2020: hard-to-watch videos surfaced of Chinese people being insulted and beaten on the streets. Italians were scared, angry, unhappy, very unprepared, but mostly had the decency to admit it from day one.

The virus hit Italy hard from the very beginning: in a densely populated territory with areas that are among the most polluted in Europe, by March and April the North of Italy was a hotspot and hundreds of victims died daily, especially in the Lombardy region, the economic capital of the country. The harrowing images of military trucks transporting coffins out of the Lombard city of Bergamo became iconic of the disaster taking place under the Tuscan (and Lombard) sun. As a result, new emergency laws were introduced and reinforced: total lockdown for more than a month and, after that, compulsory masks everywhere. Italians obeyed, and thank God they did; it worked. By the end of April the situation was better: from hundreds of deaths daily to a few dozen, then fewer still. But as attention relaxed, a new wave came in mid-June, and the threat of a new nationwide lockdown convinced people to be more careful.

Is it possible to conceive of Italy with no tourists? No, of course not. But without planes, and with borders sealed, no tourists came, and the only visitors were the boatloads of refugees who came by sea. Singing from balconies, memes, online feuds, political chaos were nothing but distractions; from the beginning of the Second Republic on, politicians have been clowns all along. Since Italy has always been a divided and diverse country, the pandemic did not unite it; Italians did not become any better. What was said during the lockdown–“Ne usciremo migliori e più forti,” or ‘we will come out of this better and stronger’–did not really happen, unfortunately. What really happened is that, due to the internal migration of the past, at the beginning of the pandemic many southerners living in the North fled the closing cities and brought the virus with them. Italy also has the oldest population in Europe, the second oldest in the world after Japan: the “nonni,” grandpas and grandmas, died like flies, especially in the nursing homes, where, the suspicion is, nothing was done to avoid contagion. On the contrary, people say (and please remember, vox populi, vox Dei) that the virus was used to get rid of many of them. It is so atrocious that it can be true in a country no stranger to atrocities.

As of July 4th, 2020, the contagion index is anything but uniform: it has increased since June, but many central and southern regions are free from Covid-19, and not for lack of testing. Lombardy, where the virus claimed so many lives in the past, has an index of 0.89, which is good, although the nation’s case count as of the beginning of July 2020 is 241,419, which is scary. In four months, twenty thousand jobs have been lost for good.

As of July 1st, citizens from the USA, Brazil, South Africa, Russia, Turkey, India, Saudi Arabia, and Mexico have been blocked from entering the country. Chinese nationals have not been blocked yet. Under these conditions it is very hard to foresee the damage to the tourist industry. We will see what the European Union will do about it, since its priorities usually seem to be the well-being of German banks and not the hardships of citizens. What I know for sure, despite all the terrible news and the scary data, is that Italians will move forward despite the ten thousand babies that will not be born in 2021, in a country whose birthrate is already at zero.

Moving forward and rebuilding after catastrophes is what Italians do best.

The second plague outbreak in my life

(Director’s Note: Lee Morrissey, Founding Director of the Humanities Hub, joined the Clemson English faculty in 1995, moving to Clemson from Manhattan, where, while a graduate student, he had volunteered to create the Archives at an arts center, The Kitchen.  From 1995 to 2000, he brought artists–including Ben Neill (who performed with David Wojnarowicz in the 1989 event described below), Carl Stone, Pamela Z, Margaret Leng Tan, and Trimpin–to Clemson in a program he called “Technology and the Contemporary Arts.” This is Clemson Humanities Now.)

This summer, I will turn 56, which means, this summer in particular, I am well into what has been considered an elevated risk group for Covid-19.  But it also means, in my case, I am now living through the second plague outbreak of my lifetime.  There has been so much focus nationally on the Spanish flu of 1918, and other, earlier plagues, including the Black Death in Florence (see Boccaccio’s Decameron) and the bubonic plague in London (see Defoe’s Journal of the Plague Year), but much less attention to the more recent experience of HIV/AIDS.  A recent article in the Washington Post reviewed how knowledge gained from HIV/AIDS research is helping with Covid-19 research, but considering that Anthony Fauci first came to national prominence in the 1980s’ AIDS crisis (making a frenemy of Larry Kramer in the process), you would think that there would be more attention paid to the last time around.

I’ve been wondering recently why AIDS isn’t more in the news, and I am starting to think it’s because it affected minority communities, and, unless you were traveling in those minoritized circles, well, you just did not really know.  I grew up in Boston, shared a radio show on WOMR-FM in Provincetown, MA in the summer of 1984, started graduate school in NYC in 1987, and volunteered, from 1987 to 1995, at The Kitchen Center for Video, Music, Dance, Performance, and Literature, creating and organizing their archives.  I traveled in those circles.  AIDS affected people I knew, early in the disease.  By the time I reached Manhattan, in a New York City that does not exist anymore (or, if enough people now move out, yet), AIDS was a defining feature of one of the things that used to attract people to New York: the arts.  The AIDS Coalition to Unleash Power (ACT UP) was already active, with intense meetings at the Lesbian and Gay Men’s Community Services Center. Gran Fury, an artists’ collective formed the year after I moved to the city, created one of The Kitchen’s monthly posters, with the words “art is not enough” taking up the center—most—of the page.  I remember David Wojnarowicz at The Kitchen, all wiry energy backstage (well, upstairs) before his 1989 performance there, In the Shadow of Forward Motion, and, then, in the performance, realizing that it had the same wiry energy—David offstage had been, that night anyway, David onstage, which was part of his point: silence equaled death, and he was going to speak volumes, producing all the time, constantly.  His self-portrait with flames coming out of his left side, the viewer’s right (pointedly highlighting the political mirror), turned out to be realism.
I was back home in Boston when news of Ethyl Eichelberger’s death reached me via the New York Times, and I won’t forget the frustration of trying to explain to my parents over breakfast why the loss of someone reimagining the Classics in drag and garish makeup, with self-consciously schlocky accordion songs, might matter to someone else studying literature in the same city.  I remember inviting everyone I knew well enough to see John Kelly’s 1991 event, Maybe it’s cold outside, because it nearly wordlessly told a life and death story of learning that one is HIV positive, with a soundtrack by Arvo Pärt.  Later that year, on World AIDS Day, A Day Without Art, The Kitchen assembled a broadcast, in part from The Kitchen, and in part from other arts organizations, mixing frank safe sex PSAs with artistic reflection on loss.  The standout, for me, was dancer Bill T. Jones, HIV-positive, and having outlived his partner, Arnie Zane, dancing toward his mother, Estelle, while she sang “Walk with me Lord.”

I left New York City, and moved to Clemson, SC, where I have lived practically ever since.  The metaphorical city I had left had, in many ways, died.  Actually died.  C. Carr wrote its eulogy in her book, On Edge: Performance at the End of the Twentieth Century, whose last chapter is titled “The Bohemian Diaspora.”  In Rent, which has a song called “La Vie Boheme,” Jonathan Larson memorialized it, and the gentrification pressures that ended it (on the morning of its first performance, Larson himself passed away, from an undiagnosed aortic tear not noticed in the several trips he had made to his local hospitals, one of which has since been shut down).

There is another, much less well-known, and more abstract, long-form music and theatre piece from the era, Meredith Monk’s Book of Days (1989).  It opens in then-contemporary NYC, in color, as a construction crew blasts a hole in a wall, through which the camera travels, into a largely medieval village, complete with separate quarters for Christians and Jews, filmed largely in black and white, as a plague breaks out.  There’s a documentary dimension, and contrasting musical and performance styles–martial and spectacular in the white-wearing Christian section, mystical and participatory in the Jewish section of town.   When, in this section, the plague arrives, in the figure of a lone dancer gesturing maniacally, the Christians amass and blame the Jews, despite the intervention of one man who tells them that “everyone is dying.”  A map of the town turns red as the deaths accumulate across its segregated quarters, and the sounds of the singers get more and more menacing.  As can be heard in this rendition of the song, we are listening to the plague itself: about two minutes in, out of those vocalizations, the chorus huffs “we know who you are; we know who you are; we know who you are; ha ha ha; ha ha ha.”  It is a chilling effect to have the plague emerge from the cacophony and claim us.  But that’s what happens.

In 1990, The Kitchen got caught up in the NEA defundings.  A decade earlier, as Bill T. Jones recounts in an essay he contributed to The Kitchen Turns 20, a book I edited on the occasion of the first two decades of The Kitchen, the US Information Agency had asked The Kitchen to select US artists to send to a cultural exchange in Bucharest, Romania.  But now, after the 1989 fall of the Berlin Wall, the Cold War had ended and the so-called Culture War had broken out, at home.  The US is still living with the profound consequences of this redirection.  At the time of the NEA defundings, my father called, and said “I thought you were volunteering at a soup kitchen!”  Or, in other words, I guess, he was asking “what are you doing there?”  That’s a fine question, and I think the answer has to do with being interested in making meaning; the people, the artists, who performed at The Kitchen and other venues in New York wanted to do something meaningful.  Like me, they came to New York from somewhere else with that hope.  They came from and spoke to overlapping communities, and they wanted to create something, usually ephemeral, filled with allusive, polyvalent, and, I think it’s important to note, embodied meaning.  Some of it, a really small number within it, left the performers vulnerable to charges of obscenity, but, ironically, usually in works which meant to raise questions about what else we live with, without considering it obscene.  All of this took place before the internet, before Ken Starr’s detailed report on President Clinton, and so might strike people today as quite tame.  They might also be struck by the convivial friendliness of the experiences in the audience at the time.  In the process, or processes, of creating their works these performers created new communities, sometimes as small as the seating at the original P.S. 122 (a former public school), or at the original Dixon Place (an apartment living room), but sometimes, as with Bill T. Jones, as big as everyone who’s seen any of his several Broadway choreographies.

Comparing then and now, a few things stand out.  First, I have not seen much emphasis recently on how fatal HIV was for that first couple of decades of the disease.  I think of it all the time now, in part because I saw so much loss at the time, but also because Covid-19 is so much more transmissible than HIV.  Imagine if the novel coronavirus were as fatal as AIDS.  Actually, don’t.  It’s too much to contemplate.  Second, in the early days with Covid-19, as doctors started to see patterns in who was most vulnerable to the virus, age, occupation, and then race were the initial factors.  Eventually, it became clear that other factors involved the status of one’s immune system, or “underlying health conditions,” and then it was reported that only 12% of the US population was metabolically healthy, meaning not obese, not diabetic, not with high blood pressure, etc.  It is as if 88% of the population has a different kind of acquired immune deficiency, related to diet (and access to healthy foods), environmental and living conditions (and access to places in which to exercise), and other cultural, meaning economic, meaning political, factors.  Third, HIV/AIDS is still with us.  Last year, 690,000 people died from AIDS, and 1.7 million people were infected, worldwide.  And, while there is PrEP if you think you are at high risk, there is, still, no vaccine, and no cure, yet.  In the US alone, according to the CDC, there were about 38,000 new HIV diagnoses in 2018, most of them in the South.

Fourth, thirty-five years ago, people who were affected by HIV/AIDS were frustrated by a President who would not mention the disease.  At the time, his political party was focused on family values and personal responsibility, which meant a disease cast as gay and preventable with celibacy (and safe sex) needn’t be a national issue, they thought.  Today, the president, of the same party, certainly talks about the current pandemic, but he has spent a lot of time saying that it will go away (it hasn’t yet, almost six months later), and that it is getting better (when it is getting worse).  And personal responsibility has devolved to NOT wearing protection, because, it is argued, nobody can tell another U.S. American how to do anything.  Four decades ago, conservatives blamed the sexual activities of gay people for spreading AIDS, mistakenly thinking it was a gay disease.  Today, as soon as the lockdowns are lifted, apparently heterosexual young people mingle in close proximity, close enough for this pandemic’s transmission–without wearing any protection.  In terms of disease transmission and prevention, they are acting as irresponsibly as Republicans used to allege gay people were, except this time, all they need to do for transmission is stand near each other, unmasked, in, say, a pool, while a beach ball bounces off their heads.

Finally, I think about how much changed in the US because of AIDS activism, a range of activisms.  By becoming experts in the disease—no one else would initially, so they did—activists changed healthcare.  By highlighting their exclusion from their partners’ hospital beds, activists pressed for recognizing lifelong commitments among lesbian and gay people, known today as gay marriage, and legal nationwide.  Most of all, though, because so many affected were artists, they could create verbal, visual, musical, and, in the end, emotional responses to living with or being stalked by the disease.  There was some hope, then, that there might be angels in America.  I can imagine the current Covid-19 pandemic–which is also creating the second great recession, or worse, in just 12 years–is going to have even bigger social consequences, and, I say hopefully, just as positive.  If so, though, we are going to have to do the imaginative work of the artists lost to AIDS to create affecting representations of living with and being stalked by both a disease and an economic collapse.  Gran Fury said “Art is Not Enough,” but, actually, it is a start.

Déjà Vu All Over Again

(Director’s note: Dr. Abel A. Bartley, a native of Jacksonville, Florida, and a graduate of Florida State University, where he received his BA in History and Political Science, his MA in History, and his Ph.D. in History with a focus on African American and urban history, is Professor of African American and Urban History at Clemson.  He is the author of Keeping the Faith: Race, Politics, and Social Development in Jacksonville, Florida, 1940-1970 (2000), Akron’s Black Heritage (2004), and In No Ways Tired: The NAACP’s Struggle to Integrate the Duval County Public School System (2014), which received the Stetson Kennedy Prize for the best book on civil rights history.  In 2004 he accepted the challenge of running the African American studies program, then a minor, at Clemson University.  In 2007 he created the new Pan African Studies Program, a major, at Clemson, and served as its first Director.  This is Clemson Humanities Now.)

I cannot believe that we are here again. It seems as if I am stuck in a bad dream that keeps recurring. I remember my older brother marching in 1980 after a 33-year-old African American insurance salesman named Arthur McDuffie was murdered by police officers in Miami, Florida. McDuffie was accused of committing the grievous crime of running a traffic light. The officers originally reported that McDuffie died when he crashed his motorcycle, but the coroner found that his injuries were not consistent with a crash. Eventually, the officers were accused of using their flashlights to beat McDuffie. We all believed that with the autopsy, the officers would be convicted of murder. However, amazingly, the officers were acquitted. African Americans and their white allies protested and marched, but black deaths like McDuffie’s continued.

In 1991 a video surfaced of Black motorist Rodney King being brutally beaten by four Los Angeles policemen. After a change of venue and lengthy trial, the officers were all acquitted. Like my bad dream, this sort of scenario does not end. African Americans get killed by the police or their acolytes, we march, we pray, we cry, we make high sounding declarations and yet the bad dream endures.

As Einstein once stated, the definition of insanity is doing the same thing and expecting a different result.  Why do the juries keep coming back with the familiar refrain, “Not guilty”?

After years of studying this phenomenon, I am convinced that we are aiming at the wrong target. Our anger, though understandable, is misplaced. We are blaming the police for our problem when we should be blaming the courts. Blaming the police for all of the wrongs misses the point. The police are just a part of a systemic racism that has historically oppressed people of color. African Americans were brought to America to serve as slaves, and the American courts have never come to terms with seeing us in any condition other than servitude.  Consequently, Black life, progress, and, most importantly, deaths have never been celebrated or properly mourned by most Americans. It is obvious that the courts have never viewed or treated African Americans as equals under the law.

During Reconstruction many progressive-minded Americans attempted to write equality before the law into American jurisprudence. The resulting 14th and 15th Amendments became precedent-setting measures that ensured that all Americans were to be treated fairly and equally under the law and in the courts, and provided them a powerful weapon to protect their rights. Unfortunately, the courts quickly undermined the Amendments. Despite the legally sound arguments of lawyer Albion Tourgée in defending Homer Plessy’s right to ride anywhere he could afford on a train in the South, the courts said that African Americans could be separated and subjected to disparate treatment.  Therefore, White Americans were given carte blanche to establish an elaborate system of legalized segregation, now known in the South as Jim Crow, which separated and subjugated African Americans in all social, political, and economic settings. The courts allowed the South to establish capricious, unfair restrictions on the franchise, unilaterally disarming African Americans of their most powerful weapon.

The nation’s worst period of racial violence ensued. Of the violence Frederick Douglass wrote, “If American conscience were only half alive, if the American church and clergy were only half christianized, if American moral sensibility were not hardened by persistent infliction of outrage and crime against colored people, a scream of horror, shame and indignation would rise to Heaven … Alas! even crime has power to reproduce itself and create conditions favorable to its own existence. It sometimes seems we are deserted by earth and Heaven–yet we must still think, speak, and work, and trust in the power of a merciful God for final deliverance.”  Since that period, African Americans have been the constant victims of extralegal killings, whether at the hands of the police or their devotees. From Maceo Snipes to Emmett Till, the one constant that keeps these killings going is the assurance that the courts will acquit the accused. Therefore, we continue to get angry at the police, the lynchers, or the ugly racist when what we really need to do is reform a racist judicial system that consistently vindicates the accused if the victim is black and the defendant is white. I suspect that if police, racists, or others really feared that they would be held responsible for their actions, they would think twice about whether it was worth choking a man over a counterfeit $20 bill, shooting a 12-year-old boy over a fake gun, shooting Trayvon because he walked through a white neighborhood, murdering a 26-year-old because he jogged through your community, murdering a woman in her house playing video games with her nephew, or murdering a sleeping EMT.

So how do we reform the system?  Let me suggest a few modifications.  First, we must reform the laws to ensure that police are subject to the same use-of-force standards as the public. Secondly, the laws have to be changed so that police are no longer allowed to stop people without a legally defensible reason. Next, African Americans must become much more active on juries. Then, the Supreme Court must remember its role: the courts speak for those who are voiceless, and they must maintain their objectivity and protect the unprotected. Lastly, juries have to see the victims as themselves; they have to be able to put themselves in the victim’s shoes. Whites see police officers as protectors, while African Americans see policemen as enforcers of an unjust society. Therefore, when police face juries, the juries have to see both sides.

I suspect that we would see dramatic reductions in these crimes if those who were horrified by them would mobilize.  If the politicians who are so concerned about law would spend half of that energy on justice, we could solve this problem.  But, as Frederick Douglass wrote, “But alas! even crime has power to reproduce itself and create conditions favorable to its own existence.” So, I say protest against police violence, make your speeches, say your prayers, remove your statues, change your building names, and make your resolutions. I applaud all of those who have demonstrated their outrage and participated in protest activities demanding change. Nevertheless, until and unless our courts are ready to hold responsible those who wantonly kill our brothers and sisters, we are just experiencing a recurring bad dream. Those in power know that if the law applied equally to all and if everyone had unfettered access to the franchise, it would mean a real realignment of power in this nation and a challenge to their power and privilege. So, I suspect we will rearrange the furniture, change the drapes, fulminate, and yet nothing meaningful will change. As a historian, I fear that after these rallies and protests end, we will go back to “American normal”—that is, back to this same bad dream.

Please America, reach into your soul and prove me wrong this time!