Humanities Hub

The Humanities as meaning-making

(Director’s note: David Sweeney Coombs is associate professor and Director of Undergraduate Studies in the Clemson English Department. He is the author of Reading with the Senses in Victorian Literature and Science, published in 2019 by University of Virginia Press.  This is Clemson Humanities Now.)

“A dreadful Plague in London was,

In the Year Sixty Five

Which swept an Hundred Thousand Souls

Away; yet I alive!”

These lines appear as the conclusion of Daniel Defoe’s Journal of the Plague Year, appended as a kind of epigram on the record of the fictitious author H. F.’s impressions of the bubonic plague epidemic in London in 1665. “Yet I alive!”: at once minimized in comparison to the preceding lines and the scale of the disaster they invoke and yet granted the importance of serving as the quatrain’s parting words, this concluding expression of baffled triumph and gratitude distills the confused sense of H. F. throughout the rest of the narrative that his survival is thanks to an inextricable combination of prudence, providence, and sheer dumb luck. Even though this is the end of the line for the novel, and this epigram might also be justly described as, in a sense, an epitaph, a question hangs over H. F.’s affirmation of having outlived the plague: I survived. Why, and what do I do now?

It is, of course, too early in the Covid-19 pandemic for us to ask ourselves this question in all earnestness. As I write, case-counts in South Carolina have been rising rapidly for several weeks and Clemson has just announced that it will delay bringing students back to campus until September. But if we can’t yet congratulate ourselves on being survivors, reading Defoe’s novel with the pandemic ongoing is nonetheless an occasion to ask what it means to survive and what survival might look like. Covid-19 has given that question a new urgency for us all as individuals, but it has also given a sharper ring to a version of that question that had already begun being sounded in American universities in the last few years even before the upheavals created by the novel coronavirus: can the academic humanities survive?

Enrollments in the academic humanities have fallen dramatically since the 2008 Financial Crisis. Nationally, the declines have been particularly sharp in the core disciplines that together represent the Humanities at Clemson: English, Languages, Philosophy and Religion, and History. The number of Bachelor’s Degrees awarded in those fields at U.S. universities declined by 17 percent from 2012 to 2015 (according to the American Academy of Arts and Sciences), with the downward trend continuing afterwards. I am now the Director of Undergraduate Studies for the English Department at Clemson, where the number of English majors declined by approximately 25 percent between 2011 and 2018.

The proximate cause of these drops is what Christopher Newfield in The Great Mistake has analyzed as the devolutionary cycle of privatization in American public universities, intensified by the long recession and fiscal austerity after 2008. As Clemson, like other public universities, has become locked into a cycle of slashed state funding, tuition hikes, and attempts to raise revenues and cut costs through partnering with private-sector sponsors, more and more students have entered the university staggering under the pressure to monetize their studies so as to compete for the well-paid jobs requiring a bachelor’s degree. Under that burden, a major in the humanities looks like a risk, or worse, a luxury. While there were some signs that enrollments in the humanities were ticking back up in the last two years, the economic fallout from the pandemic might wipe out those small gains and then some.

One task for humanities scholars at this moment is to take every opportunity to reiterate that the retreat from the humanities in search of supposedly more marketable majors rests on a popular misconception that humanities degrees don’t translate into good jobs. The economic value of Humanities BAs is in fact broadly similar to that of degrees in Business and STEM fields like Biology. A 2018 study found that an average 25-29 year-old Business major earns about $50,000 annually compared to the $45,000 of an average English or History major of the same age, but that Business major is slightly more likely to be unemployed than the History major. When students in one of Clemson colleague Tharon Howard’s courses conducted a survey last year, they found that 90 percent of graduates from Clemson English had found jobs within a year of graduation (and more than 60 percent of them did so within the first three months of graduating). While the coronavirus pandemic is currently destabilizing labor markets in new ways, a humanities degree is likely to continue serving our students’ career prospects well once the recovery is underway. If they were judged by their actual return on investment, as opposed to the caricature of the unemployed Philosophy major, the academic humanities could survive.

But do the humanities deserve to survive? Do they have value that can’t be reduced to the market? Considered intellectually and ethically, what elements of prudence or providence justify their survival as areas of study within the university? The humanities are difficult to discuss, let alone define, in a unified way, but I accept Helen Small’s working definition from The Value of the Humanities:

The humanities study the meaning-making practices of human culture, past and present, focusing on interpretation and critical evaluation, primarily in terms of the individual response and with an ineliminable element of subjectivity.

This definition is perhaps recognizably the work of an English professor, and some historians might not necessarily identify “the meaning-making practices of human culture” as their sole or primary object of study. At the risk of disciplinary chauvinism, though, Small’s placing practices of meaning-making at the center of what we do in the humanities strikes me as capturing something important about them.

When Coluccio Salutati articulated the first modern version of the studia humanitatis in the fourteenth century, his basic criterion for each mode of inquiry’s inclusion was a commitment to studying languages in order to understand the human. The modern humanities, that is, study meaning-making with the aim of self-understanding. That aim might be more or less implicit in practice but it is the humanities’ first and most important justification. While Small’s analysis is too cool to veer into anything with the potential for such embarrassing earnestness, it can’t but hint at one of the tacit promises we make our students: by teaching you how meaning is made, the humanities will also help you to cultivate the interpretive and critical skills that can make your own life more meaningful.

Seen along these lines, the humanities are vulnerable to the degree that their ineliminable element of subjectivity can metastasize, or be perceived to have metastasized. The charge that the humanities have turned away from the world and become a mere solipsistic exercise in personal enrichment is a recurring feature of sharp-elbowed public debates about their relationship with the sciences in particular. Some version of that claim is central to the Matthew Arnold-T.H. Huxley debates about education in the nineteenth century, the C. P. Snow-F. R. Leavis debate about the so-called two cultures of the sciences and the humanities in the 1950s and 1960s, and the Sokal Hoax and the debates it occasioned about post-structuralism and social construction in the 1990s. What tends to be obscured by these debates is that self-understanding, pursued doggedly enough, leads inevitably back to the world of which that self is a part. If the humanities without the sciences (or the social sciences) risk solipsism, their study of meaning-making practices can teach the sciences about the terms they use to study the world. An explicit critical engagement with those terms is in fact the very reason-for-being for multiple humanities sub-disciplines: intellectual history, literature and medicine, the history and philosophy of science, and so on.

The spectacular failures of the U.S. health system in the face of Covid-19 underscore the necessity of such humanities scholarship for understanding what has gone so very wrong here and how we might fix it. The pandemic has given us a searchingly thorough diagnosis of our society: we live in a country riven by stark economic and racial inequality, its state capacity for anything but administering violence hollowed out by decades of disinvestment. In this respect, our world is not so very different from the London of 1665 presented in Defoe’s narrative. Published in 1722 amid the real possibility that another plague epidemic might spread from Marseilles to London, Journal of the Plague Year is an attempt by Defoe to persuade the public to support some of the controversial quarantine measures adopted by the Walpole government (which seems to have been paying him at the time). To accomplish this aim, Defoe put the extensive research into the 1665 plague that he, the real author, had conducted into the mouth of a fictional author imagined as living through the plague. The result is a strange hybrid of public health reporting, history, and fiction, one that, as Margaret Healy puts it, yields “a complex texture of interwoven relationships between the subjective and the social, the private and the collective—the kernel of any novel.” We might also identify this particular kind of interweaving as something like the kernel of the humanities. Let’s hope they survive.

Remembering Chris Dickey

(Director’s note: In the summer of 2008, I had just arrived at my destination in Istanbul, connected to wifi, checked my email, and saw that I had received a message from Will Cathcart, Clemson English, Class of 2005.  In it, Will rightly predicted that Russia was about to invade the country of Georgia, and announced that he was boarding a plane to go interview the then President of Georgia, Mikheil Saakashvili.  The invasion announced the arrival of a multipolar world, and established a pattern Russia would repeat in Crimea, Ukraine, and Syria.  Will, who has lived in and reported from Georgia ever since, was inducted into the Clemson Alumni Council’s Roaring 10 in 2014.  His reporting has appeared on CNN, in Foreign Policy, and, most frequently, it seems to me, in The Daily Beast.  There, his editor was “renowned foreign correspondent” Christopher Dickey, whose father, James Dickey, author of Deliverance, was a student at Clemson for his first year of college, before serving in World War II (and later graduating from USC).  Christopher Dickey, a regular visitor to and speaker at Clemson over the years, passed away on July 16th.  In a new post on Lit Hub, Will reflects on the loss of a friend, an advocate, and an editor.  This is Clemson Humanities Now.)

The Earthquake of 1755 — An “extraordinary world event” (Goethe 1810)

(Director’s note: Johannes Schmidt teaches German in the Clemson Languages Department, where he will offer a course on “Tolerance in the Eighteenth Century” (in German) in the fall of 2020 and on “Hannah Arendt and Totalitarianism” (in English) in the spring of 2021.  This is Clemson Humanities Now.)

On November 1, 1755, an earthquake off the coast of Lisbon struck and destroyed a large portion of the city; aided by subsequent fires and a massive tsunami, it claimed between 30,000 and 100,000 lives (of about 275,000 residents).

It disturbed all aspects of life. Economically, the destruction of one of the most important European port cities was felt in all places of trade and commerce. The news of the earthquake reached London within a day; newspapers in Hamburg, my own hometown, reported only a few days later. Colonial politics were affected as well; the outbreak of the Seven Years War (French and Indian War) may have been hastened by rising tensions over control of trade with the New World.

Art and literature took note, and so did the Enlightenment in general, reconsidering the Leibnizian philosophy of optimism. In his Candide, ou l’Optimisme, Voltaire famously makes fun of “the best of all worlds.” Johann Gottfried Herder—who together with Goethe coined the term zeitgeist (spirit of the age)—reprimanded the “philosophes” for hubris; they had falsely proclaimed a golden age that advanced beyond every other historical era and was above every other culture: “‘In Europe now, there is supposed to be more virtue than ever, [than] everywhere in the world has been?’ And why? because there is more enlightenment in the world – I believe that just for that reason it has to be less” (1774).

The faith in Enlightenment made philosophy more proud of the knowledge it had gained than aware of that which we cannot control in nature. Unsurprisingly, neither knowledge nor reason predicted this catastrophe. To this day, earthquakes remain essentially unpredictable—much to the vexation of the Italian government after the botched warning before the devastating earthquake in the Abruzzo region in 2009. The unpreparedness and our unenlightened–irrational, at times even heartless–response to the COVID-19 pandemic seem to be grounded in the same misconceptions about nature and knowledge that many European thinkers held before 1755.

Goethe’s observation about the earthquake stands out to me today: “Perhaps, the daemon of terror had never before spread as speedily and powerfully over the world.” As a six-year-old, he was terrified, despite being more than 1500 miles away. In Georg Philipp Telemann’s “Ode of Thunder” (TWV 6:3a-b, 1756 [link to the entire piece: https://www.youtube.com/watch?v=CDW19YqDq64]), we can hear the horrific feeling the existence of an “angry God” must have evoked; we can perceive young Goethe’s fear of the Hebrew Bible’s “wrathful deity” that could not be conciliated. Listen today to the frantic fifth and sixth movements of the Donnerode, entitled “The voice of God shatters the cedars!” [https://youtu.be/7ZUUBmol_HE] and “[The voice of God] makes the proud mountains collapse!” [https://youtu.be/4D-u6VslLlg]. Our anxieties connect us to the desolation people felt in the aftermath of 1755; one might hear the aftershocks in the fifth movement or people fleeing from proud palaces collapsing in the sixth.

It was Kant who—having already studied the physics of the earth’s crust—reminded us of the need for rational-scientific investigation into disastrous occurrences:

“Great events that affect the fate of all mankind rightly arouse that commendable curiosity, which is stimulated by all that is extraordinary and typically looks into the causes of such events. In such cases, the natural philosopher’s obligation to the public is to give an account of the insights yielded by observation and investigation”, he writes in one of his three studies on earthquakes (1756).

Much needed was his insight that the event dictated a clear distinction between moral evil and natural disaster. Since all natural occurrences are subject to physical laws, a supernatural cause (divine punishment) for natural disasters must be rejected.

Epidemiologically, the coronavirus does not discriminate. Neither did the earthquake spare the righteous more than the corrupt. Yet both events show the drastic inequalities brought forth when calamity strikes. While the earthquake affected the rich and the poor alike, the pious and the heretic equally, the nobility and the outlawed in similar ways, the fates of those who survived were very different. The king of Portugal—who spent the holiday away from Lisbon—never set foot in the city again; the entire royal family relocated to palaces elsewhere. A king’s luxury is mirrored today: the affluent fled the cities hit hard by the pandemic and retreated to their summer homes. In the end, viruses and earthquakes do discriminate.

One of the core ideas of Kant’s philosophy of ethics is the importance of human dignity: the incalculable worth of a human being’s life, which is beyond any form of measurement. Free human existence is fundamental. Kant makes perfectly clear that the categorical autonomy of all human beings is absolute and independent of any phenomenological or physiological distinction. This imperative is as valid today as tomorrow. (Kant’s Eurocentric thinking is certainly disturbing and his belief in Western civilization’s supremacy contradicts the inclusiveness that may be suggested in this kind of universal call for autonomy of all peoples and races.) Still, I read this today as an affirmation that there cannot be any material or any other justification for the loss of even a single life.

Protest against racism and inequality, as well as solidarity with all whose human dignity has been and continues to be violated, is not only legitimate but also necessary at all times, especially during natural and manmade disasters. Even more so, for us in privileged positions, it is our responsibility.

Italy and COVID-19

(Director’s Note: Seeing Covid-19 hit Italy early, and hard, I wrote to a colleague in the Department of Languages, Roberto Risso, for his sense of the situation, below.  Dr. Risso was born and raised in Turin, Italy, where he lived until eleven years ago.  After relocating to the US, he has lived in Wisconsin, Maine, and South Carolina.  He now lives in the Clemson area, where he loves gardening and reading.  He is currently an Assistant Professor of Italian Studies at Clemson University.  This is Clemson Humanities Now.)

The planetary dimension of the catastrophe became clear to the Belpaese quite early, when a European citizen was said to have brought COVID-19 to Italy, a virus that had originated in China and was spreading fast. Italy has a special bond with China, a mutual fascination that dates back to Marco Polo’s travels and to his famous book. The so-called new Silk Road has a relevance that should never be underestimated in today’s world. Billions of euros’ worth of commerce and almost half a million Chinese people living in Italy reinforce this bond. And yet anti-Chinese xenophobia hit Italy at the beginning of the pandemic in February 2020: hard-to-watch videos surfaced of Chinese people being insulted and beaten on the streets. Italians were scared, angry, unhappy, very unprepared, but mostly had the decency to admit it from day one.

The virus hit Italy hard from the very beginning: a densely populated territory with areas that are among the most polluted in Europe, the North of Italy was a hotspot by March and April, and many hundreds of victims died daily, especially in Lombardy, the economic capital of the country. The harrowing images of military trucks transporting coffins out of the Lombard city of Bergamo became iconic of the disaster that was taking place under the Tuscan (and Lombard) sun. As a result, new emergency laws were introduced and reinforced: total lockdown for more than a month and, after that, compulsory masks everywhere. Italians obeyed, and thank God they did. And it worked, indeed. By the end of April and into May the situation was better: from hundreds of deaths daily to a few dozen, then fewer still. But as attention relaxed a new wave came in mid-June, and the threat of a new nationwide lockdown convinced people to be more careful.

Is it possible to conceive of Italy with no tourists? No, of course not. But without planes, and with borders sealed, no tourists came, and the only visitors were the boatloads of refugees who came by sea. Singing from balconies, memes, online feuds, political chaos were nothing but distractions; from the beginning of the Second Republic on, politicians have been clowns all along. Since Italy has always been a divided and diverse country, the pandemic did not unite it; Italians did not become any better.  What was said during the lockdown–“Ne usciremo migliori e più forti,” or “we will come out of this better and stronger”–did not really happen, unfortunately. What really happened is that, owing to the internal migration of the past, at the beginning of the pandemic many southerners living in the North fled the closing cities and brought the virus with them. Italy also has the oldest population in Europe, the second oldest in the world after Japan: the “nonni,” grandpas and grandmas, died like flies, especially in the nursing homes, where, the suspicion is, nothing was done to avoid contagion. On the contrary, people say (and please remember, vox populi, vox Dei) that the virus was used to get rid of many of them. It is so atrocious that it could be true, in a country no stranger to atrocities.

As of July 4th, 2020, the contagion index is anything but uniform: it has increased since June, but many central and southern regions are free from Covid-19, and not for lack of testing. Lombardy, where the virus claimed so many lives in the past, now has an index of 0.89, which is good, although the nation’s case count as of the beginning of July 2020 stands at 241,419, which is scary. In four months, twenty thousand jobs have been lost for good.

As of July 1st, citizens from the USA, Brazil, South Africa, Russia, Turkey, India, Saudi Arabia, and Mexico have been blocked from entering the country. Chinese nationals have not been blocked yet. Under these conditions it is very hard to foresee the damage to the tourist industry. We will see what the European Union will do about it, since its priorities usually seem to be the well-being of German banks and not the hardships of citizens. What I know for sure, despite all the terrible news and the scary data, is that Italians will move forward, despite the ten thousand babies that will not be born in 2021, in a country whose birthrate is already at level zero.

Moving forward and rebuilding after catastrophes is what Italians do best.

The second plague outbreak in my life

(Director’s Note: Lee Morrissey, Founding Director of the Humanities Hub, joined the Clemson English faculty in 1995, moving to Clemson from Manhattan, where, while a graduate student, he had volunteered to create the Archives at an arts center, The Kitchen.  From 1995 to 2000, he brought artists–including Ben Neill (who performed with David Wojnarowicz in the 1989 event described below), Carl Stone, Pamela Z, Margaret Leng Tan, and Trimpin–to Clemson in a program he called “Technology and the Contemporary Arts.” This is Clemson Humanities Now.)

This summer, I will turn 56, which means, this summer in particular, I am well into what has been considered an elevated risk group for Covid-19.  But it also means, in my case, I am now living through the second plague outbreak of my lifetime.  There has been so much focus nationally on the Spanish flu of 1918, and other, earlier plagues, including the Black Death in Florence (see Boccaccio’s Decameron) and the bubonic plague in London (see Defoe’s Journal of the Plague Year), but much less attention to the more recent experience of HIV/AIDS.  A recent article in the Washington Post reviewed how knowledge gained from HIV/AIDS research is helping with Covid-19 research, but considering that Anthony Fauci first came to national prominence in the 1980s’ AIDS crisis (making a frenemy of Larry Kramer in the process), you would think that there would be more attention paid to the last time around.

I’ve been wondering recently why AIDS isn’t more in the news, and I am starting to think it’s because it affected minority communities, and, unless you were traveling in those minoritized circles, well, you just did not really know.  I grew up in Boston, shared a radio show on WOMR-FM in Provincetown, MA in the summer of 1984, started graduate school in NYC in 1987, and volunteered, from 1987 to 1995, at The Kitchen Center for Video, Music, Dance, Performance, and Literature, creating and organizing their archives.  I traveled in those circles.  AIDS affected people I knew, early in the epidemic.  By the time I reached Manhattan, in a New York City that does not exist anymore (or, if enough people now move out, yet), AIDS was a defining feature of one of the things that used to attract people to New York: the arts.  The AIDS Coalition to Unleash Power (ACT UP) was already active, with intense meetings at the Lesbian and Gay Men’s Community Services Center.  Gran Fury, an artists’ collective formed the year after I moved to the city, created one of The Kitchen’s monthly posters, with the words “art is not enough” taking up the center, indeed most, of the page.  I remember David Wojnarowicz at The Kitchen, all wiry energy backstage (well, upstairs) before his 1989 performance there, In the Shadow of Forward Motion, and, then, in the performance, realizing that it had the same wiry energy—David offstage had been, that night anyway, David onstage, which was part of his point: silence equaled death, and he was going to speak volumes, producing all the time, constantly.  His self-portrait with flames coming out of his left side, the viewer’s right (pointedly highlighting the political mirror), turned out to be realism.  I was back home in Boston when news of Ethyl Eichelberger’s death reached me via the New York Times, and I won’t forget the frustration of trying to explain to my parents over breakfast why the loss of someone reimagining the Classics in drag and garish makeup, with self-consciously schlocky accordion songs, might matter to someone else studying literature in the same city.  I remember inviting everyone I knew well enough to see John Kelly’s 1991 event, Maybe it’s cold outside, because it nearly wordlessly told a life and death story of learning that one is HIV positive, with a soundtrack by Arvo Pärt.  Later that year, on World AIDS Day, A Day Without Art, The Kitchen assembled a broadcast, in part from The Kitchen, and in part from other arts organizations, mixing frank safe sex PSAs with artistic reflection on loss.  The standout, for me, was dancer Bill T. Jones, HIV-positive, and having outlived his partner, Arnie Zane, dancing toward his mother, Estelle, while she sang “Walk with me Lord.”

I left New York City, and moved to Clemson, SC, where I have lived practically ever since.  The metaphorical city I had left had, in many ways, died.  Actually died.  C. Carr wrote its eulogy in her book, On Edge: Performance at the End of the Twentieth Century, whose last chapter is titled “The Bohemian Diaspora.”  In Rent, which has a song called “La Vie Boheme,” Jonathan Larson memorialized it and the gentrification pressures that ended it (and on the morning of its first performance Larson himself passed away, from an undiagnosed aortic tear that had gone unnoticed in the several trips he made to his local hospitals, one of which has since been shut down).

There is another, much less well-known, and more abstract, long-form music and theatre piece from the era, Meredith Monk’s Book of Days (1989).  It opens in then-contemporary NYC, in color, as a construction crew blasts a hole in a wall, through which the camera travels, into a largely medieval village, complete with separate quarters for Christians and Jews, filmed largely in black and white, as a plague breaks out.  There’s a documentary dimension, and contrasting musical and performance styles–martial and spectacular in the Christian white-wearing section, mystical and participatory in the Jewish section of town.  When the plague arrives in this section, in the figure of a lone dancer gesturing maniacally, the Christians amass and blame the Jews, despite the intervention of one man who tells them that “everyone is dying.”  A map of the town turns red as the deaths accumulate across its segregated quarters, and the sounds of the singers get more and more menacing.  As can be heard in this rendition of the song, we are listening to the plague itself: about two minutes in, out of those vocalizations, the chorus huffs “we know who you are; we know who you are; we know who you are; ha ha ha; ha ha ha.”  It is a chilling effect to have the plague emerge from the cacophony and claim us.  But that’s what happens.

In 1990, The Kitchen got caught up in the NEA defundings.  A decade earlier, as Bill T. Jones recounts in an essay he contributed to The Kitchen Turns 20, a book I edited on the occasion of the first two decades of The Kitchen, the US Information Agency had asked The Kitchen to select US artists to send to a cultural exchange in Bucharest, Romania.  But, now, after the 1989 fall of the Berlin Wall, the Cold War had ended and the so-called Culture War had broken out, at home.  The US is still living with the profound consequences of this redirection.  At the time of the NEA defundings, my father called, and said “I thought you were volunteering at a soup kitchen!”  Or, in other words, I guess, he was asking “what are you doing there?”  That’s a fine question, and I think the answer has to do with being interested in making meaning; the people, the artists, who performed at The Kitchen and other venues in New York, wanted to do something meaningful.  Like me, they came to New York from somewhere else with that hope.  They came from and spoke to overlapping communities, and they wanted to create something, usually ephemeral, filled with allusive, polyvalent, and, I think it’s important to note, embodied meaning.  Some of it, a very small portion, left the performers vulnerable to charges of obscenity, but, ironically, usually in works that meant to raise questions about what else we live with without considering it obscene.  All of this took place before the internet, before Ken Starr’s detailed report on President Clinton, and so might strike people today as quite tame.  They might also be struck by the convivial friendliness of the experiences in the audience at the time.  In the process, or processes, of creating their works these performers created new communities, sometimes as small as the seating at the original P.S. 122 (a former public school), or at the original Dixon Place (an apartment living room), but sometimes, as with Bill T. Jones, as big as everyone who’s seen any of his several Broadway choreographies.

Comparing then and now, a few things stand out.  First, I have not seen much emphasis recently on how fatal HIV was for that first couple of decades of the disease.  I think of it all the time now, in part because I saw so much loss at the time, but also because Covid-19 is so much more transmissible than HIV.  Imagine if the novel coronavirus were as fatal as AIDS.  Actually, don’t.  It’s too much to contemplate.  Second, in the early days of Covid-19, as doctors started to see patterns in who was most vulnerable to the virus, the initial factors were age, occupation, and then race.  Eventually, it became clear that other factors involved the status of one’s immune system, or “underlying health conditions,” and then it was reported that only 12% of the US population was metabolically healthy, meaning not obese, not diabetic, not with high blood pressure, etc.  It is as if 88% of the population has a different kind of acquired immune deficiency, related to diet (and access to healthy foods), environmental and living conditions (and access to places in which to exercise), and other cultural, meaning economic, meaning political, factors.  Third, HIV/AIDS is still with us.  Last year, 690,000 people died from AIDS, and 1.7 million people were infected, worldwide.  And, while there is PrEP if you think you are at high risk, there is, still, no vaccine, and no cure, yet.  In the US alone, according to the CDC, there were about 38,000 new HIV diagnoses in 2018, most of them in the South.

Fourth, thirty-five years ago, people who were affected by HIV/AIDS were frustrated by a President who would not mention the disease.  At the time, his political party was focused on family values and personal responsibility, which meant a disease cast as gay and preventable with celibacy (and safe sex) needn’t be a national issue, they thought.  Today, the president, of the same party, certainly talks about the current pandemic, but he has spent a lot of time saying that it will go away (it hasn’t yet, almost six months later), and that it is getting better (when it is getting worse).  And personal responsibility has devolved to NOT wearing protection, because, it is argued, nobody can tell another U.S. American how to do anything.  Four decades ago, conservatives blamed the sexual activities of gay people for spreading AIDS, mistakenly thinking it was a gay disease.  Today, as soon as the lockdowns are lifted, apparently heterosexual young people mingle in close proximity, close enough for this pandemic’s transmission–without wearing any protection.  In terms of disease transmission and prevention, they are acting as irresponsibly as Republicans used to allege gay people were, except this time, all they need to do for transmission is stand near each other, unmasked, in, say, a pool, while a beach ball bounces off their heads.

Finally, I think about how much changed in the US because of AIDS activism, a range of activisms.  By becoming experts in the disease—no one else would initially, so they did—activists changed healthcare.  By highlighting their exclusion from their partners’ hospital beds, activists pressed for recognizing lifelong commitments among lesbian and gay people, known today as gay marriage, and legal nationwide.  Most of all, though, because so many affected were artists, they could create verbal, visual, musical, and, in the end, emotional responses to living with or being stalked by the disease.  There was some hope, then, that there might be angels in America.  I can imagine the current Covid-19 pandemic–which is also creating the second great recession, or worse, in just 12 years–is going to have even bigger social consequences, and, I say hopefully, just as positive.  If so, though, we are going to have to do the imaginative work of the artists lost to AIDS to create affecting representations of living with and being stalked by both a disease and an economic collapse.  Gran Fury said “Art is Not Enough,” but, actually, it is a start.

Déjà Vu All Over Again

(Director’s note: Dr. Abel A. Bartley, a native of Jacksonville, Florida, and a graduate of Florida State University, where he received his BA in History and Political Science, his MA in History, and his Ph.D. in African American and Urban History, is Professor of African American and Urban History at Clemson.  He is the author of Keeping the Faith: Race Politics and Social Development in Jacksonville, Florida, 1940-1970 (2000), Akron’s Black Heritage (2004), and In No Ways Tired: The NAACP’s Struggle to Integrate the Duval County Public School System (2014), which received the Stetson Kennedy Prize for best book on civil rights history.  In 2004 he accepted the challenge of running the African American studies program, then a minor, at Clemson University.  In 2007 he created the new Pan African Studies Program, a major, at Clemson, and served as its first Director.  This is Clemson Humanities Now.)

I cannot believe that we are here again. It seems as if I am stuck in a bad dream that keeps recurring. I remember my older brother marching in 1980 after a 33-year-old African American insurance salesman named Arthur McDuffie was murdered by police officers in Miami, Florida. McDuffie was accused of committing the grievous crime of running a traffic light. The officers originally reported that McDuffie died when he crashed his motorcycle, but the coroner found that his injuries were not consistent with a crash. Eventually, the officers were accused of using their flashlights to beat McDuffie. We all believed that, with the autopsy, the officers would be convicted of murder. However, amazingly, the officers were acquitted. African Americans and their white allies protested and marched, but black deaths like McDuffie’s continued.

In 1991 a video surfaced of Black motorist Rodney King being brutally beaten by four Los Angeles policemen. After a change of venue and lengthy trial, the officers were all acquitted. Like my bad dream, this sort of scenario does not end. African Americans get killed by the police or their acolytes, we march, we pray, we cry, we make high sounding declarations and yet the bad dream endures.

As Einstein reputedly stated, the definition of insanity is doing the same thing and expecting a different result.  Why, then, do the juries keep coming back with the familiar refrain, “Not guilty!”?

After years of studying this phenomenon, I am convinced that we are aiming at the wrong target. Our anger, though understandable, is misplaced. We are blaming the police for our problem when we should be blaming the courts. Blaming the police for all of the wrongs misses the point. The police are just a part of a systemic racism that has historically oppressed people of color. African Americans were brought to America to serve as slaves, and the American courts have never come to terms with seeing us in any condition other than that of servitude.  Consequently, Black life, progress, and, most importantly, deaths have never been celebrated or properly mourned by most Americans. It is obvious that the courts have never viewed or treated African Americans as equals under the law.

During Reconstruction many progressive-minded Americans attempted to write equality before the law into American jurisprudence. The resulting 14th and 15th Amendments set the precedent that all Americans are to be treated fairly and equally under the law and the courts, and provided them a powerful weapon to protect their rights. Unfortunately, the courts quickly undermined the Amendments. Despite the legally sound arguments of lawyer Albion Tourgee in defending Homer Plessy’s right to ride anywhere he could afford on a train in the South, the courts said that African Americans could be separated and subjected to disparate treatment.  Therefore, White Americans were given carte blanche to establish an elaborate system of legalized segregation, now known in the South as Jim Crow, which separated and subjugated African Americans in all social, political, and economic settings. The courts allowed the South to establish capricious, unfair restrictions on the franchise, unilaterally disarming African Americans of their most powerful weapon.

The nation’s worst period of racial violence ensued. Of the violence Frederick Douglass wrote, “If American conscience were only half alive, if the American church and clergy were only half christianized, if American moral sensibility were not hardened by persistent infliction of outrage and crime against colored people, a scream of horror, shame and indignation would rise to Heaven … Alas!, even crime has power to reproduce itself and create conditions favorable to its own existence. It sometimes seems we are deserted by earth and Heaven–yet we must still think, speak, and work, and trust in the power of a merciful God for final deliverance.”  Since that period, African Americans have been the constant victims of extralegal killings, whether at the hands of the police or their devotees. From Maceo Snipes to Emmett Till, the one constant that keeps these killings going is the assurance that the courts will acquit the accused. Therefore, we continue to get angry at the police, the lynchers, or the ugly racist when what we really need to do is reform a racist judicial system that consistently vindicates the accused if the victim is black and the defendant is white. I suspect that if police, racists, or others really feared that they would be held responsible for their actions, they would think twice about whether it was worth choking a man over a counterfeit $20 bill, shooting a 12-year-old boy over a fake gun, shooting Trayvon because he walked through a white neighborhood, murdering a 26-year-old because he jogged through your community, murdering a woman in her house playing video games with her nephew, or murdering a sleeping EMT.

So how do we reform the system?  Let me suggest a few modifications.  First, we must reform the laws to ensure that police are subject to the same use-of-force standards as the public. Secondly, the laws have to be changed so that police are no longer allowed to stop people without a legally defensible reason. African Americans must become much more active on juries. Next, the Supreme Court must remember its role. The courts speak for those who are voiceless. They must maintain their objectivity and protect the unprotected. Lastly, juries have to see themselves in the victims; they have to be able to put themselves in the victim’s shoes. Whites see police officers as protectors, while African Americans see policemen as enforcers of an unjust society. Therefore, when police face juries, the juries have to see both sides.

I suspect that we would see dramatic reductions in these crimes if those who were horrified by them would mobilize.  If the politicians who are so concerned about law would spend half of that energy on justice, we could solve this problem.  But, as Frederick Douglass wrote, “But alas! even crime has power to reproduce itself and create conditions favorable to its own existence.” So, I say protest against police violence, make your speeches, say your prayers, remove your statues, change your building names, and make your resolutions. I applaud all of those who have demonstrated their outrage and participated in protest activities demanding change. Nevertheless, until and unless our courts are ready to hold responsible those who wantonly kill our brothers and sisters, we are just experiencing a recurring bad dream. Those in power know that if the law applied equally to all and if everyone had unfettered access to the franchise, it would mean a real realignment of power in this nation and a challenge to their power and privilege. So, I suspect we will rearrange the furniture, change the drapes, fulminate, and yet nothing meaningful will change. As a historian, I fear that after these rallies and protests end, we will go back to “American normal”—that is, back to this same bad dream.

Please America, reach into your soul and prove me wrong this time!

 

Reading the Classics under Lockdown

(Director’s note: Elizabeth Rivlin teaches in the English Department, and her research interests include the history of Shakespeare in American literature and culture, especially cultures of reading; theories of adaptation; and early modern drama and prose. Her current book project is titled Shakespeare and the American Middlebrow, for which she won an NEH Summer Stipend.  This is Clemson Humanities Now.)

With the stay-at-home orders that arrived in March came a slew of advice about what to do with all the unexpected, homebound leisure that some Americans found themselves with. Reading seemed to be high on the list, judging by a flurry of media pieces with titles like “The Lockdown List: Books to Read During Quarantine.” While some of the recommendations were for new and contemporary literature, the “classics” and “the great books” also had their cheerleaders. Take, for example, the well-publicized “Tolstoy Together,” a project in which the writer Yiyun Li read and discussed War and Peace, book club-style, with members of the public.

Shakespeare, too, has gotten love during the lockdowns. Patrick Stewart read the Sonnets aloud to us daily on Facebook, and, inspired by a Twitter meme claiming that Shakespeare wrote King Lear while in quarantine for the plague, prominent Shakespeare scholars like Stephen Greenblatt, Emma Smith, and James Shapiro meditated on what Shakespeare has to tell us about being creative in or simply living through a pandemic. Others offered direct advice to help readers tackle Shakespeare. Acknowledging that “many people have said they find reading Shakespeare a bit daunting,” Emma Smith returned with “five tips for how to make it simpler and more pleasurable,” her last bit of counsel being “Don’t worry.” Allie Esiri recommended particular readings from Shakespeare for the lockdown period, on the theory that “Reading Shakespeare concentrates the mind which we are all in need of, lending both a challenge and a reward.” Meanwhile, in the conservative publication The Spectator, Chilton Williamson, Jr. guiltily confessed that “it is too easy to put off reading” Shakespeare and pledged to “rectify” that by putting The Complete Works of Shakespeare on his pandemic reading list.

This lockdown-inspired burst of enthusiasm for reading, and in particular for “classics” like Shakespeare, is just the latest manifestation of a persistent current in American life that has linked reading to a quest for self-improvement in which edification and pleasure are supposed ideally to mix. I’m working on a book, Shakespeare and the American Middlebrow: Reading Publics, 1878-Present, that looks closely at institutions that promoted the reading of Shakespeare with just such aims: Chautauqua is a small town in upstate New York that for several decades at the turn of the last century was the center of the burgeoning self-education movement in the United States; The Book-of-the-Month Club pioneered the mail-order sale of books at large scale in the early- to mid-twentieth century; and the Great Books Movement generated a list at mid-century of what it dubbed “the greatest books of the western tradition” and sold them to the American public in 54 volumes. All three of these institutions promised that learning to read books of a certain “quality” or canonical status would create individual mobility and collective uplift, and all three promised to help readers deal with the perceived difficulties and obstacles that accompanied this ambitious reading.

Arguably, no one entity today has the impact on Americans’ reading habits that these institutions did in their heyday (although Oprah’s Book Club has more recently enjoyed something of the same influence). Even on the diffused internet, however, many voices dispense familiar-sounding wisdom, especially the assurance that the reading of culturally sanctioned, esteemed works is therapeutic and even curative in a time of crisis. In this, they hearken back to earlier reading initiatives, for example, the Great Books Program, which in the midst of the Cold War promoted self-educational reading as a means for the American public to rise to the level of an enlightened democratic citizenry and thereby defeat the insidious threat of communism. The dangers may present themselves in different shapes in 2020, but the prescription remains the same.

One thing that does seem to be changing is Americans’ expanding sense of the kind of reading that can enlighten and improve: witness the recent outpouring of #BlackLivesMatter reading lists. When so much seems to fall outside of people’s control, the lure of reading, whether it’s James Baldwin, Shakespeare, or both, is still that the self can be improved, and that if progress is possible on a personal level, perhaps it can also be imagined on a collective scale.

COVID-19 digital contact tracing shows how badly we need data-literate humanists

(Director’s note: Jordan Frith, Pearce Professor of Professional Communication in English, researches mobile technologies, social media, and infrastructure, particularly where those topics intersect through questions of space and place. He’s the author of 30 peer-reviewed articles on these topics and three books, the most recent of which was published by MIT Press in 2019.  This is Clemson Humanities Now.)

Back in March when much of the world was starting to realize that the COVID-19 pandemic was not going to just disappear, various governments and corporations began exploring plans for digital contact tracing applications. Digital contact tracing involves using the data from mobile phones to track whom people come in contact with on a daily basis, so if someone is later diagnosed with COVID-19, anyone they shared a space with can quickly be notified. The apps mostly work through Bluetooth, meaning the phones of anyone who signed up for the contact tracing app would constantly broadcast a Bluetooth signal to nearby phones and record the unique ID of each phone within Bluetooth range. Those unique IDs would link to a database of users, which is how people would be notified if a positive COVID-19 case did occur.
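Since these mechanics matter for the argument that follows, here is a minimal sketch of that centralized Bluetooth model in Python. It is an illustration of the design described above, not any real app’s code: the names (ContactLog, HealthAuthority) and the 14-day lookback window are hypothetical assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ContactLog:
    """Runs on each phone: records the unique IDs it sees over Bluetooth."""
    own_id: str
    contacts: dict = field(default_factory=dict)  # other phone's ID -> last time seen

    def record_nearby(self, other_id: str, when: datetime) -> None:
        # Called whenever another participating phone's broadcast is in range.
        self.contacts[other_id] = when

@dataclass
class HealthAuthority:
    """The central database linking unique IDs to users, as described above."""
    registry: dict = field(default_factory=dict)  # phone ID -> ContactLog

    def register(self, log: ContactLog) -> None:
        self.registry[log.own_id] = log

    def notify_contacts(self, positive_id: str, window_days: int = 14) -> list:
        """On a positive diagnosis, return the IDs of everyone seen recently."""
        log = self.registry[positive_id]
        cutoff = datetime.now() - timedelta(days=window_days)
        return [pid for pid, seen in log.contacts.items() if seen >= cutoff]

# Usage: two phones cross paths; one owner later tests positive.
authority = HealthAuthority()
alice, bob = ContactLog("id-alice"), ContactLog("id-bob")
authority.register(alice)
authority.register(bob)
alice.record_nearby("id-bob", datetime.now())
print(authority.notify_contacts("id-alice"))  # -> ['id-bob']
```

For contrast, the decentralized design that Google and Apple eventually shipped keeps the contact log on the phone and broadcasts rotating anonymous identifiers, a choice aimed at precisely the privacy and adoption problems discussed below.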

With the COVID-19 outbreaks across the globe, digital contact tracing began to be talked about as perhaps the single most important tool for containing outbreaks without going into full lockdown. A successful system would enable health officials to quickly identify case clusters and quarantine people who were at risk. And in a few cases, digital contact tracing has been useful, particularly in places like South Korea that put these plans into place early in the pandemic and got significant public buy-in. But many of the much-hyped digital contact tracing plans have mostly failed. For example, the UK digital contact tracing app has not made much of an impact, and the United States still does not have a robust federal digital contact tracing system. So why have these systems not succeeded despite getting so much public attention? I argue that it’s partly because the plans relied too much upon data as a good in itself, without the critical reflection necessary for the success of most big data projects. That reflection and interrogation are where the humanities come in. As I explain below, the failure of digital contact tracing projects at the time we need them most shows why we need more data-literate humanists.

Digital contact tracing is obviously somewhat unique because it’s not every day we have a global pandemic that has killed hundreds of thousands of people. But the hype surrounding digital contact tracing is less unusual when considered in the greater context of what has been called the “big data revolution.” The supposed big data revolution represents two major shifts in society in the 21st century. The first is that we are producing far more data than ever before about everything from energy usage to mobility patterns. For some context, according to Viktor Mayer-Schonberger and Kenneth Cukier, “Google processes more than 24 petabytes of data per day, a volume that is thousands of times the quantity of all printed material in the U.S. Library of Congress.” And that’s just Google. The second development is closely related. We now have the vastly improved computing power necessary to process and act upon huge datasets. These two trends have combined to usher in an increased reliance on data in many walks of life, and according to some proponents, if people just collect enough data, the “data can speak for itself.” This blind belief in data, without diving deeper into the unique contexts of data sources, is part and parcel of the hype we saw about digital contact tracing.

There’s nothing wrong with the use of large-scale datasets to solve problems. Data-informed decision making is often far superior to just winging it and guessing. However, the hype surrounding big data often went too far and did not involve humanists who could ask the questions that might have helped data projects succeed. After all, despite beliefs that data could supposedly speak for itself, that can simply never be the case. Data always needs to be interpreted to be acted upon. In addition, datasets are often a partial picture of a phenomenon, and data collection also often has privacy and personal liberty implications that are not closely enough considered. As Rob Kitchin argues, data projects are often best served by combinations of data scientists and humanists/social scientists who can interrogate the data and how it is used to inform decision making.

COVID digital contact tracing is an important contemporary example of what critical, data-literate humanists can add to data projects. After all, these digital contact tracing apps, despite all their hype, faced some major issues from the very beginning that were never fully addressed. For one, as anthropologist Genevieve Bell argued, the design of these apps matters significantly for user privacy, and governments did not communicate design choices clearly to the public. The lack of explanation of where data was coming from and how it was stored likely hampered public adoption, and contact data is only useful when enough people buy into the system to provide a comprehensive dataset. Secondly, and maybe most importantly, the design choices in these apps were inherently exclusionary. They required mobile phone data (unlike a human-driven contact tracing system), so children, some elderly people, the homeless, and other non-adopters simply could not participate. In addition, many cheaper mobile phones do not have the low-energy Bluetooth capabilities necessary for the applications. People without the right devices are simply left out of the dataset, which can have major consequences for epidemiological tracking. And finally, digital contact tracing itself cannot work without a rapid testing infrastructure to quickly identify positive cases and—in my opinion—a social safety net that can financially support people staying home from work if they receive a notice that they could be infected. Without these broader systems in place, the data provided through these apps, and how actionable their alerts are, was always going to be limited.

The issues in the previous paragraph have all limited the efficacy of digital contact tracing, and that’s not even mentioning the major surveillance concerns surrounding location data and whether these systems will be dismantled once (well, maybe if) the pandemic is under control. After all, Congress just renewed parts of the Patriot Act almost 20 years after it was first passed in the shadow of 9/11. But regardless, many of these issues could have been addressed through more critical engagement and less of a blind embrace of data. In other words, I argue that part of the reason these projects have not succeeded is that they often did not involve the types of critical literacies that are the expertise of the humanities. Had the plans for these systems been interrogated from the start, issues of inequality, concerns about design and data storage, and fears about surveillance could all have been addressed and communicated to the public. But instead, governments too often pushed forward without the types of critical analysis required to make these plans work.

In conclusion, I’m using digital contact tracing here as an example, but I could point to so many other data projects to make the same case. As humanists, we need to play a role in how data is collected and how data is interpreted. We also have to not be afraid of data. Humanists don’t have to be mathematicians or data scientists, but some of us need to know how to question datasets, how to explore where data comes from, and how to interrogate what it may be missing. And we need to move past antiquated quantitative/qualitative divides that act like numbers are outside the purview of the humanities. Most data projects aren’t going to have major public health consequences like a digital contact tracing app. But regardless, we need to fight for a place at the table for future data projects in both industry and government. We have a role to play to shape the collection and interpretation of data of various types.

 

Peace not Patience

(Director’s note: Pauline de Tholozany, an Assistant Professor in the Department of Languages, specializes in 19th-century French literature. Her first book, L’Ecole de la maladresse (Paris: Honoré Champion, 2017), is a history of clumsiness in the 18th and 19th centuries. She is now working on a second book that focuses on impatience, a feeling that we tend to decry; she is interested in why we do so. Her contribution to this series, posted on Medium.com, is on impatience.  She reports that this project as a whole was inspired by Clemson students’ activism and their legitimate, peaceful, and powerful impatience.  This is Clemson Humanities Now.)

 

An Accurate Description of What Has Never Occurred

(Director’s Note: Stephanie Barczewski is Professor of History and Carol K. Brown Scholar in the Humanities. She is a specialist in the history of modern Britain. Her most recent book is Heroic Failure and the British (2016); her next book, Englishness and the Country House, is forthcoming from Reaktion Books in 2021.  The title of her essay comes from Oscar Wilde’s essay “The Critic as Artist” (1891): “To give an accurate description of what has never occurred is not merely the proper occupation of the historian, but the inalienable privilege of any man of parts and culture.”  This is Clemson Humanities Now.)

The response to the coronavirus pandemic has relied upon academic expertise from obvious sources: medicine, epidemiology, biology and other scientific disciplines. Occasionally, however, historians have been summoned to provide insight, mostly related to the “Spanish flu” pandemic of 1918. For example, they have pointed out that the very name of the pandemic is misleading, because it resulted from Spain’s neutrality in World War I, which meant that its newspapers were the first to report the outbreak. Even today, the source remains unknown: it was most likely the United States but could also have been China, France or Britain. This knowledge is valuable not only for accuracy’s sake, but because it points out the damage that can be done by associating a deadly virus with a particular country. This has been useful for combating President Trump’s efforts to label Covid-19 the “Chinese” or “Wuhan virus,” which can lead to the demonization of people of East Asian ethnicity.[2]

Historians have also examined the efforts made to reduce the impact of the 1918 flu through “non-pharmaceutical interventions” (NPIs). Academic studies showing that American cities that imposed NPIs more aggressively saw reduced death rates have been frequently cited on social media.[3] But I would like to introduce a counter-example: in Britain, almost nothing was done to combat the spread of the virus, as the Great War was deemed a greater threat to public safety and the national welfare. The general attitude was summed up by Sir Arthur Newsholme in a report compiled for the Royal Society of Medicine in 1919: “There are national circumstances in which the major duty is to ‘carry on’, even when risk to health and life is involved.”[4] We might then assume that the death toll was higher in Britain than it was in the United States. But this was not the case: Britain, with minimal NPIs, suffered around 228,000 deaths out of a total population of around 42,000,000 (or .005 of the total population). The United States, with far more NPIs, suffered 670,000 deaths out of a total population of 104,000,000 (.006). So are we to conclude from this that not introducing NPIs saved roughly 42,000 lives in Britain, the difference between .005 and .006 of its population? Or conversely that NPIs cost roughly 105,000 lives in the United States? This would be ridiculously simplistic, but it is no more so than the “Philadelphia had a parade and St. Louis did not” comparisons that have frequently been used to justify aggressive NPIs.
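To make that counterfactual arithmetic explicit, here is a minimal sketch in Python. It uses only the rounded estimates quoted above, and the “counterfactual” simply applies one country’s death rate to the other’s population:

```python
# Back-of-the-envelope arithmetic for the 1918 figures cited above.
uk_deaths, uk_pop = 228_000, 42_000_000
us_deaths, us_pop = 670_000, 104_000_000

uk_rate = uk_deaths / uk_pop   # ~0.0054 of the population
us_rate = us_deaths / us_pop   # ~0.0064 of the population

# If Britain had suffered the US death rate, and vice versa:
uk_lives_saved = us_rate * uk_pop - uk_deaths   # ~42,600 "saved" by skipping NPIs
us_lives_cost = us_deaths - uk_rate * us_pop    # ~105,400 "cost" by adopting NPIs

print(f"UK rate: {uk_rate:.4f}  US rate: {us_rate:.4f}")
print(f"Counterfactual lives 'saved' in Britain: {uk_lives_saved:,.0f}")
print(f"Counterfactual lives 'cost' in the US: {us_lives_cost:,.0f}")
```

The single-variable swap is exactly what makes such comparisons ridiculously simplistic: it ignores everything else that differed between the two countries, from demography to the timing of the waves.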

I am not attempting here to assess the efficacy of NPIs, a subject on which I am not qualified to offer an opinion, but rather I am trying to make a suggestion about how history can be most effective in helping to shape the response to a present-day crisis. My point is that the past rarely offers simple answers, because the evidence often points in multiple directions. History may be doomed to repeat itself, but this is not because we ignore it, but because of how difficult its lessons are to extract. In 1918, there was a debate over whether wearing masks could slow the spread of influenza. In some large cities, mask usage was widespread, and, as one Red Cross public-service announcement put it, “the man or woman or child who will not wear a mask” was seen as “a dangerous slacker.” But other voices argued that masks did not work, were uncomfortable or detrimental to commerce. Some public officials refused to wear them, and in San Francisco there was even an “Anti-Mask League.”[5] A century later, the “science” has not much changed; it is instead the culture of mask-wearing that has evolved in recent months, perhaps to fit people’s political predilections.

If history is going to provide understanding of the Covid-19 pandemic, it must do so in a way that acknowledges the unknowability of the past alongside its knowability; history tells us not only what we do know, but also what we do not. If we fail to acknowledge this, it will mislead more than it will enlighten. As a guide to present-day decision-making, it therefore requires more than facile comparisons between then and now. And with its eye on the long view, it may be better suited to helping to provide the answers to the big, long-term questions about the impact of the pandemic rather than the shorter-term ones such as what measures might be effective. In this moment where we are increasingly recognizing the historical impact of the privileging of certain groups over others, we should be unafraid to acknowledge how the ways in which western societies have chosen to react to a threatening new disease are having an impact on the rest of the world. Is the near-obsessive focus on Covid-19, in other words, yet another embodiment of the elevation of the lives of wealthy westerners over those of people in the developing world? While attention and resources are directed exclusively at Covid, millions may die from other infectious diseases such as tuberculosis and malaria, the treatment of which is currently being set back for years if not decades, while millions of children will go unvaccinated for diseases like polio that were, prior to this crisis, on the brink of elimination. And this is without taking into account the effects of the massive economic disruption caused by the western response to the pandemic, which will cause widespread hunger and social unrest. This, I fear, and not our failure to achieve the eradication of the virus through lockdowns or universal mask-wearing, is the most dire way in which our current history will take the form of a repetition of the past.


[2] See https://ajph.aphapublications.org/doi/10.2105/AJPH.2018.304645.

[3] The most-frequently cited study is: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3291356/.

[4] https://www.bbc.com/news/in-pictures-52564371. See also https://www.historyextra.com/period/first-world-war/spanish-flu-britain-how-many-died-quarantine-corona-virus-deaths-pandemic/.

[5] https://www.washingtonpost.com/history/2020/05/06/mask-protests-flu-san-francisco-coronavirus/.