January 6, 2021, Epiphany.

January 7, 2021

[Author’s note: Lee Morrissey, Founding Director of the Humanities Hub, and English Professor at Clemson, on media, civil war, and democracy in seventeenth-century England, and early 2021. This is Clemson Humanities Now.]

In 1995, when I was still a graduate student, I was lucky enough to get a job as an assistant professor of English, and I moved to Clemson from Manhattan, where I had been studying. Over the years, people have understandably inquired about that transition, sometimes out of curiosity, sometimes with a presumptuous knowing chuckle. Of course, there were many differences between my experiences in the nation’s largest city and in a town of 11,000 year-round residents, and some of them were improvements (if only because living on a salary in Clemson is better than living on a $10,000 stipend in upper Manhattan).

One difference, though, one that no one has been able to anticipate, has weighed on me for decades now: the utopic excitement about the World Wide Web I encountered when I joined Clemson’s Department of English. The internet was already more than twenty years old when I started at Clemson, although the World Wide Web was nearly brand new. Many of my new colleagues were convinced not only that it represented the future, which it did, but, more importantly, that it boded well for democracy.

My own research in English literature spanned the seventeenth and eighteenth centuries, a period that runs from Civil Wars (culminating in the execution of a monarch) to the Enlightenment. In other words, my ‘field’ covered the protracted emergence of representative democracy from the ashes of a civil war and interregnum. As a result, my colleagues’ faith in this new technology struck me as naïve at best, even as I shared their hopes for democracy.

Nothing about the democratization of print in the 1640s suggested that the World Wide Web was going to be an unvarnished good for democracy as my colleagues understood it. The experience with my colleagues changed my career: I wrote a book on the topic, The Constitution of Literature (Stanford UP, 2008), telling the story across nearly a century and a half, exploring different ideas of democracy as they unfolded in that period, and arguing in the end that “today’s discussion of the World Wide Web needs to consider each of these facets of the development of literary criticism. For the potential political instability, the possible emergence of critical protocols, and debates over models of democracy will all unfold for digital media, and the constitution of literature can provide a powerful, instructive analogue” (197).

Legacy broadcast media in the post-Fairness Doctrine era played their part, but the impact and tenor of the Trump presidency have been facilitated by the openness and anonymity of the World Wide Web, especially what was once called Web 2.0, the web that ‘you’ could create, which was said to be the most democratic form (non-linear and non-hierarchical, they said) of this most democratic technology. At the time, pointing out that there might be a problem with such an understanding of democracy was mistakenly seen as conservative, even though one could already see, first, that the technology had system requirements that priced participation out of reach for many people; second, that the architecture was shaping the so-called choices; and third, that the algorithms were replicating real-world biases (even as anonymous ‘users’ were enforcing them).

As those of us (not so many of us, perhaps) who study seventeenth-century English history and literature knew, the English Civil Wars coincided with, and were accelerated by, a new openness and accessibility of print. Eventually, the English political system stabilized, and a series of legal conventions (copyright, free speech protections balanced by defamation accountability, etc.) developed to stabilize the political effects of print, too. As I wrote in The Constitution of Literature, “Eighteenth-century literary critics may have prioritized stability over openness. But early English literary critics describe their visions of reading against the backdrop of the profound instability created, they believe, by other modes of reading. It must also be said that stability is likely preferable to protracted Civil War; even the most libertarian digerati is probably concerned about the democratizing effect the internet seems to have offered global Jihadists. Moreover, within that eighteenth-century preference for stability there is a wide range of differing opinions on how to organize the constitution of literature, on how to understand literacy and political philosophy, and even on how to understand stability itself” (196-7).

This analogue, from print, has been on my mind throughout the Trump presidency, increasingly during 2020, but never more so than on January 6, 2021, Epiphany, when Twitter, Facebook, and YouTube banned, albeit temporarily, the President of the United States from participating in Web 2.0. In order to preserve one form of democracy (the deliberative, representative, established one), these companies needed to suspend their own form of anti-procedural, rules-free, participatory democracy. Or, to put it another way: in the end, the democracy offered by Web 2.0 had been revealed as a threat to democracy IRL. Of course, we had long suspected that it was a threat, truthiness and whataboutism having replaced reflection, even before Cambridge Analytica and Russia’s IRA interfered in the 2016 election. But this is what made January 6, 2021 an epiphany: a new world is struggling to be born, and the supposedly democratizing technologies were enlisted in preventing it (in a seemingly non-linear and non-hierarchical way). Imagine if the three wise men had been sent to Bethlehem by Herod, Twitter, and GPS to trample through the inn, scare away the family, and take selfies.

The question, unfortunately, is whether yesterday represents the end of such behavior or its beginning, the culmination or the starter’s gun. Which epiphany or revelation it turns out to be will depend not only on whether we continue to treat the president as if he were merely the country’s ‘cranky uncle.’ It will also depend on a reexamination of the technologies of representation. The good news is that the seventeenth- and eighteenth-century example shows us we have been here before. Regrettably for us, it took England a century before the final Jacobite Rebellion was quelled, and nearly as long for early copyright laws to be created. Fortunately, though, we have the lessons of their example to draw on; there are vaccines, already developed for print, offering readily adaptable treatments today.

Earlier, I mentioned how arriving at Clemson from Manhattan and finding colleagues in an English Department who held techno-utopic hopes for the democratic possibilities of digitized representation changed my career. I mean, for the better. If you had asked me in my earliest post-doctoral years what I expected to be researching, I would have told you about a continuation of my dissertation, on authors who were also architects, topics in which I remain interested. It would be several years before The Constitution of Literature formed as a project for me. It is today my most-cited monograph, and it would not exist had I not been so surprised by my colleagues’ conviction that we were about to enter a golden age of the democratic word. Two decades later, the storming of the Capitol yesterday, by people egged on by the president, and by QAnon, shows how much we all need to sift these new media. My own experience, studying civil war and interregnum England, also shows how much these media were not new.
