
Apocalypse Now? Yes, Please:
Notes on Navigating the Digital Revolution 

by Michael P. Murphy

Header image from the first AI comic, "The Abolition of Man."


In her 1924 essay “Mr. Bennett and Mrs. Brown,” Virginia Woolf famously observed that “On or about December, 1910 human nature changed. All human relations shifted.” There is some hyperbole here, of course, but Woolf keenly observed a creeping cultural irruption—an arguably apocalyptic development wrought primarily by the Industrial Revolution—and she subtly wove its manifestations, as a decisive relational attribute of what would later become literary modernism, into her watershed novel To the Lighthouse (1927). The novel describes two days in the life of the Ramsay family, one in 1910 and the other in 1920, and its main achievement lies in the virtuoso, stream-of-consciousness innovations that so richly illuminate the interior lives of its anxiety-ridden characters.

 

No less an accomplishment in To the Lighthouse is its tracking and narration of time. Woolf renders the many convulsions that take place in the years between 1910 and 1920 with a realist’s sense of lyricism, like driftwood on the sea that suddenly appears (and then piles up, implacably and demandingly) on shore. For his part, W.B. Yeats knew well the modern convulsions of the early 20th century—the traumatic shadows cast by the mechanized carnage of World War I, the unfolding Russian Revolution, and, closer to home, the drama of the Irish War of Independence. All of these events would embody and beget the various forms of fragmentation that would subsequently describe life in the 20th century. Yeats captured the apocalyptic intensity of it all in his monumental poem “The Second Coming,” published in 1920.

 

The Digital Revolution, for all of its ingenuity and promise, carries with it the freight and specter of a catastrophic apocalypticism heralded by these two literary artists. Sure, people these days are making apocalypse jokes like there’s no tomorrow (all jokes intended), but the insights of Woolf and Yeats inform late modern life in ways that are now doubly prescient, prophetic even. I disagree with Woolf that human nature changed—I know what she means, and that is a different discussion—but it is quite clear that human relations certainly did. On or about December, 1910 human relations shifted. On or about September, 1959 (with the invention of the silicon chip), human relations shifted. On or about October, 1995 (when we started using e-mail as our main form of communication), human relations shifted. On or about June, 2007 (with the release of the first iPhone), human relations shifted. On or about October, 2008 (when Facebook began to go viral), human relations shifted. On or about November, 2022 (when most of us became aware of ChatGPT, large language models, and the possibilities of and robust developments in Artificial Intelligence [AI]), human relations shifted. Of course, Woolf’s point is that human relations are constantly shifting in all our days; but there are certain moments, viewed in sober retrospect, when pivotal, definitive ruptures to history are uniquely disclosed. As Pope Francis astutely observes, we dwell presently in such an era, or as he puts it, “we are not living in an epoch of change, but in a change of epoch.”

 

The Digital Revolution is certainly the biggest sea change since 1438 (when Gutenberg’s printing press came “online”), but the AI aspect that accompanies it (and is ascendant today) bridges the gap between the Industrial and Digital Revolutions. AI is more comprehensive in that it moves in both excarnate and incarnate ways; but there is something very Star Trek, something very “first contact,” going on here. I often think of the famous bumper sticker that depicts the cascading progression of human evolution in a series of figures. On the left side is a curious monkey; on the right, an upright person walking confidently. People have had fun playing with the right-side image—adding a golfer, a kayaker, or even Darth Vader as the present evolutionary point of the process. For my part, observing as I do every human inseparable from her cell phone, I have imagined for the last few years a new right-side image: a person, bent back over (but in a new way), gazing incessantly at a cell phone that has mysteriously sprouted from, and is clutched tightly in, the right hand.

 

These days, though, my thoughts are a bit darker and I start thinking that the robots are taking over. I recently completed my biannual security training at LUC, and if the folks who make those cherished HR training videos are to be taken at their word, the robots have gained the upper hand. Is my new right-side image one of a Terminator-like robot, a supercilious grimace on its cyborgian face, who proudly wields a new-found power too easily forfeited by its human creators? Or is it one of Klara, Kazuo Ishiguro’s robotic Artificial Friend (AF) who, in a way that inspires true pathos, captures our hearts in his recent (and uniformly superb) novel, Klara and the Sun? As ever, it is likely both.

 

With this prologue in mind, what follows are three points of reflection, three points about faith and culture in the era of science, digital culture, and AI. First: Digital Culture is Making us Increasingly Servile. The clear nod to Hilaire Belloc’s important (if uneven) 1912 book, The Servile State, is intended and fully in play here. Written at the height (and in the traumatic aftermath) of the first phase of the Industrial Revolution, Belloc’s book has perhaps even more to say to us today. A text dear to both Libertarians and Catholics possessed of a Mediaeval sensibility, Belloc’s insights anticipate the knotty neoliberalism that describes our current cultural and economic moment. Because of this, the book speaks to people with divergent political and philosophical commitments. Belloc locates the servility he decries in the burgeoning (and insidious) partnership between big business and the state. He posits that, much as slavery propped up economies throughout human history, British (and other) citizens dwelling in this early form of capitalist “corpocracy” are no less enslaved. Belloc’s “servile state” is chiefly the consequence of governmental leadership selling out its duty in exchange for pounds sterling, a pattern he observed while serving in the House of Commons. In this scenario, citizens are worked like galley slaves in order to sustain a system that benefits the privileged few. Belloc renounces this partnership and exhorts readers against the instability and vampirism of the capitalist nature that engenders and sustains it.

Servility as it appears 100 years after Belloc is no less subtle or dangerous. But one important difference today is that, since big business is just about finished devouring the state (along with other mediating institutions), very few firewalls remain that might shield the individual person from the raw power of markets. In this sense, the aggregation of our shared experiences of screen saturation and too-close-to-each-other cyber intimacy these last thirty years has produced its own pandemic: a disordered cultural condition that Mark Fisher calls “capitalist realism”—a collectivizing state in which “beliefs have collapsed at the level of ritual or symbolic elaboration, and all that is left is the consumer-spectator, trudging through the ruins and the relics.”[1] Fisher, who was a post-religious materialist, couches his concern in the mental health costs of disaffection brought on by the dark liturgies of neoliberalism, “repetition compulsions” taking place largely in the excarnate venues of digital life. Fisher’s analysis rhymes with Belloc’s—and reveals how comfortable we seem to be exchanging hard-won freedom and agency for servility. Like lemmings off the cliff, we run full speed to the rituals of insatiable consumerism and corpse-cold technocracy that, left unimpeded, will do us in.


 

Which brings us to my second point: the servility that we are currently propagating and performing is an expression of auto-exploitation, a term coined by the Korean-born German philosopher Byung-Chul Han. In a number of works, Han also monitors servility creep, human capitulation to the digital collective, and the increasing weakness of states to protect their citizens. “Citizen” is likely not the right term here, as Han is decidedly “post-citizen” and refers to people instead as “Netizens” who dwell primarily in borderless cyber settings. Netizens are not so much exploited by state power as they are by themselves. This is perhaps the most radical and enduring effect on social life wrought by the Digital Revolution—and one that continues to develop rapidly.

 

For Han, there is a kind of “dictatorship of transparency” that rules social media engagement; and Netizens, already prone to human fallibility and inexhaustible desire, are persuaded to expose themselves voluntarily so as to gain and practice a kind of social capital. Exteriorizing one’s interior life, though, comes with very high concessions—most of which are tied to the unreserved sharing of personal data. Our most personal information is so easily volunteered, and then harvested (impersonally and repeatedly), by minions of the digital panopticon. In this way, Netizens are at once laborers and products. They exist in increasingly disembodied relationships not only with other human Netizens (who are in the same boat), but also alongside created (and then artificially replicated) algorithms, chatbots, and robots whose main purpose is to impersonally construct and sustain novel forms of engagement, productivity, and commerce. This landscape is decisively new and creates an expanding loop that is increasingly less about human community and more about the gravitational pull and volatility of mutually shared auto-exploitation. And let us hasten to add here that “robot” comes from the Czech word “robota,” which best translates to “forced labor.”

Given this environment, it is not surprising to learn that, for Han, the main consequence of auto-exploitation is psychological depression and human burnout. Our relationships are becoming circuits of depersonalized “I-It” encounters (to follow Martin Buber), reduced to “the immanency of vital functions and capacities” that serve capitalist consumption and production as opposed to human spirituality and vitality.[2] The end game of human life in digital culture is not leisure (which clearly is the basis and object of human culture), but an incessant and unreflective series of “achievements”—attainments which are always monetized and financialized. For Han, Netizens have also become “achievement-subjects,” a self-generated species evolved and begotten by habits and acts of auto-exploitation. As human cogs in a digital machine (existing in a terrain increasingly populated and manipulated by artificial cogs), achievement-subjects become entrepreneurs of the self and, in doing so, become alienated from themselves and other selves.

 

A digital screen is a poor representation of the world, and it does not reciprocate emotional life in the ways that humans need—even if (indeed, especially because) the dopamine hits it produces are off the charts. Of course, there are exceptions, but relationships that transpire in cyberspace 1) are not full-bodied, person-to-person encounters (“I-Thou” encounters, to follow Buber) and so are deficient, and 2) are largely ordered to commodity and transaction as opposed to altruism and healthy other-centeredness. These embedded tendencies have exacting costs on spiritual and psychological health, and our experience in these spaces is one of advancing instrumentalization. Saturated screen culture produces and reinforces business (not human) ontologies of commodification and atomization that, again, disconnect us not only from each other, but also from the core of ourselves.

 

Clearly, we cannot simply blame algorithms, netbots, and robots for the predicament in which we suddenly find ourselves. Servility and auto-exploitation require both consent and some type of anthropological engine if they are to gain real, widespread traction. This brings us to my third point, and for this aspect we need to look no further than René Girard and his superb work on mimesis. Girard’s last major book before he died in 2015, Battling to the End: Politics, War, and Apocalypse, is a book-length conversation (with Benoît Chantre) that covers everything from Carl von Clausewitz’s theories of war to the global scale of political polarization precipitated by the failure of mediating institutions. Battling to the End shook the ramparts of French culture but made few waves in North America. Girardians have been arguing (online, of course) about the shape of his apocalypticism ever since.

 

Readers of Girard know that his central view is that mimesis is the great behavioral reality of human persons and cultures. Humans learn from models and imitate them as a natural matter of course, and history is largely a tale of the dramatic expressions of mimetic performance and disruption. While mimesis, according to Girard, is “neutral” and part of our anthropological machinery, its effects tend to be explosive and unbridled.  In Battling to the End, Girard notes how the mimetic loop is contracting at an alarming clip and is reproducing a mostly negative contagion—one that is intoxicating and disintegrating everything from interpersonal relationships to the stability of global democracies.

A main source of this apocalyptic development, of course, is the speed of technology and the multiplying forms of social media. “What drives history is not what seems essential in the eyes of Western rationalists,” Girard declares; “in today’s implausible amalgam, I think that mimesis is the true primary engine.”[3] Mimesis, as it happens, is also the key to AI design and large language models, and it finds life online in “Nudging”—a term that describes the ability of systems or robots (embodied AI) to overtly influence human users with or without their direct consent. The IEEE counsels great care with Nudging and views it as a fundamental ethical issue associated with cyber influence. There are some benefits to Nudging (it is, for example, the root of the autocorrect technology that helps expedite digital communication); but these benefits are quite circumscribed and, at present, narrow. AI systems that possess deep-learning Nudging capacities operate on an advanced mimetic logic (which, of course, is all too human) in order to manipulate and quietly coerce human decisions, a disturbing potential (also human) that must give us pause.

Girard’s insights from Battling to the End not only serve as a corrective to the whiggish utopianism of our Enlightenment-derived historical moment, but also steer us back to authentic Christian humanism, a phenomenon that has happened more than once in human history. Girard, a Catholic convert, observes late in the dialogue: “Christianity invites us to imitate a God who is perfectly good. It teaches us that if we do not do so, we will expose ourselves to the worst. There is no solution to mimetism aside from a good model.”[4] Clearly, for Girard, human anthropology is also theological anthropology. It may seem painfully self-evident (and pious, even), but it is upon the humanity of Jesus, modelled so explicitly in the Gospels, that our focus must be fixed if we are to truly understand how we might navigate this apocalyptic moment as creatures lovingly made in the Imago Dei. To do so we must admit that history follows an apocalyptic logic more than it does the logic of rationalism.

But it is precisely what is meant by “apocalyptic,” a concept that Girard returns to again and again in his work, that needs to be recovered.  The word’s original sense is a theological one—of “revelation” or “uncovering”—more than it is an end-of-days term deployed today to describe catastrophes of all kinds. For me, what is being revealed and uncovered in our technologized moment is what we are learning—or not learning—about ourselves. 


“More than ever,” Girard observes early in Battling to the End, “I am convinced that history has meaning—and that its meaning is terrifying.”[5] Hard to argue with that, especially in a media landscape drenched in the arts of zombie dystopia. But in this observation there is also an invitation. As Pope Francis perceives so incisively in Laudato Si': “We have to accept that technological products are not neutral, for they create a framework which ends up conditioning lifestyles and shaping social possibilities along the lines dictated by the interests of certain powerful groups.”[6] We must come to see that the technocratic paradigm, manifested in our age of digital culture and AI, is a force of our own making that must be reckoned with. Yes, we find out who we are by what we create—and, perhaps more importantly, by how we care for what we create (Mary Shelley’s Frankenstein, anyone?); but we are at the same time reminded that we have the gifts, capacity, and responsibility to respond to the challenges we face. God’s self-gift, specifically embodied in the Jesus of the Gospels, models a durable anthropology because it is true; and, because it is true, it offers a limitless answer to the dynamic unfolding of faith and culture in the era of science, digital culture, and AI.

 

The spectrum of possibility here, as ever, is a comprehensive one.  The imagination that creates such innovative and constructive tools is a most generous and powerful gift. But let us not forget the other tools in our haversack either—those of contemplation, discernment, reason, courage, and friendship to name only a few. In the full cultivation and exercise of all of our endowments and capacities, we can say here, without reservation—and with hope in the God in whom we live, move, and have our being—that on or about April, 2023 human relations shifted. Human nature? Not so much. Proceed accordingly.

1. Mark Fisher, Capitalist Realism: Is There No Alternative? (Winchester, UK: Zero Books, 2009), 4.

2. Byung-Chul Han, The Expulsion of the Other: Society, Perception and Communication Today (Cambridge, UK: Polity, 2018), 51.

3. René Girard, Battling to the End: Politics, War, and Apocalypse (East Lansing: Michigan State University Press, 2009), 213.

4. Ibid., 215.

5. Ibid., xvii.

6. Pope Francis, Encyclical Letter Laudato Si' (Vatican City, 2015), 107. Available at: https://www.vatican.va/content/francesco/en/encyclicals/documents/papa-francesco_20150524_enciclica-laudato-si.html.


Michael P. Murphy

Michael P. Murphy is Director of Loyola’s Hank Center for the Catholic Intellectual Heritage. His research interests are in Theology and Literature, Sacramental Theology, and the literary/political cultures of Catholicism—but he also thinks and writes about issues in eco-theology, Ignatian pedagogy, and social ethics. Mike’s first book, A Theology of Criticism: Balthasar, Postmodernism, and the Catholic Imagination (Oxford), was named a "Distinguished Publication" in 2008 by the American Academy of Religion. His most recent published work is a co-edited volume (with Melissa Bradshaw), this need to dance/this need to kneel: Denise Levertov and the Poetics of Faith (Wipf and Stock, 2019). He is currently at work on a monograph entitled The Humane Realists: Catholic Fiction, Poetry, and Film 1965-2020.
