
Wittgenstein Wednesdays, Session 1: Philosophical Investigations Sections 1-25

Nathan P. Gilmour

Want to read along with us? Get Philosophical Investigations from amazon.com.

Recently an intrepid group of students and faculty at Emmanuel College began a school-year-long adventure in philosophy, planning to read Wittgenstein’s Philosophical Investigations together, a bit at a time, over the fall and spring semesters.  As we got the group rolling, I volunteered to provide some sort of online catch-up for those who couldn’t make this or that meeting.  This series of blog posts will attempt to do that.

Slab!  Or, Why Simple Theories and Solipsism Both Miss the Phenomena

If you ever want to find out whether an English-speaking friend of yours has read this book, send her a one-word (and punctuation mark) email reading simply “Slab!”  If your friend assumes your account has been compromised, then likely she hasn’t read the book.  If your friend asks when you read Wittgenstein, she might just have read it.

The opening sections of Philosophical Investigations (parenthetical notations in the following refer to sections, not to pages) inquire into the nature of language, starting first with the common conception that words and things correspond on some level, with the word “apple” corresponding to a particular sort of fruit and the word “red” corresponding to this color but not that one (1).  But by means of a series of thought-experiments, Wittgenstein begins to show that such a view of language misses some of the most basic ways in which language does work.  The earliest thought-experiment asks a reader to imagine a language in which those who know the language use the words “block,” “pillar,” “slab,” and “beam” to help them build something together.  When one calls out “Slab!,” the other brings along the slab, and so on (2).

Simple enough, right?  But such a scenario already assumes that a one-word utterance might mean something entirely different depending on where in the story of the construction-project it happens.  So, for instance, two workers who have worked together for some time might proceed as described above, but when someone new joins the crew, a more experienced worker might use the same one-word utterances to demonstrate what work the words do, without the expectation that the new worker will bring the component to a given spot.  Thus the veteran might point and say “Slab!” not to call for another component for the task at hand but to let the rookie know, for future reference, what to pick up when the same worker calls “Slab!” later (6).

A couple of things should be occurring to you already: first, Wittgenstein uses particular scenarios precisely to demonstrate that simple, universal theories of language cannot account for all of the complexity and different rule-sets that govern language use.  Second, neither a simple universal theory (which would have to account for too many uses of “Slab!”) nor a solipsistic notion of language, in which each individual decides what words mean, will work here.  “Slab!” means something, and not something else, in each of these moments and in other conceivable moments (for instance, sending one’s friend a one-word email), and those who try to make “Slab!” do other work have a task ahead of them.  (Such a task is not impossible, as Wittgenstein demonstrates with the word “Slab,” but it takes some doing.)

Let’s Play a Language-Game!

For the sake of thinking differently about how language works, Wittgenstein proposes treating languages not as neat, orderly systems but as clusters of games.  So in one game, “Slab!” is a command to bring a certain sort of building material from there to here, whereas in another, “Slab!” is the name of an object, a word but not a thing, to be discussed by a philosophy seminar.  How those games connect to each other isn’t yet the concern of the book, but Wittgenstein offers an early metaphor of the same, comparing language to the growth of a metropolis, complete with orderly suburbs, from the core of an old city, whose inner streets twist and turn with a logic lost to time (18).

But back to games: in these early sections Wittgenstein counters certain philosophies of language that posit some sort of proposition (a sentence with a subject and a verb, in that order, making a statement) behind all other sorts of utterance.  One could, Wittgenstein concedes, insist that “Slab!” really means “I want you to hand me a slab,” but just as easily, he jests, could we not imagine that, behind all other sorts of utterance, there’s “really” a question followed by an affirmation?  He supplies one example, and the imagination runs from there: “[F]or instance, ‘Is it raining?  Yes!’” (22).

As I read it, one of the early theses of the book is that “to imagine a language means to imagine a form of life” (19).  Thus language is neither limited nor unlimited because human existence is neither limited nor unlimited.  Before the invention of baseball, to talk about baseball was unintelligible, but that doesn’t mean that the addition of baseball-talk makes language anarchic; instead, it’s intelligibly historical.  The “Slab!” language (yes, by the end of the first session of Wittgenstein Wednesdays, our whole group found the word “slab” utterly hysterical) derives its intelligibility from the human practice of building things with beams and blocks and slabs, and talking about double-plays and the infield fly rule is gibberish in an imagined context in which folks don’t play baseball.  The philosopher of language does well not to say too early (which is to say at all) that the limits of language are already drawn, but the skeptic of linguistics does well to note that, when talking to baseball fans, “designated hitter” does not mean whatever the speaker wants it to: it’s intelligible as evil because it’s part of the discourse of baseball, but that doesn’t change the fact that it’s evil.

As I write this, I know full well I need to leave this segment of Philosophical Investigations (which we discussed two weeks ago) and turn to sections 26-52 (which I’ve read but need to review now).  More on that in the next few days.


Dante’s Purgatorio and Graduate School

Nathan P. Gilmour

Why Can’t Grad School Be Purgatorial?

No, good reader, that would be too easy.  Graduate school itself wasn’t much at all like Purgatory.  After all, the conditions for my leaving graduate school had little to do with desire for God and not much more to do with purging vices.  No, graduate school is a place where one earns one’s departure by writing and defending a dissertation, and I tip my hat to those still earning that exit.  One thought that never occurs to me, while I teach Dante, is that Purgatory is anything like graduate school or vice versa.

Rather, something occurred to me recently while planning a lesson on the Terrace of the Prideful in the Purgatorio.  (That’s cantos 10 to 12, if you want to go read them now.  You really should.)  Graduate school isn’t Purgatory because one earns one’s way out of it, but it’s also not a place where a resident undergoes the same kind of moral formation that the saved undergo as they prepare for Heaven.  And that might be why we among the living don’t experience art as do the saved dead.

When I teach the middle Canticle of Dante’s Commedia, I frame Purgatory proper as one of the great medieval manifestos on art.  Dante is an Aristotelian most of the time, but with regard to art (along with several other questions, to be fair), he takes the best from Plato and from Aristotle.  On one hand, every terrace involves seeing or hearing stories of virtue, the sorts of narratives of which Socrates approves in the Republic.  On the other, the fear and pity that Aristotle recommends in the Poetics as the defining features of the best tragedy are all over the stories the saved souls hear of their particular vices and their consequences.  In other words, Purgatory displays a robust Classical vision of the good things that art can do for the soul, and there’s room there for stories of goodness as well as stories of badness.

With all that, the art in Purgatory isn’t precisely like the art that readers, even those who read Dante, encounter in the world of the living.  After all, exemplars of vice and virtue come to the souls precisely as they need them, in forms that leave no room for ambiguity.  The good exemplars are good and the bad bad, and there’s not much dispute about which one is at play in any given moment.

In other words, art in Purgatory is completely rational.  But it’s not the entirety of the story.

Before You Get to Purgatory, You’ve Got to Raise a Little Hell

My hunch is that the Inferno, the Canticle that most folks read, provides another range of possibilities when it comes to art, but until one reaches Purgatory, that range doesn’t appear as a range but as a totality.  After all, from the time that Dante sees the souls in ante-Inferno racing after the flag, being stung by eternal insects and never stopping to settle into one camp or the other, artistic representations of one sort or another are always pointing the reader–the characters within the story as well as those of us enjoying it centuries later–towards spiritual relationships that, without the images and allegories, would remain obscure in the world of civil wars and inept tyrants and the daily struggles of the living.  What Dante calls contrapasso in the Inferno is one mode of art, the exposure of duplicity and corruption in the name of spiritual truth.  What masquerades as spiritual love gets exposed as lust, as full of wind as any vice; and to deal among the living in philosophies that deny the soul leads, in Inferno, to an eternity locked in a burning box, unable to transcend just as one denied transcendence among the living.

When I think about Inferno in light of Purgatorio, the earlier Canticle strikes me not as identical with but at least related to the hermeneutics of suspicion that was the warp and woof of my own graduate education in literature.  To be sure, there were moments–glorious moments–when the texts were there to teach us, not to be dissected, but many of my encounters with literary texts happened for the sake of debunking, of suspecting, and generally of unmasking what was “really” going on with the political and social agendas of those texts most frequently anthologized.

Dante helps me realize that such a practice is not entirely bad.  For Dante, seeing the wickedness of the ancients and of his own contemporaries unmasked means that his descent past the nether parts of Satan becomes, by virtue of his sinking as low as one may sink, an ascent to more Heavenly things.  And, to read Dante allegorically, such might be a good course of things for a student of literature and philosophy and such: when the student faces without obfuscation the darkness that lies at the heart of every human enterprise, what remains at least has a chance of being genuine hope.  (There’s always a chance, perhaps an inevitability, that we only thought we had reached the bottom and that we allowed ourselves to turn back rather than to plow through, but that’s why learning goes on past one’s formal education, no?)

But one distinction remains between the damned and Dante in the Commedia, and I wonder whether we make too little of that in graduate school.

Suspicion as a Road Beyond Suspicion

The damned will never take a single step past the unmasking of evil; for them, all that remains is evil in all of its ugliness.  For Dante, by Heavenly grace alone, to be sure, something lies beyond.  But it’s not Heaven, at least not immediately.

Purgatory challenges my own sense of ambiguity in the world by insisting, at every turn, on two realities: that human beings are authentically good and bad, better and worse than they could be; and that only a gift from Heaven will allow me to see what’s good and what’s bad.  Neither of those realities is comfortable, but both together constitute the structure of Purgatory.  Only after Dante moves through both of those realities, terrace after terrace, can he ascend, and only after the saved have come to desire what Heaven gives, and ultimately the Lord of Heaven, will they truly enjoy Heaven.  It’s a bit of an affront to those of us moderns who have internalized Hume’s dictum that beauty is a function of the one seeing an object, not of the object itself, to think that the seer needs to be transformed, but there it is.  And if one takes Dante seriously, the implication seems to be that anyone reading the poem isn’t there yet.

And that’s what takes me back to remembering graduate school.  All of my professors, on some level, seemed to have at least a sense that literary narratives and lyric poems and good novels might, for those who learn to be suspicious, carry us forward beyond the suspicion.  (Some of you readers might think I’m being naive again there.)  My suspicion is that, as a young graduate student, I just wasn’t ready to enter Purgatory yet.


The Christian Humanist Podcast, Episode #144: Allegory

Michial Farmer

David Grubbs holds forth with Nathan Gilmour and Michial Farmer about allegory, both as a mode of reading and as a literary genre. The debate hinges on what terms mean in which contexts: is a literary text defective because it’s an allegory, or are there good or bad allegories? Explore that and other questions with us.

“Against Allegory”

Our theme music this week is Radiohead’s “Packt Like Sardines in a Crushd Tin Box,” from 2001’s Amnesiac. I’m a reasonable man. Get off my case. Get off my case.

Christian Humanist Profiles 12: Structuralism, Modern Literature, and Christianity

Michial Farmer

Anyone who’s spent any time at all with the New Testament is familiar with the opening sentences of the Gospel of John: “In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things came into being through Him, and apart from Him nothing came into being that has come into being.” And anyone who’s spent any time at all in graduate studies of language and literature is familiar with another view of the word, and words. To quote Ferdinand de Saussure, the great Swiss structuralist linguist, “The linguistic sign unites, not a thing and a name, but a concept and a sound-image.” And in the next section, he says, “the linguistic sign is arbitrary,” meaning that no intrinsic relationship exists between the English word tree and the class of entities we use that word to refer to. Word is severed from thing. This is the foundation of structuralist linguistics and thus the foundation of much of twentieth- and twenty-first-century thinking.

Our guest today on Christian Humanist Profiles is Dr. Roger Lundin, the Arthur F. Holmes Professor of Faith and Learning at Wheaton College and the president of the Conference on Christianity and Literature. Dr. Lundin is the author of numerous books, including Literature Through the Eyes of Faith, a standard text for Introduction to Literature courses at Christian colleges. His latest book is Beginning with the Word: Modern Literature and the Question of Belief, which takes on the structuralist conception of language as a sign-system and proposes a different way of viewing language.

Anonymity, Fame, and Alienation

Michial Farmer

In 1966, Ralph Harper, the Episcopal priest and expositor of existentialism, found himself in the middle of the alienating twentieth century. Spiritual alienation, of course, existed long before 1966, and long before Harper’s “century of homelessness and exile, of nervous disorder and persecution, of actual enslavement and barbaric cruelty.” And yet the alienation of Harper’s age was different from that of previous centuries, in that it was bound up with the twentieth century’s peculiar anonymity. “To-day,” he writes, “isolation itself must be regarded as the chief symptom of the pressures on man. To-day isolation and anonymity are synchronous.” The mass age is the age of the faceless—and thus the age of estrangement.

It’s tempting to see this as a charmingly outdated analysis of a previously modern condition. After all, the internet has in some ways done away with anonymity altogether—social media broadcasts our images and opinions 24 hours a day; the government and the multinational corporations that work alongside it have almost unlimited access to our ostensibly private lives; and, according to several well-publicized polls over the last decade, Millennials are more interested in becoming famous than the generations that preceded them. But these facts, especially the last of them, actually suggest that Harper’s age of anonymity and estrangement is as present today as in 1966—or even more so.

Harper, in fact, suggests that in an alienated era, the need for recognition becomes a means of self-verification: “Men have to be known by others so that they can be sure they know themselves; there are no objective means to evaluate what one is and what one does. Living has become so subjective that one must appeal to other subjects for a guarantee of one’s position.” Our obsession with celebrity—the sheer amount of time we give over to thinking about the lives of the rich and famous on the one hand or the “everyone’s a star” milieu of the Internet on the other—is, in Harper’s view, an indication of a weak personality. It suggests a person who is fundamentally unsure of himself, unable to ground his identity in anything solid. Ultimately, our hunger for fame is a kind of perverted drive for transcendence held by people with a diseased spiritual sense—and I include myself (a Millennial) in that category, since I’m as bound to the vicissitudes of low culture and to the longing for academic fame as anyone who’s gone through a PhD program in the humanities. I seek what we all seek on some level: I want to be published and to have the self-verification of recognition: I am bright; I am insightful; I have that identity, at least. The idea of publishing this essay anonymously fills me with terror.

The desire for recognition, for fame, is an identity-disease, or, to mix my metaphors, it is a cloak meant to cover the nakedness of the modern soul. Millennials may or may not have the disease worse than previous generations, but no era has ever been immune to the condition. In the seventeenth century, Andrew Marvell had to chastise his own tendency toward it, most notably in his poem “The Coronet”:

When for the thorns with which I long, too long,
With many a piercing wound,
My Saviour’s head have crowned,
I seek with garlands to redress that wrong.

Marvell, over the course of this remarkable poem, moves back and forth between pride and self-reprimand. He writes devotional poetry, in praise of Christ, and he naturally wants it to be of the very highest quality, befitting its subject—but the very act of service is a snare:

Alas! I found the Serpent old,
That, twining in his speckled breast,
About the flowers disguised, does fold
With wreaths of fame and interest.

This is the constant temptation for the artist (and for the academic, for that matter, although our heights of fame are even lower than the poet’s). In a world of instant celebrity—flash-paper celebrity, instantly ignited and immediately forgotten—even our virtues can be turned into vices. And any attempt to keep them as virtues will only make them more vicious, for success would be something to be proud of. Marvell’s solution is to turn back to devotion:

But thou who only couldst the Serpent tame,
Either his slippery knots at once untie,
And disentangle all his winding snare,
Or shatter too with him my curious frame,
And let these wither—so that he may die—
Though set with skill, and chosen with care;
That they, while thou on both their spoils dost tread,
May crown Thy feet, that could not crown Thy head.

This is also the solution posited by Flannery O’Connor in the best of her short stories, “Revelation,” in which Mrs. Turpin, a woman who has always prided herself on having the wits to do what needs to be done, has a vision of an afterlife in which the first are truly last: “They were marching behind the others with great dignity, accountable as they had always been for good order and common sense and respectable behavior. They alone were on key. Yet she could see by their shocked and altered faces that even their virtues were being burned away.” The answer to the problem of our misdirected longing toward fame, it seems, is a forced return to anonymity. Christ removes the crown from Marvell’s head and, just to demonstrate how paltry a thing celebrity is, puts it at His own feet, not even on His head. Mrs. Turpin watches as the things that make her gloriously herself are painfully and violently removed from her.

That the same solution is posed by Christian thinkers as diverse as Marvell and O’Connor suggests that it is not a matter of gender or of denomination or even of historical era. In fact, I don’t think it’s even necessarily a matter of religion. The indie folk band Fleet Foxes were getting at something very similar on their 2011 song “Helplessness Blues,” in which they come to terms with the fallout from our culture’s “everyone’s a star” mentality:

I was raised up believing I was somehow unique
Like a snowflake distinct among snowflakes, unique in each way you can be
And now after some thinking, I’d say I’d rather be
A functioning cog in some great machinery serving something beyond me

The members of Fleet Foxes, as far as I can tell, have no particular commitment to any organized religion, and yet their solution to the problem of the fame-drive is strikingly similar to Marvell’s and O’Connor’s. It is to restore the false transcendence of the fame-drive to a genuine transcendence—to move from the center of the stage to a position in the audience.

In other words, the solution to the problem of the fame-drive is to restore ourselves to a kind of anonymity. But this is not the brutal anonymity of the mass age, the sort of cold facelessness that exacerbates the fame-drive (though it does not create it). This is instead the anonymity of the devotee—Marvell’s laying the crown of thorns at Christ’s feet, or Mrs. Turpin’s long, painful march through Purgatory, or Fleet Foxes’ service as cogs in a larger machine (though this industrialized and mechanized image suggests a certain late-modern spiritual poverty, compared to my other two examples). It is a recognition that the personality of the artisan and even the quality of the art are not as important as the audience to whom the art is offered.

But this movement back into anonymity has a Kierkegaardian flavor to it. Kierkegaard famously admires Abraham not merely because he was willing to sacrifice Isaac when God demanded it but because he simultaneously “believed that God would not demand Isaac of him, while still he was willing to offer him if that was indeed what was demanded.” Thus he sacrifices Isaac and keeps Isaac at the same time—and thus Marvell sacrifices his glory to Christ and nevertheless ends up as one of the most famous poets of his century, and Fleet Foxes renounce their individuality, only to produce one of the most celebrated albums of the year. Not all of us can hope for such a result, of course, but Mrs. Turpin is a better model for us. Her virtues, her glory, perhaps even her personality are burned away from her—and yet presumably her vision will not end with her as a nameless face in the heavenly crowd but with her as herself, with a new identity given her by the God whom she pursues through the river of fire.

Again, then, the lesson taught by all these works is the renunciation of a false route to a phony transcendence—and with it, the renunciation of an identity we feel to be our own but which comes actually from the very crowd we’re hoping to escape. We must work for God, or for some other higher purpose; in so doing we will be given an identity that is as permanent as the thing we love, an identity that goes beyond mere recognition and thus actually transcends the anonymity of our age.

Gene Simmons, Capitalism, and Why I’m Torn about the Death of Rock

Nathan P. Gilmour

Gene Simmons: “Rock Is Finally Dead”

The Parable of the Madman from The Gay Science might seem a strange partner for a Gene Simmons interview, but the latter made me think of the former.  Of course, both of them deal with moments of cultural transition, the sorts of things about which historians argue and philosophers speculate and interview subjects pontificate.  But beyond that, both bring a striking metaphor, murder to be precise, to bear on phenomena that, in the way I normally think of things, aren’t vulnerable to murder.  And in both cases, I’m not sure what I should make of the metaphor, whether I should scoff along with the crowds or begin to write my dirge for the murdered.  Either way I don’t come out liking Gene Simmons any more than I did before I read the interview.  (A hint for the reader: that wasn’t much.)

A Culture Killed

If you’ve not listened to our podcast episode on the clause “God Is Dead,” you should, but I don’t want to make you listen to it before you finish this post, so I’ll provide the beginning of the text of the rightly famous parable:

Have you not heard of that madman who lit a lantern in the bright morning hours, ran to the market place, and cried incessantly: “I seek God! I seek God!”—As many of those who did not believe in God were standing around just then, he provoked much laughter. Has he got lost? asked one. Did he lose his way like a child? asked another. Or is he hiding? Is he afraid of us? Has he gone on a voyage? emigrated?—Thus they yelled and laughed.

The madman jumped into their midst and pierced them with his eyes. “Whither is God?” he cried; “I will tell you. We have killed him—you and I. All of us are his murderers. But how did we do this? How could we drink up the sea? Who gave us the sponge to wipe away the entire horizon? What were we doing when we unchained this earth from its sun? Whither is it moving now? Whither are we moving? Away from all suns? Are we not plunging continually? Backward, sideward, forward, in all directions? Is there still any up or down? Are we not straying, as through an infinite nothing? Do we not feel the breath of empty space? Has it not become colder? Is not night continually closing in on us? Do we not need to light lanterns in the morning? Do we hear nothing as yet of the noise of the gravediggers who are burying God? Do we smell nothing as yet of the divine decomposition? Gods, too, decompose. God is dead. God remains dead. And we have killed him.

I found the Simmons interview fascinating because, unlike the madman in Nietzsche’s parable, who acknowledges his own part in things, Simmons seems to be saying that Rock is dead, and you have killed him–you but not I.  For Simmons, file-sharing software–and Napster serves as the synecdoche for a whole array of software–is the culprit in this murder case.  The verdict is simple and not examined for too long.

One needn’t look beyond those programs to see what killed rock ’n’ roll.  It’s not the evolution of arena rock, which funneled people who might have been in small clubs paying to see a larger number of bands into football stadiums, where everyone pays to see the same band.  It’s not the record industry, which replaced “live entertainment” (which used to be simply “entertainment,” and which required each location to have its own band) with privately accessible simulacra of musical performances, listened to over headphones in isolation or, if in groups, via reproductions of performances rather than performances.  And certainly it’s not the cult of the international rock superstar, which gave singers license to be sexual predators for a few decades, casting them beyond the reach of the larger culture’s structures of accountability and letting them become embarrassingly wealthy because of it, alienating the masses even as the masses can’t get enough of their guitar gods.  Nope, I’m pretty sure it all died with the advent of Napster.

Now I’ll go ahead and say here that not one of Gene Simmons’s responses surprises me.  Breadth of historical vision is not what makes someone the front-man of KISS, and the ability to acknowledge one’s own contribution to bad things in the world is not a curse I’ve ever seen Simmons burdened with.  But I do want to note here that Simmons laments that the kids learning to play guitar now will never be as big as Gene Simmons, and he blames Napster and other file-sharing services for that fact.

I wonder, though, whether Gene Simmons laments, or has ever lamented, the folks of his own generation who never became KISS.

After all, he seems quite concerned with the kid who’s fifteen now, plugging his guitar into an amp, who will not have the structure of the mid-twentieth-century music industry to propel his dreams.  But on the other hand, he seems unaware that kids in 1967 had just as little chance of making it in rock music, in those “iconic” times, as they do now.  To make a living playing rock music was still about as rare as making it in the NBA or onto the faculty of Harvard Medical School.  Many tried, few succeeded, and the rest of the folks moved on.  And what might be even more important, the rise of radio and television, and the subsequent ascendancy of institutions like Billboard’s charts and ESPN, likely caused many more aspiring musicians and actors and athletes to quit doing what they do than Napster ever did.

To put things another way, if you read Gene Simmons in the interview, you might get the impression that, at one point, many became “iconic” for recording music, then file-sharing came along, and now almost nobody does.  The reality, as I understand it, is that the existence of the “iconic,” on a national scale, itself reduced the number of people actually producing entertainment, shifting the media ecosystem towards passive consumers instead.  Or, to put it one more way, in Simmons’s golden age of the sixties and seventies, most of the English-speaking world bought records with their day-job paychecks rather than making a living from music, and the sliver of society that could live on music then was not much larger than it is now.

In other words, for most of us, no big loss.

As far as I can tell, Capitalism has been limiting the ability of rock fans to become rock stars since the beginning.  Those who pull the strings in whatever business model dominates (the record labels that Simmons lionizes as well as Apple and Amazon, the new dominant distribution channels for recorded music) know that people will buy the dream of stardom from whoever sells it, that there’s no money in making that democratic.  What Simmons holds up as the system that gave him the chance to be the face of KISS is the same system that only thrives if there aren’t too many creative-types getting big and thus diluting the market.

What I wonder is whether the proliferation of production software and distribution vehicles that came along with Napster, towards the turn of the millennium, might actually signal a time to come that returns the Capitalist world to something more like life before arena rock, when bigger isn’t better any more and when “iconic” is something for which the youngsters mock their grandparents’ nostalgia.  My hunch, though it’s destined to remain a hunch, is that, a hundred years or so from now, there will be as much lamentation for the rock stars of the nineteen-seventies as there is now for the really grand Vaudeville personalities put out of work by the rise of radio and television.  Perhaps there will be graduate students presenting papers on why there was no Gene Simmons in the mid-twenty-first century, but there won’t be a great sense of cultural loss because of such things.

Living the Funeral of Rock

Now that doesn’t mean I won’t get nostalgic; after all, as I pass from my current middle age into the ranks of the grumpy old men (right now I’m just a grumpy man, thank you), I’ll be among those who remember being able to talk about a half dozen bands with anyone from the English-speaking world, knowing that folks from Texas and New York and Scotland would all have a basic notion of who the Beatles and the Stones are and why the fan-base tensions between the two are important.  I’ll remember my friends’ stories from big rock concerts (I don’t go to many, because I don’t like big crowds), the buzz that arose when a big show came to Atlanta or Indianapolis.  I’ll still remember watching music videos on MTV rather than on YouTube, for pity’s sake!

With all that, I think that this lament of the “murder of rock” sheds some light on Nietzsche’s parable.  To be sure, the Church had a longer shelf life than the rock star seems destined to have, but the sociological weight of one “murder” sheds light on the other.  Even Gene Simmons isn’t dumb enough to think that nobody will own guitars, play live shows at birthday parties, and perhaps even spawn university graduate programs in rock-and-roll composition-and-performance.  And certainly Nietzsche wasn’t dumb enough to think that there wouldn’t be any Christians after he died.  But both were noting a different sort of passing, a move from a world dominated by one sort of public event into a future in which the public was going to take on a very different cast.  Generations later, there still might be Christians, and generations later, there still might be rockers, but they’ll be of a different sort, a remnant that holds on, remembering and repeating what they hold to be a better way to be human.  The world will change, and so will the Christians and the rockers, and the interesting intersections will happen precisely where the followers of the old traditions articulate and live new ways to do so.

And like Nietzsche’s parable, Simmons’s interview makes me realize just how interesting and just how futile speculating about such a future can be.  Who knew, after all, that the decline of state Christianity in Europe would give way to the Anglican and the Pentecostal expansion of Christianity in the Global South?  That Christianity would rocket out of the tent revivals of Nietzsche’s day in North America and end up becoming megachurches?  That the technologies as yet unimagined by Nietzsche would create the conditions in which something like our little Internet project would be intelligible?  My hunch is that, just as Vaudeville gave way to big-venue concert tours and club-based standup comedy, rock itself has already provided the seeds for whatever comes next.

The cool part, if you dig watching cultural trends as I do, will be to see where they sprout up.

The Christian Humanist Podcast, Episode #143: Proofs for God

Michial Farmer

Nathan Gilmour hosts a conversation about the five “proofs of God” from the opening sections of Thomas Aquinas’s Summa Theologiae. Our discussion ranges over what a proof is for, whether the ontology in the proofs holds up post-Kant, whether reason and revelation can really be friends, and all sorts of groovy philosophical things.

Our intro music this week is “Dear God” by Monsters of Folk, from their 2009 self-titled album.

Christian Humanist Profiles 11: Marvin Wilson on Our Hebraic Heritage

Danny Anderson

For many Christians, their faith was born, wholly formed, 2000 years ago. The covenant established in the New Testament provides, for many, all the equipment for living that the Christian life requires, and the certainties that come with this confidence are powerful and sometimes lead to an under-appreciation of the wisdom of the Old Testament. Is it possible that this tendency has had a detrimental effect on the life of the Christian mind?

Dr. Marvin Wilson of Gordon College suggests that too many Christians neglect the richness of their tradition and that this has had serious consequences for the depth of their faith experience. Wilson argues, in Exploring Our Hebraic Heritage, that Christianity does not begin with Jesus, but rather with Abraham, and that Christians can learn a great deal from the way Jews have contended with their faith over the centuries.

In this episode of Christian Humanist Profiles, Danny Anderson speaks with Dr. Wilson about the deep connection Christianity has with Judaism, and the lessons Christians might learn from Jewish theological traditions.

The Problem with Doing What You Love

Michial Farmer

My father, a civil engineer, designs enormous conveyors which are used in rock quarries to carry stone to be crushed. The specifics of the designs are beyond my understanding as an English professor, but I understand enough to know that in a real sense my father’s conveyors are the backbone of modern society. Without them, the stone would be very hard to get to the crusher—and the crushed stone is used, among other things, for asphalt. This long view, I must admit, never got me through my various attempts to work for his companies. If I have seen one blueprint of a conveyor, I’ve seen them all; the tiny mathematical differences that are the line between success and failure are too minute for me to notice; the term troughing idler always sounded too much like an old man’s insult toward young people for me to pay attention long enough to learn what it actually means; and I think rock quarries are hot, dusty, and terminally boring—however essential they are to society.

In this sense, I am not much like my father, who gets excited by the opportunities his quarries give him to solve a problem—to create from his imagination the perfect conveyor for the empty space before him. (I was never very good at Legos, either, and I still can’t get more than a few levels into Tetris.) He has always been a hard worker—I dislike the term workaholic because no one in the history of the world has ever been addicted to workahol—and has regularly put in twelve-hour days since I can remember. This used to confound me. How on earth, I wondered, could anyone spend that much time doing anything—let alone sequester himself into that fluorescent prison that smelled like stale coffee and blueprints to devote himself to conveyors, of all things? I finally worked up the nerve to ask him one day, and his answer has stuck with me: “It doesn’t feel like work if you love it.” The moral is simple: Find your passion and throw yourself into it completely—and then thank God that some people are passionate about, say, civil engineering and not pop music.

I’ve been thinking about my father’s advice a lot lately, in part because of a recent article on The Atlantic’s website called “To Work Better, Work Less.” It’s not necessarily an original argument; it opens with a relevant Bertrand Russell quotation from 1932. But it’s an argument that many of us in America (and increasingly all over the world, as Americans continue to export our lifestyles) desperately need to hear. France is the shining example here, as it often is in think-pieces about the culture of overwork:

Although it has its share of economic problems, France has less than nine percent of its employees working “very long hours” . . . France also has one of the world’s best work-life balances. Working too much is, at best, pointless, and at worst, actively harmful. Overwork dictates our physical health, psychological health, and our time with our family.

The French, famously, adopted a mandatory 35-hour workweek in 2000, and their workers are guaranteed five weeks of vacation time. Paris nearly shuts down, it seems, in July and August, with stores and restaurants closing for les vacances. (And not just stores and restaurants. I subscribe to a podcast version of the France Culture radio program Un Autre Jour Est Possible, and true to form, it disappears for the entire month of August. Un autre mois, apparently, est aussi possible.)

I admire the French and enjoy reading articles like this one, which tempt me to a sort of self-conscious political wisdom: “Why, our government ought to enforce laws like these!” I say, stepping onto the apple crate. But I stumble. After all, I am an English professor, which means I get about three times the mandatory vacation time allotted to the French. Now, professors are not as lazy as you might imagine. Contrary to an infuriating article that made the rounds a few years ago, all but the most supremely tenured among us (and, I would wager, only the top five percent most indolent among the most supremely tenured) work much, much more than fifteen hours per week. But it is true that our hours are often more flexible than the average office worker’s, and many of us do get summers off. (All of this applies only to the tenure-track and not to the increasing numbers of so-called “contingent faculty,” who are used and abused by the higher-education industry. But this is not a post about the very real evils of my field.) For those of us at schools that encourage faculty scholarship but don’t require it, we’re not even forced to spend our summers researching and writing.

And yet I drove to my campus office nearly every weekday this summer—including July 4 and Labor Day. There are a variety of reasons for this. I worked on a few writing projects, yes, and I reorganized some syllabuses for my fall classes. But the truth is that I sat at my desk ten hours per day, five days per week because I wanted to. I like working in the summer because there are no students on campus, and very few faculty members, to bother me. And understand me—when I say “bother me,” I don’t mean “Ask me to do other work.” I mean “Ask me how my day is going, forcing me to interact with other human beings for five minutes.” The building my office is in was once a monastery, and I’d be tempted to call my attitude monkish—except that monks live in community. My attitude is rather, I suspect, strictly 21st-century American. I’m driven to get ahead, except I’m not even really trying to get ahead. It’s closer to the truth to say that I’m driven to be driven. It’s hard not to think of Dante here. The avaricious in Circle 4 of the Inferno are condemned to useless labor:

Ah, God’s avenging justice! Who could heap up suffering and pain as strange as I saw here? How can we let our guilt bring us to this?
As every wave Charybdis whirls to sea comes crashing against its counter-current wave, so these folks here must dance their roundelay.
More shades were here than anywhere above, and from both sides, to the sound of their own screams, straining their chests, they rolled enormous weights.
And when they met and clashed against each other, they turned to push the other way, one side screaming, “Why hoard?” the other side, “Why waste?”

If working too hard is a hell, it’s one—as writers as diverse as C.S. Lewis and Jean-Paul Sartre have declared—that we largely choose for ourselves. But let’s not be overly dramatic here. My work—teaching literature and writing about it—may not be the practical backbone of the economy, but it’s hardly the labor of Sisyphus. It has some value. Nor is my motivation, at least most of the time, that of the avaricious. As Virgil explains to Dante,

It was squandering and hoarding that have robbed them of the lovely world, and got them in this brawl: I will not waste choice words describing it!
You see, my son, the short-lived mockery of all the wealth that is in Fortune’s keep, over which the human race is bickering:
for all the gold that is or ever was beneath the moon won’t buy a moment’s rest for even one among these weary souls.

Very few college professors are in it to get rich, and even fewer—even in the tenure-track—actually get rich. (Let me take another moment to again acknowledge my privilege here. I am comfortably middle-class in the United States, which means that on the global scale I am quite wealthy indeed–and unlike my “contingent” co-workers, I don’t have to stack multiple teaching jobs at multiple schools on top of each other to get there.) No, I do what I do—work that I’d like to believe has some social and even some eternal value—because I love it. Most days, I don’t belong in the fourth circle of hell, although I am certain that, as Flannery O’Connor quipped, there’s a berth in Purgatory waiting for me.

What other option is there for me? This is the crux of the problem. It’s an enormous blessing to be allowed to do something I love professionally and an even bigger one to have had some success in it. I’m complaining about winning the lottery in the eyes of some people who are reading this, people who work a job or even multiple jobs that they hate just to keep their heads above water. I recognize my position. And yet such is the sickness of the human soul that we’ll find a way to turn every blessing into a curse, and here is the curse that we’ve made of this particular blessing: When you do what you love, the way all Americans are supposed to, the way that we’re all but guaranteed from childhood that we will, the thing you love moves from being the thing you do to the thing you are, and all of a sudden it’s all-consuming. In other words, doing what you love is wonderful—but then what are you supposed to love when you’re not working? Leisure becomes work.

And if you’ll allow me, for a moment, to slide even deeper into undeserved self-pity, I’ll suggest that the situation is worse for those of us who work in arts-adjacent fields. To return to my father for a moment—it’s undeniable that work and leisure shade into each other at certain points in his life. For example, whenever a major bridge collapses, he spends hours watching the news coverage of the aftermath and angrily explaining to the anchors why they’re wrong about the reasons for the collapse. (This is apparently not an uncommon activity among engineers, God bless them.) What is that if not work and leisure coinciding? But bridge collapses are rare, thankfully, and engineers are not brought into the public eye all that often.

On the other hand, most English professors became English professors because, at some point in their early development, a book grabbed them and wouldn’t let them go. Scholarship begins with love, and one of the purposes in majoring in English is to move from loving books to writing about them. (I am tempted to write “dissecting them,” but that image is too violent. Imagine something half a step down from that.) This doesn’t destroy your ability to love books. I love to read books, and to collect books as physical objects, and to talk about books. But it has profoundly changed the way I read. It’s not that I’m no longer able to read for love—but it is probably true that I am not able to read merely for love, as I once was. And this means that when I read anything, from the pulpiest detective novel to the most intricate Modernist poem, I’m reading it on two levels at once. And what’s more, one of those levels (the one that got me through graduate school) has made itself look an awful lot like the other level. In other words, a good chunk of the fun for me in reading a book comes in the analysis of it—in writing or imagining a paper explaining its ideas, or its structure, or what-have-you. But once that happens, I’m not reading for fun anymore. I’m working.

So what would I do on my five-week French vacation? One of the major modes of leisure open to most people—reading whatever it is you’d like to read—has been closed off to me by my profession. But I can’t just switch over to music or film or television, either, because when you take on the task of cultural criticism, everything becomes a text to be analyzed. I can no more watch a movie without doing the imaginative work of explication than I can read a book, and in fact, I have to be studious in not taking notes when I watch movies. This extends to the silliest and most arcane cultural artifacts. Lately, for example, I’ve been relaxing by listening to radio dramas from the 1970s—but even this can’t escape my work-mind, and I’ve begun hatching an idea for a book about them.

It’s a sickness, and while the specifics of my case are probably different from yours, I suspect many of the people reading this can relate to them. To have a job based in a field you love is to turn what you love into a job. Combine this with a particularly Protestant tendency “to ennoble the act of working, to feel productive (even if we’re not being productive),” and you’ve got the curse that comes from all our blessings. But then, the blessings themselves come—as so many blessings do—out of a curse, one of the first curses given to human beings: “In the sweat of thy face shalt thou eat bread, till thou return unto the ground; for out of it wast thou taken: for dust thou art, and unto dust shalt thou return.”

I am not a theologian, but I can’t help but take this curse as a statement about the relationship between work and death. Since the Fall, we have to work to keep ourselves from the grave. We can no longer merely pluck our food from the trees and sleep in the open without worrying about the elements. We must learn a trade—sometimes physical, perhaps more often intellectual these days, at least in the West. But while work keeps us from the grave, it is also a reminder of it, a reminder that our days are numbered, that every moment we spent working is a moment spent expelled from the Garden.

What does it mean, then, that we are so determined to ennoble our work, to turn it from a way of keeping soul and body together into the very essence of our souls? To make a trite observation: Next time you go to a party and meet someone new, count how many seconds before the inevitable question comes up: So what do you do? The question behind the question is So what are you? But I ask it, too. The diseased world I’ve been describing is my world, and the inverted table of values is my table. I can’t think of a better icebreaker, just as I can’t imagine what I would do on a five-week vacation.

And again: I recognize my remarkable privilege to be able to have a full-time job in a society where so many people do not, and to have a job doing what I love when so many people do not have that option. I recognize the blessings that have been given to me. I just wish I knew a way to keep myself from turning them into curses. But perhaps it is part of the nature of a postlapsarian blessing to always be tinged with a bit of the curse.