Christian Humanist Profiles 13: A Radical Critique of Heidegger

Nathan P. Gilmour

Certain philosophers shake up the world with a new frame of reference, a new central question, a new way to proceed in doing philosophy.  In the twentieth century Martin Heidegger was one such figure.  His turn from the Enlightenment’s overriding concern for epistemology to a new and refigured investigation of ontology meant a new philosophical project for those who followed him, and along with that quest came a need for critical assessment of his contributions.  One such assessment is S.J. McGrath’s 2008 volume Heidegger: A (Very) Critical Introduction, and Christian Humanist Profiles is happy to welcome Sean McGrath on the program to talk Heidegger with us.

S.J. McGrath, Heidegger: A (Very) Critical Introduction

Wittgenstein Wednesdays, Session 1: Philosophical Investigations Sections 1-25

Nathan P. Gilmour

Want to read along with us? Get Philosophical Investigations from

Recently an intrepid group of students and faculty at Emmanuel College began a school-year-long adventure in philosophy, planning together to read Wittgenstein’s Philosophical Investigations, a bit at a time, over the fall and spring semesters.  As we got the group rolling, I volunteered to provide some sort of online catch-up for those who couldn’t make this or that meeting.  This series of blog posts will attempt to do that.

Slab!  Or, Why Simple Theories and Solipsism Both Miss the Phenomena

If you ever want to find out whether an English-speaking friend of yours has read this book, send her a one-word (and punctuation mark) email reading simply “Slab!”  If your friend assumes your account has been compromised, then likely she hasn’t read the book.  If your friend asks when you read Wittgenstein, she might just have read this book.

The opening sections of Philosophical Investigations (parenthetical notations in the following refer to sections, not to pages) inquire into the nature of language, starting first with the common conception that words and things correspond on some level, with the word “apple” corresponding to a particular sort of fruit and the word “red” corresponding to this color but not that one (1).  But by means of a series of thought-experiments, Wittgenstein begins to show that such a view of language misses some of the most basic ways in which language does work.  The earliest thought-experiment asks a reader to imagine a language in which those who know the language use the words “block,” “pillar,” “slab,” and “beam” to help them build something together.  When one calls out “Slab!,” the other brings along the slab, and so on (2).

Simple enough, right?  But such a scenario already assumes that a one-word utterance might mean something entirely different depending on where in the story of the construction-project it happens.  So, for instance, two workers who have worked together for some time might proceed as described above, but when someone new joins the crew, a more experienced worker might use the same one-word utterances to demonstrate what work the words do, without the expectation that the new worker will bring the component to a given spot.  Thus the veteran might point and say “Slab!” not to call for another component for the task at hand but to let the rookie know, for future reference, what to pick up when the same worker calls “Slab!” later (6).

A couple of things should be occurring to you already: first, Wittgenstein uses particular scenarios precisely to demonstrate that simple, universal theories of language cannot account for all of the complexity and different rule-sets that govern language use.  Second, neither a simple universal theory (because it would have to account for too many uses of “Slab!”) nor a solipsistic notion of language, in which each individual decides what words mean, will work here.  “Slab!” means something, and not something else, in each of these moments and in other conceivable moments (for instance, sending one’s friend a one-word email), and those who try to make “Slab!” do other work have a task ahead of them.  (Such a task is not impossible, as Wittgenstein demonstrates with the word “Slab,” but it takes some doing.)

Let’s Play a Language-Game!

For the sake of thinking differently about how language works, Wittgenstein proposes thinking about languages not as neat, orderly systems but as clusters of games.  So in one game, “Slab!” is a command to bring a certain sort of building material from there to here, whereas in another, “Slab!” is the name of an object, a word but not a thing, to be discussed by a philosophy seminar.  How those games connect to each other isn’t yet the concern of the book, but Wittgenstein offers an early metaphor of the same, comparing language to the growth of a metropolis, complete with orderly suburbs, from the core of an old city, whose inner streets twist and turn with a logic lost to time (18).

But back to games: Wittgenstein counters in these early sections certain philosophies of language that posit some sort of proposition (a sentence with a subject and a verb, in that order, making a statement) behind all other sorts of utterance.  One could, Wittgenstein concedes, insist that “Slab!” really means “I want you to hand me a slab,” but just as easily, he jests, could we not imagine that, behind all other sorts of utterance, there’s “really” a question followed by an affirmation?  He supplies one example, and the imagination runs from there: “[F]or instance, ‘Is it raining?  Yes!’” (22).

As I read it, one of the early theses of the book is that “to imagine a language means to imagine a form of life” (19).  Thus language is neither limited nor unlimited because human existence is neither limited nor unlimited.  Before the invention of baseball, to talk about baseball was unintelligible, but that doesn’t mean that the addition of baseball-talk makes language anarchic; instead, it’s intelligibly historical.  The “Slab!” language (yes, by the end of the first session of Wittgenstein Wednesdays, our whole group found the word “slab” utterly hysterical) derives its intelligibility from the human practice of building things with beams and blocks and slabs, and talking about double-plays and the infield fly rule is gibberish in an imagined context in which folks don’t play baseball.  The philosopher of language does well not to say too early (which is to say at all) that the limits of language are already drawn, but the skeptic of linguistics does well to note that, when talking to baseball fans, “designated hitter” does not mean whatever the speaker wants it to: it’s intelligible as evil because it’s part of the discourse of baseball, but that doesn’t change the fact that it’s evil.

As I write this, I know full well I need to leave this segment of Philosophical Investigations (which we discussed two weeks ago) and turn to sections 26-52 (which I’ve read but need to review now).  More on that in the next few days.


Dante’s Purgatorio and Graduate School

Nathan P. Gilmour

Why Can’t Grad School Be Purgatorial?

No, good reader, that would be too easy.  Graduate school itself wasn’t much at all like Purgatory.  After all, the conditions for my leaving graduate school had little to do with desire for God and not much more to do with purging vices.  No, graduate school is a place where one earns one’s departure by writing and defending a dissertation, and I tip my hat to those still earning that exit.  One thought that never occurs to me, while I teach Dante, is that Purgatory is anything like graduate school or vice versa.

Rather, something occurred to me recently while planning a lesson on the Terrace of the Prideful in the Purgatorio.  (That’s cantos 10 to 12, if you want to go read them now.  You really should.)  Graduate school isn’t Purgatory because one earns one’s way out of it, but it’s also not a place where a resident undergoes the same kind of moral formation that the saved undergo as they prepare for Heaven.  And that might be why we among the living don’t experience art as do the saved dead.

When I teach the middle Canticle of Dante’s Commedia, I frame Purgatory proper as one of the great medieval manifestos on art.  Dante is an Aristotelian most of the time, but with regard to art (along with several other questions, to be fair), he takes the best from Plato and from Aristotle.  On one hand, every terrace involves seeing or hearing stories of virtue, the sorts of narratives of which Socrates approves in the Republic.  On the other, the fear and pity that Aristotle recommends in the Poetics as the defining features of the best tragedy are all over the narratives that the saved souls encounter as they hear of their particular vices and their consequences.  In other words, Purgatory displays a robust Classical vision of the good things that art can do for the soul, and there’s room there for stories of goodness as well as stories of badness.

With all that, the art in Purgatory isn’t precisely like the art that readers, even those who read Dante, encounter in the world of the living.  After all, exemplars of vice and virtue come to the souls precisely as they need them, in forms that leave no room for ambiguity.  The good exemplars are good and the bad bad, and there’s not much dispute about which one is at play in any given moment.

In other words, art in Purgatory is completely rational.  But it’s not the entirety of the story.

Before You Get to Purgatory, You’ve Got to Raise a Little Hell

My hunch is that the Inferno, the Canticle that most folks read, provides another range of possibilities when it comes to art, but until one reaches Purgatory, that range doesn’t appear as a range but as a totality.  After all, from the time that Dante sees the souls in ante-Inferno racing after the flag, being stung by eternal insects and never stopping to settle into one camp or the other, artistic representations of one sort or another are always pointing the reader–the characters within the story as well as those of us enjoying it centuries later–towards spiritual relationships that, without the images and allegories, would remain obscure in the world of civil wars and inept tyrants and the daily struggles of the living.  What Dante calls contrapasso in the Inferno is one mode of art, the exposure of duplicity and corruption in the name of spiritual truth.  What masquerades as spiritual love gets exposed as lust, as full of wind as any vice; and to deal among the living in philosophies that deny the soul leads, in Inferno, to an eternity locked in a burning box, unable to transcend just as one denied transcendence among the living.

When I think about Inferno in light of Purgatorio, the earlier Canticle strikes me not as identical with but at least related to the hermeneutics of suspicion that was the warp and woof of my own graduate education in literature.  To be sure, there were moments–glorious moments–when the texts were there to teach us, not to be dissected, but many of my encounters with literary text happened for the sake of debunking, of suspecting, and generally of unmasking what was “really” going on with the political and social agendas of those texts most frequently anthologized.

Dante helps me realize that such a practice is not entirely bad.  For Dante, seeing the wickedness of the ancients and of his own contemporaries unmasked means that his descent past the nether parts of Satan becomes, by virtue of his sinking as low as one may sink, an ascent to more Heavenly things.  And, to read Dante allegorically, such might be a good course of things for a student of literature and philosophy and such: when the student faces without obfuscation the darkness that lies at the heart of every human enterprise, what remains at least has a chance of being genuine hope.  (There’s always a chance, perhaps an inevitability, that we only thought we had reached the bottom and that we allowed ourselves to turn back rather than to plow through, but that’s why learning goes on past one’s formal education, no?)

But one distinction remains between the damned and Dante in the Commedia, and I wonder whether we make too little of that in graduate school.

Suspicion as a Road Beyond Suspicion

The damned will never take a single step past the unmasking of evil; for them, all that remains is evil in all of its ugliness.  For Dante, by Heavenly grace alone, to be sure, something lies beyond.  But it’s not Heaven, at least not immediately.

Purgatory challenges my own sense of ambiguity in the world by insisting, at every turn, on two realities: that human beings are authentically good and bad, better and worse than they could be; and that only a gift from Heaven will allow me to see what’s good and what’s bad.  Neither of those realities is comfortable, but both together constitute the structure of Purgatory.  Only after Dante moves through both of those realities, terrace after terrace, can he ascend, and only after the saved have come to desire what Heaven gives, and ultimately the Lord of Heaven, will they truly enjoy Heaven.  It’s a bit of an affront to those of us moderns who have internalized Hume’s dictum that beauty is a function of the one seeing an object, not of the object itself, to think that the seer needs to be transformed, but there it is.  And if one takes Dante seriously, the implication seems to be that anyone reading the poem isn’t there yet.

And that’s what takes me back to remembering graduate school.  All of my professors, on some level, seemed to have at least a sense that literary narratives and lyric poems and good novels might, for those who learn to be suspicious, carry us forward beyond the suspicion.  (Some of you readers might think I’m being naive again there.)  My suspicion is that, as a young graduate student, I just wasn’t ready to enter Purgatory yet.


The Christian Humanist Podcast, Episode #144: Allegory

Michial Farmer

David Grubbs holds forth with Nathan Gilmour and Michial Farmer about allegory, both as a mode of reading and as a literary genre. The debate hinges on what terms mean in which contexts: is a literary text defective because it’s an allegory, or are there good or bad allegories? Explore that and other questions with us.

“Against Allegory”

Our theme music this week is Radiohead’s “Packt Like Sardines in a Crushd Tin Box,” from 2001’s Amnesiac. I’m a reasonable man. Get off my case. Get off my case.

Christian Humanist Profiles 12: Structuralism, Modern Literature, and Christianity

Michial Farmer

Anyone who’s spent any time at all with the New Testament is familiar with the opening sentences of the Gospel of John: “In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things came into being through Him, and apart from Him nothing came into being that has come into being.” And anyone who’s spent any time at all in graduate studies of language and literature is familiar with another view of the word, and words. To quote Ferdinand de Saussure, the great Swiss structuralist linguist, “The linguistic sign unites, not a thing and a name, but a concept and a sound-image.” And in the next section, he says, “the linguistic sign is arbitrary,” meaning that no intrinsic relationship exists between the English word tree and the class of entities we use that word to refer to. Word is severed from thing. This is the foundation of structuralist linguistics and thus the foundation of much of twentieth- and twenty-first-century thinking.

Our guest today on Christian Humanist Profiles is Dr. Roger Lundin, the Arthur F. Holmes Professor of Faith and Learning at Wheaton College and the president of the Conference on Christianity and Literature. Dr. Lundin is the author of numerous books, including Literature Through the Eyes of Faith, a standard text for Introduction to Literature courses at Christian colleges. His latest book is Beginning with the Word: Modern Literature and the Question of Belief, which takes on the structuralist conception of language as a sign-system and proposes a different way of viewing language.

Anonymity, Fame, and Alienation

Michial Farmer

In 1966, Ralph Harper, the Episcopal priest and expositor of existentialism, found himself in the middle of the alienating twentieth century. Spiritual alienation, of course, existed long before 1966, and long before Harper’s “century of homelessness and exile, of nervous disorder and persecution, of actual enslavement and barbaric cruelty.” And yet the alienation of Harper’s age was different from that of previous centuries, in that it was bound up with the twentieth century’s peculiar anonymity. “To-day,” he writes, “isolation itself must be regarded as the chief symptom of the pressures on man. To-day isolation and anonymity are synchronous.” The mass age is the age of the faceless—and thus the age of estrangement.

It’s tempting to see this as a charmingly outdated analysis of a previously modern condition. After all, the internet has in some ways done away with anonymity altogether—social media broadcasts our images and opinions 24 hours a day; the government and the multinational corporations that work alongside it have almost unlimited access to our ostensibly private lives; and, according to several well-publicized polls over the last decade, Millennials are more interested in becoming famous than were the generations that preceded them. But these facts, especially the last of them, actually suggest that Harper’s age of anonymity and estrangement is as present today as in 1966—or even more so.

Harper, in fact, suggests that in an alienated era, the need for recognition becomes a means of self-verification: “Men have to be known by others so that they can be sure they know themselves; there are no objective means to evaluate what one is and what one does. Living has become so subjective that one must appeal to other subjects for a guarantee of one’s position.” Our obsession with celebrity—the sheer amount of time we give over to thinking about the lives of the rich and famous on the one hand or the “everyone’s a star” milieu of the Internet on the other—is, in Harper’s view, an indication of a weak personality. It suggests a person who is fundamentally unsure of himself, unable to ground his identity in anything solid. Ultimately, our hunger for fame is a kind of perverted drive for transcendence held by people with a diseased spiritual sense—and I include myself (a Millennial) in that category, since I’m as bound to the vicissitudes of low culture and to the longing for academic fame as anyone who’s gone through a PhD program in the humanities. I seek what we all seek on some level: I want to be published and to have the self-verification of recognition: I am bright; I am insightful; I have that identity, at least. The idea of publishing this essay anonymously fills me with terror.

The desire for recognition, for fame, is an identity-disease, or, to mix my metaphors, it is a cloak meant to cover the nakedness of the modern soul. Millennials may or may not have the disease worse than previous generations, but no era has ever been immune to the condition. In the seventeenth century, Andrew Marvell had to chastise his own tendency toward it, most notably in his poem “The Coronet”:

When for the thorns with which I long, too long,
With many a piercing wound,
My Saviour’s head have crowned,
I seek with garlands to redress that wrong.

Marvell, over the course of this remarkable poem, moves back and forth between pride and self-reprimand. He writes devotional poetry, in praise of Christ, and he naturally wants it to be of the very highest quality, befitting its subject—but the very act of service is a snare:

Alas! I found the Serpent old,
That, twining in his speckled breast,
About the flowers disguised, does fold
With wreaths of fame and interest.

This is the constant temptation for the artist (and for the academic, for that matter, although our heights of fame are even lower than those of the poet). In a world of instant celebrity—flash-paper celebrity, instantly ignited and immediately forgotten—even our virtues can be turned into vices. And any attempt to keep them as virtues will only make them more vicious, for success would be something to be proud of. Marvell’s solution is to turn back to devotion:

But thou who only couldst the Serpent tame,
Either his slippery knots at once untie,
And disentangle all his winding snare,
Or shatter too with him my curious frame,
And let these wither—so that he may die—
Though set with skill, and chosen with care;
That they, while thou on both their spoils dost tread,
May crown Thy feet, that could not crown Thy head.

This is also the solution posited by Flannery O’Connor in the best of her short stories, “Revelation,” in which Mrs. Turpin, a woman who has always prided herself on having the wits to do what needs to be done, has a vision of an afterlife in which the first are truly last: “They were marching behind the others with great dignity, accountable as they had always been for good order and common sense and respectable behavior. They alone were on key. Yet she could see by their shocked and altered faces that even their virtues were being burned away.” The answer to the problem of our misdirected longing toward fame, it seems, is a forced return to anonymity. Christ removes the crown from Marvell’s head and, just to demonstrate how paltry a thing celebrity is, puts it at His own feet, not even on His head. Mrs. Turpin watches as the things that make her gloriously herself are painfully and violently removed from her.

That the same solution is posed by Christian thinkers as diverse as Marvell and O’Connor suggests that it is not a matter of gender or of denomination or even of historical era. In fact, I don’t think it’s even necessarily a matter of religion. The indie folk band Fleet Foxes were getting at something very similar on their 2011 song “Helplessness Blues,” in which they come to terms with the fallout from our culture’s “everyone’s a star” mentality:

I was raised up believing I was somehow unique
Like a snowflake distinct among snowflakes, unique in each way you can be
And now after some thinking, I’d say I’d rather be
A functioning cog in some great machinery serving something beyond me

The members of Fleet Foxes, as far as I can tell, have no particular commitment to any organized religion, and yet their solution to the problem of the fame-drive is strikingly similar to Marvell’s and O’Connor’s. It is to restore the false transcendence of the fame-drive to a genuine transcendence—to move from the center of the stage to a position in the audience.

In other words, the solution to the problem of the fame-drive is to restore ourselves to a kind of anonymity. But this is not the brutal anonymity of the mass age, the sort of cold facelessness that exacerbates the fame-drive (though it does not create it). This is instead the anonymity of the devotee—Marvell’s laying the crown of thorns at Christ’s feet, or Mrs. Turpin’s long, painful march through Purgatory, or Fleet Foxes’ service as cogs in a larger machine (though this industrialized and mechanized image suggests a certain late-modern spiritual poverty, compared to my other two examples). It is a recognition that the personality of the artisan and even the quality of the art are not as important as the audience to whom the art is offered.

But this movement back into anonymity has a Kierkegaardian flavor to it. Kierkegaard famously admires Abraham not merely because he was willing to sacrifice Isaac when God demanded it but because he simultaneously “believed that God would not demand Isaac of him, while still he was willing to offer him if that was indeed what was demanded.” Thus he sacrifices Isaac and keeps Isaac at the same time—and thus Marvell sacrifices his glory to Christ and nevertheless ends up as one of the most famous poets of his century, and Fleet Foxes renounce their individuality, only to produce one of the most celebrated albums of the year. Not all of us can hope for such a result, of course, but Mrs. Turpin is a better model for us. Her virtues, her glory, perhaps even her personality are burned away from her—and yet presumably her vision will not end with her as a nameless face in the heavenly crowd but with her as herself, with a new identity given her by the God whom she pursues through the river of fire.

Again, then, the lesson taught by all these works is the renunciation of a false route to a phony transcendence—and with it, the renunciation of an identity we feel to be our own but which comes actually from the very crowd we’re hoping to escape. We must work for God, or for some other higher purpose; in so doing we will be given an identity that is as permanent as the thing we love, an identity that goes beyond mere recognition and thus actually transcends the anonymity of our age.

Gene Simmons, Capitalism, and Why I’m Torn about the Death of Rock

Nathan P. Gilmour

Gene Simmons: “Rock Is Finally Dead”

The Parable of the Madman from The Gay Science might seem a strange partner for a Gene Simmons interview, but the latter made me think of the former.  Of course, both of them deal with moments of cultural transition, the sorts of things about which historians argue and philosophers speculate and interview subjects pontificate.  But beyond that, both bring a striking metaphor, murder to be precise, to bear on phenomena that, in the way I normally think of things, aren’t vulnerable to murder.  And in both cases, I’m not sure what I should make of the metaphor, whether I should scoff along with the crowds or begin to write my dirge for the murdered.  Either way I don’t come out liking Gene Simmons any more than I did before I read the interview.  (A hint for the reader: that wasn’t much.)

A Culture Killed

If you’ve not listened to our podcast episode on the clause “God Is Dead,” you should, but I don’t want to make you listen to it before you finish this post, so I’ll provide the beginning of the text of the rightly-famous parable:

Have you not heard of that madman who lit a lantern in the bright morning hours, ran to the market place, and cried incessantly: “I seek God! I seek God!”—As many of those who did not believe in God were standing around just then, he provoked much laughter. Has he got lost? asked one. Did he lose his way like a child? asked another. Or is he hiding? Is he afraid of us? Has he gone on a voyage? emigrated?—Thus they yelled and laughed.

The madman jumped into their midst and pierced them with his eyes. “Whither is God?” he cried; “I will tell you. We have killed him—you and I. All of us are his murderers. But how did we do this? How could we drink up the sea? Who gave us the sponge to wipe away the entire horizon? What were we doing when we unchained this earth from its sun? Whither is it moving now? Whither are we moving? Away from all suns? Are we not plunging continually? Backward, sideward, forward, in all directions? Is there still any up or down? Are we not straying, as through an infinite nothing? Do we not feel the breath of empty space? Has it not become colder? Is not night continually closing in on us? Do we not need to light lanterns in the morning? Do we hear nothing as yet of the noise of the gravediggers who are burying God? Do we smell nothing as yet of the divine decomposition? Gods, too, decompose. God is dead. God remains dead. And we have killed him.

I found the Simmons interview fascinating because, unlike the madman in Nietzsche’s parable, who acknowledges his own part in things, Simmons seems to be saying that Rock is dead, and you have killed him–you but not I.  For Simmons, file-sharing software–and Napster serves as the synecdoche for a whole array of software–is the culprit in this murder case.  The verdict is simple and not examined for too long.

One needn’t look beyond those programs to see what killed rock ‘n’ roll.  It’s not the evolution of arena-rock, which funneled people who might have been in small clubs paying to see a larger number of bands into football stadiums, where everyone pays to see the same band.  It’s not the record industry, which replaced “live entertainment” (which used to be simply “entertainment”), which required each location to have its own band, with privately-accessible simulacra of musical performances listened to over headphones in isolation or, if in groups, via reproductions of performances rather than performances.  And certainly it’s not the cult of the international rock superstar, which gave singers license to be sexual predators for a few decades, casting them beyond the reach of the larger culture’s structures of accountability, and to become embarrassingly wealthy because of it, alienating the masses even as the masses can’t get enough of their guitar gods.  Nope, I’m pretty sure it all died with the advent of Napster.

Now I’ll go ahead and say here that not one of Gene Simmons’s responses surprises me.  Breadth of historical vision is not what makes someone the front-man of KISS, and the ability to acknowledge one’s own contribution to bad things in the world is not a curse I’ve ever seen Simmons burdened with.  But I do want to note here that Simmons laments that the kids learning to play guitar now will never be as big as Gene Simmons, and he blames Napster and other file-sharing services for that fact.

I wonder, though, whether Gene Simmons laments, or has ever lamented, the folks of his own generation who never became KISS.

After all, he seems quite concerned with the kid who’s fifteen now, plugging his guitar into an amp, who will not have the structure of the mid-twentieth-century music industry to propel his dreams.  But on the other hand, he seems unaware that kids in 1967 had just as little chance of making it in rock music, in those “iconic” times, as they do now.  To make a living playing rock music was still about as rare as making it in the NBA or onto the faculty of Harvard Medical School.  Many tried, few succeeded, and the rest of the folks moved on.  And what might be even more important, the rise of radio and television, and the subsequent ascendancy of institutions like Billboard‘s charts and ESPN,  likely caused many more aspiring musicians and actors and athletes to quit doing what they do than Napster ever did.

To put things another way, if you read Gene Simmons in the interview, you might get the impression that, at one point, many became “iconic” for recording music; then file-sharing came along, and now almost nobody does.  The reality, as I understand it, is that the existence of the “iconic,” on a national scale, itself reduced the number of people actually producing entertainment, shifting the media ecosystem towards passive consumers instead.  Or, to put it one more way, in Simmons’s golden age of the sixties and seventies, most of the English-speaking world remained among those who bought records with their day-job paychecks, and now a relatively smaller, though not much more minuscule, sliver of society can do that.

In other words, for most of us, no big loss.

As far as I can tell, Capitalism has been limiting the ability of rock fans to become rock stars since the beginning.  Those who pull the strings in whatever business model dominates (the record labels that Simmons lionizes as well as Apple and Amazon, the new dominant distribution channels for recorded music) know that people will buy the dream of stardom from whoever sells it, that there’s no money in making that democratic.  What Simmons holds up as the system that gave him the chance to be the face of KISS is the same system that only thrives if there aren’t too many creative-types getting big and thus diluting the market.

What I wonder is whether the proliferation of production software and distribution vehicles that came along with Napster, towards the turn of the millennium, might actually be the signs of a time to come that returns the Capitalist world to something more like life before arena rock, when bigger isn’t better any more and when “iconic” is something for which the youngsters mock their grandparents’ nostalgia.  My hunch, though it’s destined to remain a hunch, is that, a hundred years or so from now, there will be as much lamentation for the rock stars of the nineteen-seventies as there is now for the really grand Vaudeville personalities put out of work by the rise of radio and television.  Perhaps there will be graduate students presenting papers on why there was no Gene Simmons in the mid-twenty-first century, but there won’t be a great sense of cultural loss because of such things.

Living the Funeral of Rock

Now that doesn’t mean I won’t get nostalgic; after all, as I pass from my current middle age into the ranks of the grumpy old men (right now I’m just a grumpy man, thank you), I’ll be among those who remember being able to talk about a half dozen bands with anyone from the English-speaking world, knowing that folks from Texas and New York and Scotland would all have a basic notion of who the Beatles and the Stones are and why the fan-base tensions between the two are important.  I’ll remember my friends’ stories from big rock concerts (I don’t go to many, because I don’t like big crowds), the buzz that arose when a big show came to Atlanta or Indianapolis.  I’ll still remember watching music videos on MTV rather than on YouTube, for pity’s sake!

With all that, I think that this lament of the “murder of rock” sheds some light on Nietzsche’s parable.  To be sure, the Church had a longer shelf life than the rock star seems destined to have, but the sociological weight of one “murder” sheds light on the other.  Even Gene Simmons isn’t dumb enough to think that nobody will own guitars, play live shows at birthday parties, and perhaps even spawn university graduate programs in rock-and-roll composition-and-performance.  And certainly Nietzsche wasn’t dumb enough to think that there wouldn’t be any Christians after he died.  But both were noting a different sort of passing, a move from a world dominated by one sort of public event into a future in which the public was going to take on a very different cast.  Generations later, there still might be Christians, and generations later, there still might be rockers, but they’ll be of a different sort, a remnant that holds on, remembering and repeating what they hold to be a better way to be human.  The world will change, and so will the Christians and the rockers, and the interesting intersections will happen precisely where the followers of the old traditions articulate and live new ways to do so.

And like Nietzsche’s parable, Simmons’s interview makes me realize just how interesting and just how futile speculating about such a future can be.  Who knew, after all, that the decline of state Christianity in Europe would give way to the Anglican and the Pentecostal expansion of Christianity in the Global South?  That Christianity would rocket out of the tent revivals of Nietzsche’s day in North America and end up becoming megachurches?  That technologies as yet unimagined by Nietzsche would create the conditions in which something like our little Internet project would be intelligible?  My hunch is that, just as Vaudeville gave way to big-venue concert tours and club-based standup comedy, rock itself has already provided the seeds for whatever comes next.

The cool part, if you dig watching cultural trends as I do, will be to see where they sprout up.

The Christian Humanist Podcast, Episode #143: Proofs for God

Michial Farmer

Nathan Gilmour hosts a conversation about the five “proofs of God” from the opening sections of Thomas Aquinas’s Summa Theologiae.  Our discussion ranges over what a proof is for, whether the ontology in the proofs holds up post-Kant, whether reason and revelation can really be friends, and all sorts of groovy philosophical things.

Our intro music this week is “Dear God” by Monsters of Folk, from their 2010 self-titled album.

Christian Humanist Profiles 11: Marvin Wilson on Our Hebraic Heritage

Danny Anderson

For many Christians, their faith was born, wholly formed, 2000 years ago. The covenant established in the New Testament provides, for many, all the equipment that living the Christian life requires, and the certainties that come with this confidence are powerful and sometimes lead to an under-appreciation of the wisdom of the Old Testament. Is it possible that this tendency has had a detrimental effect on the life of the Christian mind?

Dr. Marvin Wilson of Gordon College suggests that too many Christians neglect the richness of their tradition and that this has had serious consequences for the depth of their faith experience. Wilson argues, in Exploring Our Hebraic Heritage, that Christianity does not begin with Jesus, but rather with Abraham, and that Christians can learn a great deal from the way Jews have contended with their faith over the centuries.

In this episode of Christian Humanist Profiles, Danny Anderson speaks with Dr. Wilson about the deep connection Christianity has with Judaism, and the lessons Christians might learn from Jewish theological traditions.