Snowpiercer: The Train As Capitalist Society And The Universe

Post-apocalyptic art–whether literature or film–is afforded, sometimes all too easily, ample opportunity for flirting with the grand, for making sweeping statements about human nature and the meaning and purpose of life. After all, it’s the (often violent) end of the world. Time to speculate about the new, phoenix-like world that may rise from the ashes of the old, or to mourn the loss of the not-easily-replaced intangible moral goods of a world gone missing. So it is unsurprising that Snowpiercer, Bong Joon-ho‘s English-language debut, swings for the fences. It is initially grounded in a facile morality play about global warming, man’s persistently failed and hubristic attempts to play God, and the evils of science. But there is more to it. The ark-like train that is the movie’s stage and centerpiece, circling a frozen-over world with its last survivors aboard, functions as an extended allegory for capitalist society: its economic inequality, its class structures and exploitation of the weak, its decadence and immorality, the inevitable revolt of the oppressed, and their rise to the top (or the front).

But Snowpiercer is more intellectually ambitious than that. It also aims to be an allegory that flirts with God, the Universe, Free Will, Evil, Freedom, and Existential Choice. It offers us commentary on ideology, false consciousness, and propaganda; it shows us how man may be tempted by evil and can choose moral redemption instead. The structure of the train, and the progressively enlightening journey of the rear passengers through its various compartments, suggest too that Dante’s Inferno and The Pilgrim’s Progress could be invoked here with some ease. (Ironically, for an allegory, Snowpiercer is sometimes a little too literal and heavy-handed. Yes, man, with all his cunning and scheming, can play the part of a devious God, and God may just be our notion of human powers and goodness and judgment extrapolated to an unimaginable extreme, but these theses can be advanced with a little more subtlety than Snowpiercer allows.)

Snowpiercer‘s grand ambitions are sometimes realized and sometimes not. Mostly they are not, because Snowpiercer spends a little too much time trying to be an action movie. Its showing off of technical virtuosity, its nods to the video game and martial arts genres with their extended bloodiness, their glorying in gore, their stylized slow-motion combat, are distractions and deceits. (The action initially provides an invigorating jolt to the movie’s plot, but all too soon it becomes tedious and deadening, and in the movie’s closing stages it is a distraction.) There is much in the movie that is visually interesting and provocative, much that should have been allowed to come to rest in the viewer, to facilitate reflection and introspection. But this does not happen, largely because the movie believes that some rather archaic cinematic tropes–physical conflict rages elsewhere while two protagonists engage in philosophical debate!–must be relied on to build and sustain tension.

All too often, I find myself describing science-fiction movies as missed opportunities. This is one such.

Cultural Associations Do Not Add Up

In reviewing Jonathan Lethem‘s Dissident Gardens (“Leftists in Jeopardy“, New York Review of Books, April 2014), Michael Greenberg writes:

Lethem’s impulse to display his knowingness, his “vernacular” expertise, as he calls it, his belief that “we’re surrounded by signs [and] our imperative is to ignore none of them,” engenders a narrative noise that drowns out the novel’s subtler chords. His characters become the sum total of their cultural associations, creatures of the zeitgeist, a form of determinism that, as determinism does, leaves little or no room for spontaneity and nuance. We know them by their era, their affiliations, the music they listen to, and the products they boycott, or acquire.

Greenberg may be right that such ‘narrative noise…drowns out the novel’s subtler chords.’ But I do not know if the fundamental anxiety he expresses, that the characters subjected to such treatment become entirely relational–the “sum total of their cultural associations…with no room for spontaneity or nuance”–is all that worrisome or even perspicuous.

These associations and affiliations are expressions of taste, evidence of choices. These choices may display the very ‘spontaneity’ and ‘nuance’ whose absence Greenberg is bemoaning. We might know these characters by ‘their era, their affiliations, the music they listen to, and the products they boycott, or acquire,’ but that does not mean the particular and peculiar way these are assembled by each individual may not be a ‘style,’ a distinct signature, all its own.

Greenberg seems to imagine such characters are entirely passive, merely bearing the impress of their cultures. But that would only be so on the assumption of, ironically enough, a certain ‘determinism’ on his part. These collections of ‘cultural associations,’ often very distinct from each other, present different breeding grounds for the various influences they subsequently encounter. Those interactions will often result in a quite unique character.

As but a trivial example, the temporal sequencing of these cultural adoptions may significantly affect the particular ‘sum total’; cultural choices and tastes do not follow some commutative law of addition. The teenager who discovers Slayer first and Black Sabbath later is very different from the one who listens to Black Sabbath first and finds Slayer later. The former finds his beloved thrashers have their provenance in classic heavy metal; the latter finds his beloved masters continue to live on in the homage paid them by contemporaries. The former may be tempted into an exploration of an older school of music; the latter may seek out other bands’ expressions of a signature style. Their resultant journeys are likely to be very different. Or, if you prefer a more exalted example, those who read military histories of the Second World War first, and only later read Herodotus, are likely to have a quite different reading experience from those who bring Herodotus to their reading of the Second World War histories.

Conformity is a genuine worry, but not quite in the way that Greenberg worries about it. The notion of a ‘sum total…of cultural associations’, in particular, strikes me as incoherent.

The Indian Non-Fan of Cricket

My latest post at The Cordon at ESPNcricinfo, about that supposedly mystical creature, the Indian non-fan of cricket, is up and running. Here is how the post concludes:

There is nothing essential about cricket’s place in the Indian imagination or sensibility; its position is not protected by any mystical guarantees of durability. It is a cultural activity, one with a history of contingencies propping it up; it must compete for time and attention and emotional investment with all the other offerings of this variegated world. Perhaps, almost unimaginably, it will recede from Indian shores, leaving behind some archaeological traces of its once iron-clad hold on that land.

Re-Reading Cormac McCarthy’s ‘The Road’

I’m re-reading Cormac McCarthy‘s The Road in preparation for discussing it with my students next week. It has been an interesting experience.

First, I am struck by how new the book seems on this second reading. I first read it a year ago, and yet its prose seems just as pristine. There is some familiarity in the narrative, in the landscape the Man and the Boy traverse, and in the central tragedy of their impoverished and stark existence, but the words read as new all over again. They have not lost–in the slightest–their power to evoke wonder, pity, fear, and sorrow. Indeed, I’m struck by how much more I notice on my second journey through this bitter, desolate land (beginning with the dream on page one of “the great stone room where lay a black and ancient lake,” on whose “far shore” is a “creature that raised its dripping mouth from the rimstone pool and stared into the light with eyes dead white and sightless as the eggs of spiders.”)

Second, my first reading occurred after I had already seen John Hillcoat‘s cinematic adaptation. Then, my appreciation of the novel’s imagery was significantly shaped by that of the movie; I could not help but invoke the movie in my mind’s eye as I read the book. It made for an often jarring experience. Now, a year or so has passed since I last saw the movie, and the novel stands on its own. And I dare say its brilliance is even more acutely on display. I’m more aware of the poetic sparseness and harshness of its language, its deployment of wondrous words and sentences (“lozenges of stone veined and striped”), and the interiority of the Man, his bitter railings, his love and anger. Then, I had read the novel to see what the movie was based on; now, as I read it, I have nothing else in mind. My attention is commanded entirely by the novel; its spells are ever more efficacious.

Third, I am not reading alone. My students are reading it with me, even if they are not physically present at the moment. Because I’m reading the book in preparation for a classroom discussion, I’m actively and aggressively marking up passages to bring up in class. I am looking, keenly, for its suggestions and symbolisms and allegories; I am thinking anew about the meanings of even the briefest dialogues. I’m anticipating my students’ reactions to each passage, wondering whether one of them will comment on the same line I have marked up in the margins, whether they will be horrified, saddened, and enthralled as I was a year ago, and as I am now. Because I will be bringing my reading of the book to them, and hope they will do the same, my reading is tinged with a sense of foreboding: yet another encounter with it awaits me. Who knows what my students’ backgrounds and lives will produce in their meetings with the text? And perhaps our joint discussion will throw up ever newer meanings and interpretations.

Truly, the classics pay rich dividends for our persistent devotions.

Parable of the Sower: Octavia Butler’s Parable

Octavia Butler‘s Parable of the Sower, the richly symbolic and subversive story of Lauren Olamina, a prophet in the making, one finding her voice and her people in the midst of an America whose social order is collapsing around her, grows on you. The story line is sparse: the US’s accumulated social, political, and environmental dysfunctions have grown out of control thanks to a myopic, complacent populace; some fortunate families shelter in gated communities while urban war rages outside; rape, murder, and mayhem run rampant; a young girl, convinced she has devised a new religion, Earthseed–whose central principles are that ‘God is Change’ and can be ‘shaped’–finds her sheltered life within these walls untenable, and leaves after yet another attack on them convinces her it is more dangerous inside than outside. From that point on, she accumulates a small band of fellow travelers and heads north to possible safety. On the way she finds further gruesome evidence of the end of the old world, and dreams about a new one. The haven promised them turns out to be a burnt-out shelter, the larger world on a smaller scale–but she chooses to drop anchor and get to work on it. (I’ve not read the sequel, Parable of the Talents, yet, but I intend to. Parable of the Sower was written in 1993, and it sets its action in the years 2025-2027. Though such speculation is perhaps inevitable, I suspect it is beside the point to wonder whether its vision of a collapsing US is on the mark. The real story lies elsewhere.)

Parable of the Sower is subversive because its prophet is a young black girl, not an old white man. She is wise beyond her years. She is sexually active with young and old men alike; she can be harsh and soft. She is scientifically literate. She is hyperempathic–she can literally feel the pain of others. (This is a dangerous ‘blessing’ in a world with so much pain, but Lauren comes to learn its limits and to live with it.) She is tough and resourceful and clever; we come to admire her as her dangerous journey progresses. We do not normally associate these qualities with people meeting Lauren’s description–not in this society anyway, with its dominant stereotypes and ideological frames of understanding. For this character alone, Parable of the Sower would have been an interesting and enlightening read.

But there is more. Earthseed seems a little new-ageish, but teasing out some of Lauren’s pronouncements enables an understanding of it as a kind of existentialist creed, one grounded in a richly interactionist, embedded, dynamic view of man and nature and cosmos. Heaven and hell are found here, around us, made by us, shaped by our actions; the old religions shrouded them in mystery, but we live in them every day. (The shrewd prophet uses that old word ‘God’ to make her religion easier to follow for those accustomed to the old anthropomorphic deities.) In a world headed for hell in a handbasket this religion offers no solace and facilitates no finger-pointing; the blame is ours, but so may be the rewards for reconstructing that world. No creed can, or should, offer more.

Matthew Arnold On Inequality

In his 1879 essay ‘Equality,’ Matthew Arnold wrote about inequality too:

What the middle class sees is that splendid piece of materialism, the aristocratic class, with a wealth and luxury utterly out of their reach, with a standard of social life and manners, the offspring of that wealth and luxury, seeming utterly out of their reach also. And thus they are thrown back upon themselves–upon a defective type of religion, a narrow range of intellect and knowledge, a stunted sense of beauty, a low standard of manners. And the lower class see before them the aristocratic class, and its civilization, such as it is, even infinitely more out of their reach than out of that of the middle class; while the life of the middle class, with its unlovely types of religion, thought, beauty, and manners, has naturally, in general, no great attractions for them either. And so they too are thrown back upon themselves; upon their beer, their gin, and their fun. Now then, you will understand what I meant by saying that our inequality materialises our upper class, vulgarises our middle class, brutalises our lower class.

And the greater the inequality the more marked is its bad action upon the middle and lower classes….

[O]ur aristocracy…is for the imagination a singularly modern and uninteresting one. Its splendor of station, its wealth, show, and luxury, is then what the other classes really admire in it; and this is not an elevating admiration. Such an admiration will never lift us out of our vulgarity and brutality, if we chance to be vulgar and brutal to start with; it will rather feed them and be fed by them….our love of inequality is really the vulgarity in us, and the brutality, admiring and worshipping the splendid materiality.

[Matthew Arnold: Selected Essays, edited with an introduction by Noel Annan, Oxford University Press, 1964]

Arnold does not speak here of rage, outward or inward directed, but he might as well have. For there is a black envy here, in his mention of an ‘admiration’ that is not ‘elevating’ but that instead ‘feeds’ and is ‘fed’ by ‘vulgarity’ and ‘brutality.’ This corrosion of sensibilities that inequality produces–all the more acute as the inequality grows more pronounced–cannot be anything but a destabilizing force, one that may not be restrained for too long.

In some cultures it is said staring at someone eating brings bad luck to the person eating. The watcher is urged to show some manners; the eater turns away to consume in peace. A pair of hungry eyes looking at sustenance denied them cannot but ruin the appetite of those conscious of their gaze. Matters, no doubt, are infinitely worse when the food on the plate has been stolen from those watching, when they have been forced to serve it up with their own hands.

The converse of such a superstition, of course, is that the ostentatious consumer of food denied to others reminds them of their misfortune, rubs their faces in it. He runs the risk, too, of having his plate snatched out of his hands.

Not Working While Working

Roland Paulsen has an interesting essay over at The Atlantic on not working while working. Shirking, slacking, ‘pretending to add value,’ not having enough to do, boring work, ‘meaningless’ work–whatever it is, whatever the reason–there’s a whole lot of not working while working going on. And yet, we continue to ‘work’ more and more, with increasingly reduced time for families and leisure. The cruelest of all ironies: we have to remove ourselves from scenes of leisure and familial ties, take ourselves elsewhere, and then, not work.

I often don’t work while working, mainly because I’m distracted by social media, email, and just plain old Internet-centered procrastination–as I have complained here. But there was a time, once, when I wanted to work while working and couldn’t, because someone else wanted to do my work.

In 1998, in an effort to earn a little money that would let me work on my dissertation without having to spend a lot of time teaching for peanuts in CUNY’s adjunct-exploiting system, I decided to take six months off and work in New York City’s financial sector, doing UNIX system administration. Jobs were a dime a dozen, the Internet gold rush was on, and I found a gig at an online brokerage within three days of applying. (I stupidly asked for too little, of course.)

In any case, once I signed up, I found myself assigned as backup to an older system administrator. He would show me the ropes and I would assist him on all tasks. I quickly realized my colleague had been made extremely nervous by my hiring. He was convinced–thanks to his years in the aerospace industry–that his head was on the proverbial chopping block and that once he had finished ‘training’ me, he would be fired. (Asking around behind the scenes, I was told that no such plan was in the works, but all reassurance to this effect failed to comfort my co-worker.)

So, all too quickly, I realized that any work assigned us as a pair would be done by him alone. All too often, my co-worker would tell me to ‘relax’, saying he could take care of it himself. Once done with it, he would report to our supervisor, informing him in great detail just how efficiently he had accomplished his objectives. (He also insisted on keeping the emergency beeper with him at all times; I was only too happy to let him hold on to it.) I would sometimes accompany him as he went about these chores but soon enough I gave up even that pretense and retired to my desk to browse, drink coffee, and chat with my neighbors. I knew there was little danger of my being fired; my employers wanted a pair of system administrators on duty at all times, and there was little chance they would let me go in the job market that existed then, which featured a shortage of folks with my ‘skills.’ As before, this fact did not make a dent in my co-worker’s anxieties. Truth be told, there was something pitiful about it all.

And so it went. I would show up on time at work, convey a reasonable impression of being occupied, take long lunch and coffee breaks, attend meetings, and all of the rest. A few months later, when according to my calculations I had earned enough to take care of my living expenses for a few months of teaching-free dissertation writing, I handed in my resignation.

In six months, I had barely worked the equivalent of two weeks.

Chronicle Of A Cryptic Reminder

Sometimes I scribble little notes to myself–mostly on pieces of paper, but increasingly, on a little electronic notepad on my smartphone. Sometimes they are prompted by observations made while walking, sometimes by a passage read in a book, sometimes by a scene in a movie. Sometimes they make sense when I return to them a little later, and an expanded thought based on them finds its way into my writing–perhaps here on this blog, or elsewhere. But sometimes, when I look at them, they make little or no sense. I have no idea what prompted them, and they find their way into a physical or virtual wastebasket. This forgetfulness stems, in part, from their provenance. When I write them down, I am possessed by a panic that the momentary thought will disappear, leaving no trace behind. So, cutting corners, I rush to commit it to permanence. Haste makes waste indeed. Or rather, anxiety does.

Here is one such cryptic missive: “A few years after I became older than my father, I met him again.” I do not know what this means; I do not know what inspired this line. But let me see if I can make some sense of it.

I am now older than my father. I became older than him in September 2010. Indeed, if I could be bothered to do so again, I could calculate the exact date on which I did so. It had been a little obsession of mine–this going past my father in the chronological stakes–in the days and months leading up to it. It marked a strange supersession: I had racked up more days on this planet than he ever managed. Once I passed that magic marker, every day beyond it bore a curious stamp: it was associated with an age my father had never experienced. (It is for this reason, of course, that I cannot imagine my father as an old man; he always appears as a young man in the images I associate with him, even though he began balding prematurely in his late thirties.) So, now, even as I experience some of the same bodily and mental changes he underwent in his forties, I can also surmise, optimistically, that I will have some experiences he never did.

But I have no idea what “I met him again” means. Perhaps he had starred in one of my dreams, appearing as he normally does, a shadowy figure, not quite distinct, all too easily morphing into the background, becoming hidden again, a quick vision, a quick obscurity. Or perhaps I meant it more figuratively, as an encounter with something I associate indelibly with him: Russian literature, military aviation, or his gloomy prognostication, made back in the seventies, that the coming century would see increasingly bloody and intractable conflict between the haves and the have-nots.  Or perhaps, it might just be that one day I looked at my daughter and saw some distinctive feature which announced her genetic inheritance. At that moment, perhaps, I saw, just for an instant, my father all over again.

A Rankings Tale (That Might Rankle)

This is a story about rankings. Not of philosophy departments, but of law schools. It is only tangentially relevant to the ongoing debate in the discipline about the Philosophical Gourmet Report. Still, some might find it of interest. So, without further ado, here goes.

Half a dozen years ago, shortly after my book Decoding Liberation: The Promise of Free and Open Source Software had been published, and after I had begun work on developing the outlines of a legal theory for artificial intelligence, I considered applying to law school. For these projects, I had taught myself a bit of copyright, patent, and trade secret law; I had studied informational privacy, torts, contracts, knowledge attribution, and agency law; but all of this was auto-didactic. Perhaps a formal education in law would help my further forays into legal theory (which continue to this day). Living in New York City meant I could have access to some top-class departments–NYU, Columbia, Yale–some of whose scholars would also make for good collaborators in my chosen field of study. I decided to go the whole hog: the LSAT and all the rest. (Yes, I know it sounds ghastly, but somehow I overcame my instinctive revulsion at the prospect of taking that damn test.)

An application to law school requires recommendation letters. I anticipated no difficulty with this. I knew a few legal scholars–professors at law schools–who were familiar with my work, and I hoped they would write letters for me, perhaps describing the work I had produced up until that point. The response was gratifying; my acquaintances all said they’d be happy to write me letters. I went ahead with the rest of my application package, even as I had begun to feel that law school looked like an impractical proposition, thanks to its expense. Taking out loans would have meant a second mortgage, and that seemed a rather bizarre burden to take on.

In any case, I took the LSAT. I did not do particularly well. I used to be good at standardized tests back in my high school and undergraduate days, but not any more. My score was a rather mediocre 163 (in the 90th percentile), clearly insufficient for admission to any of the departments I was interested in applying to. Still, I reasoned, perhaps the admissions committees would look past that score. Perhaps they’d consider my logical acumen adequately demonstrated by my publications in The Journal of Philosophical Logic; perhaps a doctorate in philosophy would show evidence of my ability to parse arguments and write; and I did have a contract for a book on legal theory. Perhaps all that would outweigh this little lacuna.

One of my letter writers, a professor at Columbia Law School, invited me to have coffee with him to talk about my decision to go to law school. When we did so, he told me he had written me an excellent letter but he wondered whether law school was a good idea. He urged me to reconsider my decision, saying I would do better to stay on my auto-didactic path (and besides, the expenses were not inconsiderable). I said I had started to have second thoughts about the whole business and had not yet made up my mind. He then asked me my LSAT score. When I told him, he guffawed: I did not stand a snowball’s chance in hell of getting into the departments I was interested in. But, surely, I said, with a letter and a good word from you, and my publication record, I stood a chance. He guffawed again. Let me tell you a story, he said.

A few years prior, he had met a bright young computer science student, a graduate of a top engineering school with an excellent GPA, who had wanted to study law at Columbia. He was interested in patent law, and had–if I remember correctly–even written a few essays on software patents, mounting a critique of existing regimes and outlining alternatives to them. He had asked my current interlocutor to write him a recommendation letter for Columbia. There was just one problem: his LSAT score was in the low 160s. Just like mine, not good enough for Columbia. Time to talk to the Dean, to see if perhaps an exception could be made in his case. The Dean was flabbergasted: there was no way such an exception could be made. But, my letter writer protested, this student fit the profile of an ideal Columbia Law student, especially given his interests: he had a stellar undergraduate record in a relevant field, he had shown an aptitude for law, and he had overcome personal adversity to make it through college (his family had immigrated to the US from a former Soviet republic a few years earlier, after suffering considerable economic hardship). Couldn’t an exception be made in this case?

The Dean listened with some sympathy but said his hands were tied. Admitting a student with such an LSAT score would do damage to their ‘LSAT numbers’–the ones US News and World Report used for law school rankings. Admitting a student with an LSAT score in the low 160s would mean finding someone with a score in the high 170s to make sure the ‘LSAT numbers’–their median value, for instance–remained unaffected. God forbid, if the ‘LSAT numbers’ were hit hard enough, NYU might overtake Columbia in the rankings the next year. The fate of a Dean who had allowed NYU to slip past Columbia in the USNWR rankings did not bear thinking about. Sorry, there was little he could do. Ask your admittedly excellent student to apply elsewhere.
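
The Dean’s arithmetic is easy enough to verify. Here is a minimal sketch in Python–using invented scores, not anyone’s actual admissions data–of how one low-scoring admit drags a class’s median down, and how pairing it with a single high scorer restores it:

    from statistics import median

    # Hypothetical admitted-class LSAT scores -- an illustration only.
    admitted = [168, 170, 171, 172, 174]
    print(median(admitted))               # 171

    # One admit in the low 160s pulls the median down...
    print(median(admitted + [162]))       # 170.5

    # ...unless offset by an admit in the high 170s.
    print(median(admitted + [162, 178]))  # 171

The offsetting high scorer has to be found somewhere, of course; hence the Dean’s insistence that admitting one student in the low 160s meant finding another in the high 170s.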

Nothing quite made up my mind not to go to law school like that story did. Still, my application was complete; test scores and letters were in. So I applied. And was rejected at every single school I applied to.

Vampire, Vampire, Burning Bright

When I was ten years old or so, my father and I went to visit an old friend of his at his sprawling home. While they chatted in the living room, I went wandering around the house, looking for books to browse through. (I had asked for, and had been granted, permission to do so; my father knew it was the best way to keep me occupied while the adult conversation proceeded.) In a bedroom upstairs, I found a bookshelf and began looking through its offerings. I quickly found one title that looked good for a read: Vampires. A pair of bloodstained fangs adorned the cover, and the book featured a collection of plates mid-volume.

This was no pop book; this was a serious study of the vampire phenomenon in cultural history. I read the first few pages, grimly fascinated by the descriptions of the undead, nocturnal, blood-sucking creatures that had so terrified and excited the human imagination through the ages. I had seen Dracula: Prince of Darkness a year or so before (somehow, my parents had allowed me to do so); I knew of vampires and their properties. But I had not realized they had been such a perennial fascination across cultures and time. I spent, as can be imagined, more time on the black-and-white plates than on actual reading–there were enough gruesome drawings and depictions of various forms of the vampyric to send chills down my spine as I sat there, alone, in that bedroom, engrossed and horrified in equal measure.

Soon, I was called downstairs by my father. Our visit was over. I was relieved and disappointed in equal measure. But I did not ask to borrow the book, to continue reading it. Something about it had made me deeply uneasy.

Later that evening, I went out to play soccer in the local park and stayed out late, kicking around with my mates till dusk fell. At that point, with the sun setting and visibility increasingly poor, we reluctantly called it a day and headed home. I walked down my lane, ready for a shower and then dinner with my parents, went up the stairs to our second-level home, and rang the bell.

No one answered. I rang the bell again. Still no answer. I realized, suddenly, that our normally well-lit living room, clearly visible through the sliding glass door, was dark, utterly so. So was the bedroom my brother and I shared. No one was home. I was alone, standing on our darkened balcony.

At that moment, every single image I had seen earlier that day came flooding back, jostling for attention. Every fear that had remained hidden, that I had struggled to keep latent during the day, suddenly made itself manifest. This dark balcony was no longer familiar; its corners crawled with menace.

I burst into tears; I was utterly, totally, terrified. I cursed myself for having exposed my vision and my imagination to those depictions of bloody teeth, cowering victims, and caped creatures of the night. I was convinced my death and damnation were at hand.

A few seconds later, my mother, who had heard my bawling from the kitchen at the back of the house, where she had been cooking the night’s dinner, came running out. Salvation was at hand.

I’ve never been as scared since. By the supernatural at least.