Late Work And Shying Away From Decay And Death

In ‘Late Francis Bacon: Spirit and Substance’, Colm Tóibín writes:

It would be easy to imagine…that Thomas Mann’s Death in Venice was written toward the end of his life. In fact, it was written in 1911, when Mann was thirty-six. It is a young man’s book; its images of desire, decay, and death could not be so easily entertained by a writer facing into late or last work.

Tóibín does not make clear why we would imagine that Death in Venice “was written toward the end of [Mann’s] life,” but be that as it may, of more interest here is the claim that an artist (of whatever stripe) would find it difficult to entertain “images of desire, decay, and death” in “late or last work.”

Tóibín has found himself making this claim, I suspect, as a way of pushing further the speculative query by Edward Said with which he opens this essay:

In his book On Late Style…Edward Said ponders the aura surrounding work produced by artists in the last years of their lives. He asks: “Does one grow wiser with age, and are there unique qualities of perception and form that artists acquire as a result of age in the late phase of their career?”…he also questions the very notion of late serenity: “What if age and ill health don’t produce the serenity of ‘ripeness is all’?”

Said further ponders…the sheer strangeness of Ludwig van Beethoven’s late string quartets and his last piano sonatas, their insistence on breaking with easy form, their restlessness, their aura of incompletion…the feeling that they are striving toward some set of musical textures that have not yet been imagined and cannot be achieved in Beethoven’s lifetime….these late pieces wish to represent the mind or the imagination not as it faces death but rather as it faces life, as it sets out to reimagine a life with new beginnings and new possibilities.

The obvious counterpoint to Tóibín’s claim is implicit in Said’s first query: as artists approach death, they, like other humans, find a new openness to the very idea of death and non-existence, and to the bodily decay that precedes it. An excellent example of this might be Roger Angell’s essay ‘This Old Man’, which begins with the following lines:

Check me out. The top two knuckles of my left hand look as if I’d been worked over by the K.G.B. No, it’s more as if I’d been a catcher for the Hall of Fame pitcher Candy Cummings, the inventor of the curveball, who retired from the game in 1877. To put this another way, if I pointed that hand at you like a pistol and fired at your nose, the bullet would nail you in the left knee. Arthritis.

Now, still facing you, if I cover my left, or better, eye with one hand, what I see is a blurry encircling version of the ceiling and floor and walls or windows to our right and left but no sign of your face or head: nothing in the middle. But cheer up: if I reverse things and cover my right eye, there you are, back again. If I take my hand away and look at you with both eyes, the empty hole disappears and you’re in 3-D, and actually looking pretty terrific today. Macular degeneration.

And so it goes on. Angell, of course, is writing an essay on old age; he is not working these images into his sports writing. What about other writers? I will rest content with another example: Philip Roth, whose late work was replete with such ‘images’; he never shied away from them. Comments with support or refutation for Tóibín’s claim welcome.

The New York Times’ Op-Ed Page Is An Intellectual Dark Web

The New York Times Op-Ed page has been an intellectual dark web for a long time. Few corners of the Internet can lay claim to both Thomas Friedman and David Brooks, two of the most widely ridiculed, mocked, and parodied ‘thought leaders’ ever to have deigned to grace us swine with their pearls of wisdom; so extensive and ubiquitous is the scorn sent their way, and so entirely self-unaware is this pair, that they continue to write on as before, oblivious to the fact that they are now parodying themselves. The Times’ Op-Ed page also includes Maureen Dowd, who slipped into irrelevance during the Bush years, and who now makes only periodic, pitiful attempts to show up on readers’ radars–sometimes by penning unhinged rants about clueless consumption of marijuana edibles in legal jurisdictions. Then there is Sophist-in-Chief-And-Apologist-For-Religion Ross Douthat, whose rambling, self-pitying pieces about the marginalization of conservative thought by remorseless liberals have also settled into their own familiar and head-scratching template: see, liberalism, you so mean, you just shot yourself in your own foot while you thought you was picking out distant conservative targets.

And then, we have Bari Weiss and Bret Stephens.

I must confess to knowing little about these two writers before they were promoted to their own space on one of the nation’s most prominent media platforms; the former apparently distinguished herself by attacking the academic freedom of Arab scholars to criticize Israel, the latter by cheerleading for the Iraq War. But their settling down into the boring, predictable output emanating from the New York Times Op-Ed page was rapid enough, and Weiss’ latest offering cements her own particular corner in that outpost: a paean to those intellectuals who have thrown their toys out of the pram because they are not being recognized–it remains entirely unclear by whom–for the intellectual revolutionaries they imagine themselves to be. Here they are: Jordan Peterson, Sam Harris, Ben Shapiro, Joe Rogan, etc. They have giant book deals, extensive media presence and connections, and YouTube channels and podcasts whose audiences run into the millions; indeed, you might even imagine them ‘thought leaders’ of a kind. Their ideas are, sadly enough, disappointingly familiar: sexism and racism and the wonders of the free market find scientific grounding here, as do dark imprecations about the conceptual connections between particular religions and social dysfunction, and so on. No matter: what really unites these ‘intellectuals’ is that they imagine themselves iconoclasts and pioneers and brave outsiders. And those writing about them imagine themselves to be similar intellectual heroes: they are, after all, speaking up on behalf of the rebels and outsiders and outliers.

A more depressing display of intellectual cluelessness cannot be imagined; the essay’s astonishing photo-spread, which showcases the various profiled ‘intellectuals’ in the act of getting caught peeing in the bushes, confirms this assessment. The ‘intellectuals’ profiled by Weiss are not on the margins; they are right at the center, and they aren’t keen to share the spotlight with anyone; an elementary examination of their cultural placement would reveal this fact rather quickly. It is hard to know how this pitch was first made by Weiss; it is equally hard to fathom the editorial reasoning that led to its approval and to its final finished form.

Before Weiss is alarmed by the scornful response to her piece, she should remember that she is not being ‘silenced,’ that her ‘essay’ was published at the New York Times, and that, despite the writerly incompetence on display, she is not being sacked. She’s right where she belongs: on the intellectual dark web.

Leaving Facebook: You Can Run, But You Can’t Hide

I first quit Facebook in 2010, in response to a talk Eben Moglen gave at NYU about Facebook’s privacy-destroying ways; one of his most memorable lines was:

The East German Stasi used to have to deploy a fleet of undercover agents and wiretaps to find out what people did, who they met, what they ate, which books they read; now we just have a bunch of Like buttons and people tell a data monetizing corporation the same information for free.

That talk–in which Moglen referred to Mark Zuckerberg as a ‘thug’–also inspired a couple of young folk, then in attendance, to start Diaspora, an alternative social network in which users would own their data. I signed up for Diaspora soon after it kicked off; I also signed up for Google+. I returned to Facebook in 2012, a few months after starting my blog, because it was the only way I could see to distribute my posts. Diaspora and Google+ never ‘took off’; a certain kind of ‘first-mover’ status, and its associated network effects, had made sure there was little social networking on those alternative platforms.

Since then, I’ve stayed on Facebook, sharing photos, bragging about my daughter and my various published writings, and so on. I use the word ‘bragging’ advisedly; no matter how much you dress it up, that’s what I’ve been doing. But it has been a horrible experience in many ways: distraction, lowered self-esteem, and envy have been but its most prominent residues. Moreover, to have substantive discussions on Facebook, you must write. A lot. I’d rather write somewhere else, like here, or work on my books and essays. So, I desperately want to leave, to work on my writing. But, ironically, as a writer, I feel I have to stay on. Folks who have already accomplished a great deal offline can afford to stay off; those of us struggling to make a mark, to be noticed, have to stay here. (Consider that literary agents now want non-fiction writers to demonstrate that they have a ‘social media presence’–a flourishing Facebook and Twitter following that will make the marketing of their writings easier.) I know, I know; as a writer, I should work on my craft, produce my work, and not worry about anything else. I know the wisdom of that claim; reconciling it with the practical demands of this life is an ongoing challenge.

So, let’s say ‘we,’ the user ‘community’ on Facebook, decide to leave, and we find an alternative social network platform. I’m afraid little will have changed unless the rest of the world also changes: the world in which data is monetized for profit, coupled with a social and moral and economic principle that renders all values subservient to the making of profit. The problem isn’t Facebook. We could migrate to another platform; sure. They need to survive in this world, the one run by capital and cash; right. So they need to monetize data; ours. They will. Money has commodified all relationships, including the ones with social network platforms. So long as data is monetizable, we will face the ‘Facebook problem.’

Steven Pinker Should Read Some Nietzsche For Himself

Steven Pinker does not like Nietzsche. The following exchange–in an interview with the Times Literary Supplement–makes this clear:

Question: Which author (living or dead) do you think is most overrated?

Pinker: Friedrich Nietzsche. It’s easy to see why his sociopathic ravings would have inspired so many repugnant movements of the twentieth and twenty-first centuries, including fascism, Nazism, Bolshevism, the Ayn Randian fringe of libertarianism, and the American alt-Right and neo-Nazi movements today. Less easy to see is why he continues to be a darling of the academic humanities. True, he was a punchy stylist, and, as his apologists note, he extolled the individual superman rather than a master race. But as Bertrand Russell pointed out in A History of Western Philosophy, the intellectual content is slim: it “might be stated more simply and honestly in the one sentence: ‘I wish I had lived in the Athens of Pericles or the Florence of the Medici’.”

The answers that Pinker seeks–in response to his plaintive query–are staring him right in the face. To wit, ‘we’ study Nietzsche with great interest because:

1. If indeed it is true that Nietzsche’s ‘ravings…inspired so many repugnant movements’–and these ‘movements’ have not been without considerable import–then surely we owe it to ourselves to read him and find out why they did so. Pinker thinks ‘it’s easy to see why,’ but surely he would not begrudge students reading Nietzsche for themselves to find out why? Moreover, Nietzsche served as the inspiration for a great deal of twentieth-century literature too–Thomas Mann is but one of the many authors to be so influenced. These connections are worth exploring as well.

2. As Pinker notes with some understatement, Nietzsche was a ‘punchy stylist.’ (I mean, that is like saying Muhammad Ali was a decent boxer, but let’s let that pass for a second.) Well, folks in the humanities–in departments like philosophy, comparative literature, and others–often study things like style, rhetoric, and argumentation; they might be interested in seeing how these are employed to produce the ‘sociopathic ravings’ that have had such impact on our times. Moreover, Nietzsche’s writings employ many different literary styles; the study of those is also of interest.

3. Again, as Pinker notes, Nietzsche ‘extolled the individual superman rather than a master race,’ which then prompts the question of why the Nazis were able to co-opt him in some measure. This is a question of historical, philosophical, and cultural interest–the kinds of things folks in humanities departments like to study. And if Nietzsche did develop some theory of the “individual superman,” what was it? The humanities are surely interested in this topic too.

4. Lastly, for the sake of his own credibility, Pinker should find a more serious history of philosophy than Bertrand Russell’s A History of Western Philosophy, which is good as a light read–it was written very quickly, as a popular work for purely commercial purposes, and was widely reviled in its time for its sloppy history. There is some good entertainment in there; but a serious introduction to the philosophers covered there can only begin with their own texts. If Pinker wants to concentrate on secondary texts, he can read Frederick Copleston’s Friedrich Nietzsche: Philosopher of Culture; this work, written by a man largely unsympathetic to Nietzsche’s views–one who indeed found him morally repugnant–still treats them as worthy of serious consideration and analysis. So much so that Copleston thought it worthwhile to write a book about them. Maybe Pinker should confront some primary texts himself. He might understand the twentieth century better.

An Ode To The Semicolon

I discovered semicolons in the fall of 1992. I had asked–on a lark of sorts–to read a term paper written by my then-girlfriend, who was taking a class in literary theory at New York University. In it, I noticed a ‘new’ form of punctuation; I had seen the semicolon before, but I had not seen it pressed so artfully into service. Here and there, my girlfriend had used it to mark off clauses–sometimes two, sometimes three–within a sentence; her placement turned one sentence into two, with a pause more pronounced than that induced by a comma. The two separated clauses acquired a dignity they did not previously possess; there was now a dramatic transition from one to the other, as opposed to the blurring, the running on, induced by the comma. I had not read writing like this before; it read differently; it spoke to a level of sophistication in expression that seemed aspirational to me. I immediately resolved to use the semicolon in my own writing.

And so I did; I plunged enthusiastically into the business of sprinkling semicolons over my writing; they sprang up like spring wildflowers all over my prose, academic or not. Like my girlfriend, I did not stop at a mere pair of clauses; triplets and sometimes quadruplets were common. Indeed, the more the merrier; why not just string all of them along?

Needless to say, my early enthusiasm for semicolon deployment necessitated a pair of corrections. (My girlfriend herself offered one; my ego was not so enlarged as to make me reject her help.) One was to use the semicolon properly–that is, to use it as a separator only when there were in fact separate clauses to be separated, and not just when a mere comma would have sufficed. The other, obviously, was to cut down just a tad on the number of clauses I was stringing together. Truth be told, there was something exhilarating about adding on one clause after another to a rapidly growing sentence, throwing in semicolon after semicolon, watching the whole dramatic edifice take shape on the page. Many editors of mine have offered interventions in this domain; I’ve almost always disagreed when their edits delete semicolons I’ve inserted, for to my mind the resulting sentences run together too much and turn out clunkier in the process.

I don’t think there is any contest; the semicolon is my favorite piece of punctuation. The period is depressing; it possesses too much finality. The comma is a poser; it clutters up sentences, and very few people ever become comfortable with, or competent in, using it. (I often need to read aloud passages of prose I’ve written in order to get my comma placement right.) The colon is a little too officious. (My ascription of personalities to punctuation marks comes naturally to a synesthete like me.) The semicolon combines the best of all three, typographically and syntactically. It looks good; it works even better. What’s not to like?

Virginia Woolf On Autobiography And Not Writing ‘Directly About The Soul’

In ‘Inspiration and Obsession in Life and Literature’ (New York Review of Books, 13 August 2015), Joyce Carol Oates writes:

[Virginia] Woolf suggests the power of a different sort of inspiration, the sheerly autobiographical—the work created out of intimacy with one’s own life and experience….What is required, beyond memory, is a perspective on one’s own past that is both a child’s and an adult’s, constituting an entirely new perspective. So the writer of autobiographical fiction is a time traveler in his or her life and the writing is often, as Woolf noted, “fertile” and “fluent”:

I am now writing as fast & freely as I have written in the whole of my life; more so—20 times more so—than any novel yet. I think this is the proof that I was on the right path; & that what fruit hangs in my soul is to be reached there…. The truth is, one can’t write directly about the soul. Looked at, it vanishes: but look [elsewhere] & the soul slips in. [link added above]

I will freely confess to being obsessed by autobiography and memoir. Three planned book projects of mine, each in varying stages of early drafting and note-taking, are autobiographical, even as I can see more similar ventures in the offing; another book, Shapeshifter: The Evolution of a Cricket Fan, currently contracted to Temple University Press, is a memoir; yet another book, Eye on Cricket, has many autobiographical passages; and of course, I write quasi-autobiographical, memoirish posts on this blog all the time. In many ways, my reasons for finding myself most comfortable in this genre echo Woolf’s: I find my writing within its confines to be at its most ‘fertile’ and ‘fluent’–if it ever approaches those marks at all; I write ‘fast’ and ‘freely’ when I write about recollections and the lessons learned therein; I find that combining my past sensations and memories with present and accumulated judgments and experiences results in a fascinating, more-than-stereoscopic perspective that I often find genuinely illuminating and revealing. (Writing memoirs is tricky business, as all who write them know. No man is an island and all that, and so our memoirs implicate the lives of others, as they must; those so implicated might not appreciate their inclusion in our imperfect, incomplete, slanted, agenda-driven, literary recounting of them. Still, it is a risk many are willing to take.)

Most importantly, writing here, or elsewhere, on autobiographical subjects creates a ‘couch’ and a ‘clinic’ of sorts; I am the patient and I am the therapist; as I write, the therapeutic recounting and analysis and story-retelling kicks off; the end of a writing session has, at its best, brought with it moments of clarity and insight about myself to the most important of quarters: moi. More than anything else, this therapeutic function of autobiographical writing confirms yet another of Woolf’s claims: that “one can’t write directly about the soul. Looked at, it vanishes.” Sometimes, one must look at the blank page, and hope to find the soul take shape there instead.


‘Prison Literature: Constraints And Creativity’ Up At Three Quarks Daily

My essay, ‘Prison Literature: Constraint and Creativity,’ is up at Three Quarks Daily. Here is an introduction/abstract:

In his Introduction to Hegel’s Metaphysics (University of Chicago Press, 1969, pp 30-31), Ivan Soll attributes “great sociological and psychological insight” to Hegel in ascribing to him the insight that “the frustration of the freedom of act results in the search of a type of freedom immune to such frustration” and that “where the capacity for abstract thoughts exists, freedom, outwardly thwarted, is sought in thought.”

In my essay I claim that the perspicuity of this “insight” of Hegel’s is best illustrated by a species of intellectual production intimately associated with physical confinement: prison literature. The list of this genre’s standout items–The Consolation of Philosophy and The Pilgrim’s Progress, for instance–is populated with luminaries–Boethius, Bunyan, De Sade, Gramsci, Solzhenitsyn, Jean Genet, etc. Here, constraint is conducive to creativity; the slamming shut of one gate is the prompt to the unlocking of another. For the prison writer, confinement may produce a search for “substitute gratification”–whether conscious or unconscious–and the channeling of the drive toward freedom into the drive for concrete expression of abstract thought. Where the freedom to act is not appropriately directed toward alternative artistic expression, it can become pathologically repressed instead (as the Nietzsche of The Genealogy of Morals indicated).

For the prison writer, freedom has changed from being a purely practical affair to one grounded in the act of writing. I explore this stance of the prison writer, its resonances with the perennial struggles of all writers, everywhere, and the truth of the claim–echoed in Hannah Arendt’s remarks about totalitarianism and in the Orwell of 1984–that those who place prisoners in solitary confinement are onto a vitally necessary piece of knowledge for oppressors: if confinement is to work as a mode of repression, it must aspire to totality. I explore this via a consideration of the relationship between repression and creativity–a general one, and the more specific variant to be found in Nietzsche and Freud.