Steven Pinker Should Read Some Nietzsche For Himself

Steven Pinker does not like Nietzsche. The following exchange–in an interview with the Times Literary Supplement–makes this clear:

Question: Which author (living or dead) do you think is most overrated?

Pinker: Friedrich Nietzsche. It’s easy to see why his sociopathic ravings would have inspired so many repugnant movements of the twentieth and twenty-first centuries, including fascism, Nazism, Bolshevism, the Ayn Randian fringe of libertarianism, and the American alt-Right and neo-Nazi movements today. Less easy to see is why he continues to be a darling of the academic humanities. True, he was a punchy stylist, and, as his apologists note, he extolled the individual superman rather than a master race. But as Bertrand Russell pointed out in A History of Western Philosophy, the intellectual content is slim: it “might be stated more simply and honestly in the one sentence: ‘I wish I had lived in the Athens of Pericles or the Florence of the Medici’.”

The answers that Pinker seeks–in response to his plaintive query–are staring him right in the face. To wit, ‘we’ study Nietzsche with great interest because:

1. If indeed it is true that Nietzsche’s ‘ravings…inspired so many repugnant movements’–and these ‘movements’ have not been without considerable import–then surely we owe it to ourselves to read him and find out why they did so. Pinker thinks it is ‘easy to see why,’ but surely he would not begrudge students reading Nietzsche for themselves to find out why? Moreover, Nietzsche served as the inspiration for a great deal of twentieth-century literature too–Thomas Mann is but one of the many authors to be so influenced. These connections are worth exploring as well.

2. As Pinker notes with some understatement, Nietzsche was a ‘punchy stylist.’ (I mean, that is like saying Muhammad Ali was a decent boxer, but let’s let that pass for a second.) Well, folks in the humanities–in departments like philosophy, comparative literature, and others–often study things like style, rhetoric, and argumentation; they might be interested in seeing how these are employed to produce the ‘sociopathic ravings’ that have had such impact on our times. Moreover, Nietzsche’s writings employ many different literary styles; the study of those is also of interest.

3. Again, as Pinker notes, Nietzsche ‘extolled the individual superman rather than a master race,’ which then prompts the question of why the Nazis were able to co-opt him in some measure. This is a question of historical, philosophical, and cultural interest–the kinds of things folks in humanities departments like to study. And if Nietzsche did develop some theory of the ‘individual superman,’ what was it? The humanities are surely interested in this topic too.

4. Lastly, for the sake of his credibility, Pinker should find a more serious history of philosophy than Bertrand Russell’s A History of Western Philosophy, which is good as a light read–it was written very quickly, as a popular work for purely commercial purposes, and was widely reviled in its time for its sloppy history. There is some good entertainment in there; but a serious introduction to the philosophers noted in it can only begin with their own texts. If Pinker wants to concentrate on secondary texts, he can read Frederick Copleston’s Friedrich Nietzsche: Philosopher of Culture; this work, written by a man largely unsympathetic to Nietzsche’s views–one who indeed finds him morally repugnant–still treats those views as worthy of serious consideration and analysis. So much so that Copleston thought it worthwhile to write a book about them. Maybe Pinker should confront some primary texts himself. He might understand the twentieth century better.

An Ode To The Semicolon

I discovered semicolons in the fall of 1992. I had asked–on a lark of sorts–to read a term paper written by my then-girlfriend, who was taking a class in literary theory at New York University. In it, I noticed a ‘new’ form of punctuation; I had seen the semicolon before, but I had not seen it pressed so artfully into service. Here and there, my girlfriend had used it to mark off clauses–sometimes two, sometimes three–within a sentence; her placement turned one sentence into two, with a pause more pronounced than that induced by a comma. The two separated clauses acquired a dignity they did not previously possess; there was now a dramatic transition from one to the other, as opposed to the blurring, the running on, induced by the comma. I had not read writing like this before; it read differently; it spoke to a level of sophistication in expression that seemed aspirational to me. I immediately resolved to use the semicolon in my own writing.

And so I did; I plunged enthusiastically into the business of sprinkling semicolons over my writing; they sprang up like spring wildflowers all over my prose, academic or not. Like my girlfriend, I did not stop at a mere pair of clauses; triplets and sometimes quadruplets were common. Indeed, the more the merrier; why not just string all of them along?

Needless to say, my early enthusiasm for semicolon deployment necessitated a pair of corrections. (My girlfriend herself offered one; my ego was not so enlarged as to make me reject her help.) One was to use the semicolon properly–that is, to use it as a separator only when there were in fact separate clauses to be separated, and not just when a mere comma would have sufficed. The other, obviously, was to cut down just a tad on the number of clauses I was stringing together. Truth be told, there was something exhilarating about adding on one clause after another to a rapidly growing sentence, throwing in semicolon after semicolon, watching the whole dramatic edifice take shape on the page. Many editors of mine have offered interventions in this domain; I’ve almost always disagreed with their edits when they deleted semicolons I had inserted in my writing. To my mind, the edited sentences ran together too much and became clunkier in the process.

I don’t think there is any contest; the semicolon is my favorite piece of punctuation. The period is depressing; it possesses too much finality. The comma is a poser; it clutters up sentences, and very few people ever become comfortable with, or competent in, using it. (I often need to read aloud passages of prose I’ve written in order to get my comma placement right.) The colon is a little too officious. (My ascription of personalities to punctuation marks comes naturally to a synesthete like me.) The semicolon combines the best of all three, typographically and syntactically. It looks good; it works even better. What’s not to like?

Jerry Fodor And Philosophical Practice

I wrote a short post on Facebook today, noting the passing of Jerry Fodor:

Much as I admired Fodor’s writing chops, I deplored the way he did philosophy. The stories of his ‘put-downs’ and sarcastic, ironic, ‘devastating’ objections, questions, or responses in seminars always left me feeling that this was not how I understood philosophy as a practice. The admiration all those around me extended to Fodor was a significant component in my feeling alienated from philosophy during graduate school. (It didn’t help that in the one and only paper I wrote on Fodor–in refuting his supposed critique of Quine’s inscrutability of reference claim–I found him begging the question rather spectacularly.) I had no personal contact with him, so I cannot address that component of him; all I can say is that from a distance, he resembled too many other academic philosophers: very smart folk, but not people I felt I could work with or for, or converse with to figure out things together.

In response, a fellow philosopher wrote to me:

[H]onestly that was my impression of Fodor also….while I too didn’t ever even meet him in person, I thought much of his rhetoric was nasty and unfair, that he routinely caricatured positions of others and then sort of pranced around about how he had totally refuted them, and that he basically ignored criticism…he was very far from what I would take to be a model for the profession….I got the impression that pretty much every other philosopher he mentioned was just a foil – produce a sort of comic book version of them to show how much better his view was.

There has been plenty of praise for Fodor on social media, much of which made note of precisely the style I pointed out above, albeit in admiring tones. In their obituary for Fodor, The London Review of Books paid attention to similar issues:

Jerry Fodor, who died yesterday, wrote thirty pieces for the LRB….Many of them were on philosophy of mind…more often than not, lucidly explaining how the books under review had got it all wrong….His literary criticism included a withering review of a pair of ‘amply unsuccessful’ novels about apes; and he had this to say of Steven Pinker’s view of Hamlet in his demolition of psychological Darwinism:

And here [Pinker] is on why we like to read fiction: ‘Fictional narratives supply us with a mental catalogue of the fatal conundrums we might face someday and the outcomes of strategies we could deploy in them. What are the options if I were to suspect that my uncle killed my father, took his position, and married my mother?’ Good question. Or what if it turns out that, having just used the ring that I got by kidnapping a dwarf to pay off the giants who built me my new castle, I should discover that it is the very ring that I need in order to continue to be immortal and rule the world? It’s important to think out the options betimes, because a thing like that could happen to anyone and you can never have too much insurance.

Unsurprisingly, this quote from Fodor was cited as a ‘sick burn’ on Twitter–as an example of his ‘genteel trash talk.’ But a moment’s reading of Pinker, and of Fodor’s response above, shows that Fodor is again operating at his worst here. The paragraph cited is a deliberately obtuse and highly superficial reading of Pinker’s claim. Do we have to think about the specific events in Hamlet in order to ponder the ethical dilemmas that the play showcases for us? Is this why people have the emotional responses they do to Hamlet? Or is it because they are able to recognize and internalize the intractability of the issues that Hamlet raises? Do we need to think specifically about rings, dwarfs, and giants in order to ponder the abstract problems that lie at the heart of the tale Fodor cites? Indeed, the many folks who have read these stories over the years seem–in their emotional responses–to have been perfectly capable of separating their concrete particulars from the concepts they traffic in. Fodor does not bother to offer a charitable reading of Pinker; he sets off immediately to scorn and ridicule. This kind of philosophy, and this kind of writing, earns plenty of applause from those who imagine philosophy to be a contact sport. But it does little to advance philosophical thinking on the issues at play.

The Bollywood War Movie And The Indian Popular Imagination  

In 1947, even as India attained independence from colonial subjugation, war broke out in Kashmir as guerrillas backed by Pakistan sought to bring it into the Pakistani fold. That war ended in stalemate after intervention by the UN. Since then, the fledgling nation of India has gone to war four more times: first, in 1962, against China–Jawaharlal Nehru’s darkest hour–a war that ended in a humiliating loss of territory and self-esteem, one that left Nehru a broken man and ultimately finished him off; then, in 1965, India and Pakistan fought their way to another inconclusive stalemate over Kashmir; in 1971, India fought a just war to bring freedom to the erstwhile East Pakistan, producing the new nation of Bangladesh in the process (war broke out on the western and eastern fronts in December 1971 and ended quickly as the Pakistan Army surrendered in Dacca two weeks later); finally, in 1999, India forced its old nemesis, Pakistan, back from the brink of nuclear war by pushing its troops off the occupied heights of Kargil. War is part of the story of the Indian nation; it continues to shape its present and its future. India, and its understanding of itself, has changed over the years; Bollywood has tried to keep track of these changes through its movies, in its own inimitable style. In a book project that I am working on, and for which I have just signed a contract with HarperCollins (India), I will examine how well it has succeeded in this task. (I have begun making notes for this book and anticipate a completion date of May 31st, 2018; the book will come to a compact sixty thousand words.)

In my book, I will take a close look at the depiction of war and Indian military history in Bollywood movies. I will do this by examining some selected ‘classics’ of the Bollywood war movie genre; by closely ‘reading’ these movies, I will inquire into what they say about the Indian cinematic imagination with regard to–among other things–patriotism, militarism, and nationalism, and how they act to reinforce supposed ‘Indian values’ in the process. Because Bollywood both reflects and constructs the self-image of India and Indians, this examination will also reveal the Indian popular imagination in these domains: how do Indians come to understand themselves and their nation through the Bollywood representation of war?

Surprisingly enough, despite India having waged these wars in the space of merely fifty-one years, the Bollywood war movie genre is a relatively unpopulated one; moreover, few of its movies have been commercial or critical successes. The Bollywood war movie is not an exemplary specimen of the Bollywood production; some of these movies did not rise to the level of cinematic or popular classics, though their songs often did. This puzzling anomaly is matched by the poor state of military history scholarship in India. My book aims to address this imbalance in two ways. First, by examining the Bollywood war movie itself as a movie critic might, it will show how these movies succeed or fail as movies qua movies, and as war movies in particular. (Not all Bollywood war movies feature war as a central aspect; some use it merely as a backdrop for the central character’s heroics, sometimes captured in typical Bollywood formulas of the romantic musical. This is in stark contrast to the specialized Hollywood war movie, of which there are many stellar examples.) Second, by paying attention to the place of these wars in Indian popular culture, I will contribute to a broader history of these wars and their role in the construction of the idea of India. Nations are sustained by dreams and concrete achievement alike.

After a brief historical introduction to Bollywood, I will critically analyze selected movies (Haqeeqat, 1971, Aakraman, Lalkaar, Border, Hindustan Ki Kasam, Hum Dono, Lakshya, LOC Kargil, Deewar (2004 version), Shaurya, Tango Charlie, and Vijeta), beginning with post-WWII classics and moving chronologically to more contemporary offerings. Along the way, I hope to uncover–in a non-academic idiom–changing ideas of the Indian nation and its peoples, and the Indian understanding of war and its relationship to Indian politics and culture as Bollywood has seen it. This book will blend cinematic and cultural criticism with military history; the wars depicted in these movies serve as the factual backdrop for their critical analysis. I will read these movies like texts, examining their form and content to explore what they teach us about Bollywood’s attitudes toward war: the effects of its violence on human beings, the role of violence in human lives, how romantic love finds expression in times of war, and how bravery, cowardice, and loyalty are depicted on screen. I will explore questions like: What does Bollywood (India) think war is? What does it think happens on a battlefield? Why is war important to India? What does Bollywood think India is, and why does it need defending from external enemies? Who are these ‘external enemies,’ and why do they threaten India? How does Bollywood understand the military’s role in India and in the Indian imagination? And so on.


A Modest Proposal To Cull The Human Herd

Feeding the elderly and the young, i.e., the economically unproductive, is a terribly wasteful, irrational enterprise–programs like Meals on Wheels and after-school lunches are but the most glaring instances of this catastrophically misdirected act of charity; acts like these will never produce any tangible, meaningful results like an increase in the Gross Domestic Product or the Gross National Product–indeed, the Gross Product of anything whatsoever. The elderly and the young merely consume resources, among which the most valuable of all is the time and attention of those who could otherwise be engaged in more useful and productive endeavors–all of which may be located in those zones of virtue and redemption, the workspace and the office of the corporation (not the public sector enterprise). Parents all too often have to turn their eyes away from useful work to attend to the plaintive cries of their useless children, while on the other end of the age spectrum, those same workers have to minister to their useless parents, who continue to occupy space, drink drinking water, eat edible food, and contribute to this planet’s terrible climate change situation by increasing our atmosphere’s carbon dioxide content. Children can at least be mildly amusing, while the elderly are anything but. Enough is enough; our civilization is at a genuine point of crisis.

Any strategy to ameliorate this state of affairs must begin with a recognition of our fundamental human nature: we are individuals, first and foremost. We are born free, radically independent of family and home and state; we die free, hopefully alone, all by ourselves. We take care of ourselves from the moment of our birth, tending to our needs with rugged solitary enterprise; we disdain the helping hand at every step. We feed ourselves, we clean ourselves, we clothe ourselves; we are pioneers of the spirit, heart, and mind. The company of other human beings is always an irritation, one tolerated only because we recognize them as potential consumers of the goods we will try to sell them at some point in the future. The care of others is a burden; we need little care as we grow up, and indeed receive none, so why should we extend our care outwards? We were left by the wayside at birth; so must we do unto others.

Faced with these incontrovertible facts about ourselves, a simple plan of action suggests itself for dealing with the problem of the too-young and the too-old: a gentle but firm shove over the edge. No more bleating for attention from the children; no more calls for assistance from the elderly. A population made up entirely of working-age adults is an economist’s delight; it should be our aspirational ideal, guiding our social and economic policies at every step; it should inform the moral instruction we provide to our child…er, each other. The qualms we might feel as we prepare to enact this policy are merely the vestiges of an archaic sensibility, one that must bow its head before the relentless logic of the economic enterprise and the moral demands it places upon us.

The Inseparability Of The Form And Content Of Arguments

Is it more important for philosophers to argue well than it is to write well? Posed this way, the question sets up a false dichotomy, for you cannot argue well without writing well. Logic is not identical with rhetoric, but the logical form of an argument cannot be neatly drawn apart from its rhetorical component. (Classical rhetoric has been insisting forever that we cannot separate form and content.) We define the validity and soundness of an argument in formal semantic and syntactic terms; and unsurprisingly, those notions find their greatest traction when evaluating arguments expressed in formal languages. But philosophical disputation takes place in natural languages; and arguments are made in order to persuade or convince, or to induce other changes in the epistemic make-up of our interlocutors.

We argue with someone, somewhere, in some time and context; we argue to achieve some end, whether moral, political, economic, or legal. Any evaluation of the arguments we make must take these factors into consideration; without them at hand, our evaluations are sterile and pointless. (Why, after all, do we concern ourselves with notions of epistemic justice if not for the fact that some arguments are more likely to be ‘heard’ than others?) Fallacies abound in natural language arguments; correcting them is not just a matter of paying attention to the abstract logical form of the argument ‘underlying’ the sentences we have deployed; it is a matter, too, of making sure we have chosen the right words and deployed them appropriately in the correct context. To use an example from an older post, we reject, on ad hominem grounds, a smoker’s argument that we should stop smoking; but the smoker really should have known better than to try to convince someone to quit while puffing away merrily and seemingly enjoying deep lungfuls of smoke. Good argument; terrible form. The same smoker would find a more receptive audience if he spoke with some feeling about how miserable his health has become over the years thanks to his smoking habit.

(On a related note, consider that when programmers evaluate ‘good code,’ they do so on the basis of not just the effective functionality of the code in accomplishing its task, which is a purely technical notion, but also on aesthetic notions: Is the code readable? Can it be modified easily? Is it ‘beautiful’? No programmer of any worth elides these notions in evaluative assessment of written code.)
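
(A minimal sketch of that parenthetical point, in Python; the function names and the example are invented for illustration. Both functions below are equally ‘functional’–they compute the same thing and pass the same tests–but most programmers would call only the second one ‘good code.’)

```python
# Two functionally identical ways to average the positive numbers in a list;
# they differ only in readability and ease of modification.

def f(x):
    # Terse and 'clever': correct, but the intent is buried.
    return sum([i for i in x if i > 0]) / max(len([i for i in x if i > 0]), 1)

def average_of_positives(numbers):
    """Return the mean of the positive entries in numbers (0.0 if there are none)."""
    positives = [n for n in numbers if n > 0]
    if not positives:
        return 0.0
    return sum(positives) / len(positives)

# Identical behavior, very different aesthetics:
assert f([3, -1, 5]) == average_of_positives([3, -1, 5]) == 4.0
```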

There is a larger issue at play here. Philosophers do much more than just argue; sometimes they just point in a particular direction, or make us notice something that we had not seen before, or clothe the world in a different form. These activities have little to do with arguing ‘correctly.’ They do, however, have a great deal to do with effective communication. Writing is one such form; so is speaking.

Note: The examples of great philosophers who are considered ‘terrible’ or ‘obscure’ writers–by some folks–do not diminish the point made here. Hegel and Heidegger–with due apologies to Hegel-and-Heidegger-philes–achieved their fame not just because of the quality or depth of the arguments they offered in their works but also because they wrote from particular locations, in particular times. (Some think they made terrible arguments, of course!) The sociology of philosophy has a great deal to say about these matters; more philosophers should pay attention to it.

Trump Campaign Rallies And Presidential Imagery

Donald Trump kicked off the 2020 election season with a campaign rally in Florida last night. These campaign rallies enable Trump to keep lines of communication–besides his Twitter account–open to his faithful; they rejuvenate his ego, one presumably battered by the endless ridicule heaped on him by his political opponents; they enable him to switch from his usual self-pitying moaning to his preferred mode of narcissistic boasting; they allow him to send out a message that will be faithfully amplified by a media eager for ‘newsworthy events’; he is, after all, the President.

If the staging of these rallies is any indication, they will supply a stream of rhetorically powerful imagery–the awesome paraphernalia of the American Presidency is now Trump’s to command–that will animate his public presence over the next four years. Trump is not just any ordinary candidate now; he is an elected President running for reelection, supported by a party which controls both houses of the legislative branch.

The American polity should have thought long and hard about how it has, over the years, allowed the pomp and circumstance of the Presidency to increase to levels resembling those of the monarchs of days gone by. Servant of the people? I think not. Those who occupied the Oval Office before Trump have left many loaded weapons lying around for him to use: the disregard of the legislative branch in declarations of war; disrespect of the judicial branch; and, of course, a wallowing in the perks and privileges of residency in the White House.

During the 2012 election season–in response to Charles Blow criticizing Mitt Romney for speaking ‘rudely’ to Barack Obama during a presidential debate–I made note here of how we seemed to have become excessively reverential of the presidency, and by association, of presidents too:

Blow feels the need to remind us, in a tone of reverential, devotional awe: ‘the president of the united states!’ Is he hoping to make us fall on our knees? This is the president, the unitary executive, the person put in place to ensure that a republic which would otherwise do just fine with a legislative branch also possesses an entity capable of making snap decisions. Why, then, the need for such excessive deference?

Blow is not alone in these constant provisions of reminders to respect and be suitably awed by the president and his office. The White House, the presidential galas, the gun salutes; these are archaic expressions of monarchical times gone by. But the president is a political leader; he has arisen from conflict; he presides over conflict. It’s acceptable to be in conflict with him and his office. The president can be disagreed with, he can be debated; he needs to explain himself and his actions like anyone else.  Disagreements with the president need not be confined to print, they can be verbal too. And when they are verbal, they can sound edgy (like most disagreements between adults are). ‘Déclassé and indecorous’? Dunno. Politics isn’t really the space for decorum.

Well, the indecorous are here, and they intend to use the presumption of respect to their fullest advantage.

Note: The perennial election season, a perpetual motion electoral machine, has been staring the American polity in the face, nipping at its heels, breathing down its neck–pick your favorite metaphor, and it works–for many years now. It is finally here. Talk of opposing Democratic candidates began on November 10th, 2016, and it won’t stop till November 3rd, 2020. Talk of the 2024 election will, of course, begin on November 4th, 2020. Trump filed papers as a candidate for the 2020 election on the day he was inaugurated. His filing was a deft political move:

Having filed…as a candidate, Trump would be able to coordinate with PACs and other similar organizations. More importantly, 501(c)(3) nonprofit organizations would no longer be able to engage in “political speech” which could theoretically affect the results of the 2020 U.S. Presidential Election without running the risk of losing their nonprofit status. The move effectively bars interest groups from creating nonprofits which they could funnel money into for the purposes of opposing Trump’s initiatives. This will likely create chaos for political opponents of Trump such as George Soros, who has sunk significant amounts of money into various nonprofit groups with the intent of opposing Trump’s government.