Steven Pinker Should Read Some Nietzsche For Himself

Steven Pinker does not like Nietzsche. The following exchange, in an interview with the Times Literary Supplement, makes this clear:

Question: Which author (living or dead) do you think is most overrated?

Pinker: Friedrich Nietzsche. It’s easy to see why his sociopathic ravings would have inspired so many repugnant movements of the twentieth and twenty-first centuries, including fascism, Nazism, Bolshevism, the Ayn Randian fringe of libertarianism, and the American alt-Right and neo-Nazi movements today. Less easy to see is why he continues to be a darling of the academic humanities. True, he was a punchy stylist, and, as his apologists note, he extolled the individual superman rather than a master race. But as Bertrand Russell pointed out in A History of Western Philosophy, the intellectual content is slim: it “might be stated more simply and honestly in the one sentence: ‘I wish I had lived in the Athens of Pericles or the Florence of the Medici’.”

The answers that Pinker seeks–in response to his plaintive query–are staring him right in the face. To wit, ‘we’ study Nietzsche with great interest because:

1. If indeed it is true that Nietzsche’s ‘ravings…inspired so many repugnant movements,’ and these movements have not been without considerable import, then surely we owe it to ourselves to read him and find out why they did so. Pinker thinks ‘it’s easy to see why,’ but surely he would not begrudge students reading Nietzsche for themselves to find out? Moreover, Nietzsche served as the inspiration for a great deal of twentieth-century literature too–Thomas Mann is but one of the many authors to be so influenced. These connections are worth exploring as well.

2. As Pinker notes with some understatement, Nietzsche was a ‘punchy stylist.’ (I mean, that is like saying Muhammad Ali was a decent boxer, but let’s let that pass for a second.) Well, folks in the humanities–in departments like philosophy, comparative literature, and others–often study things like style, rhetoric, and argumentation; they might be interested in seeing how these are employed to produce the ‘sociopathic ravings’ that have had such impact on our times. Moreover, Nietzsche’s writings employ many different literary styles; the study of those is also of interest.

3. Again, as Pinker notes, Nietzsche ‘extolled the individual superman rather than a master race,’ which then prompts the question of why the Nazis were able to co-opt him in some measure. This is a question of historical, philosophical, and cultural interest; the kinds of things folks in humanities departments like to study. And if Nietzsche did develop some theory of the “individual superman,” what was it? The humanities are surely interested in this topic too.

4. Lastly, for the sake of his own credibility, Pinker should find a more serious history of philosophy than Bertrand Russell’s A History of Western Philosophy, which is good as a light read–it was written very quickly, as a popular work for purely commercial purposes, and was widely reviled in its time for its sloppy history. There is some good entertainment in there, but a serious introduction to the philosophers discussed in it can only begin with their own texts. If Pinker wants to concentrate on secondary texts, he can read Frederick Copleston’s Friedrich Nietzsche: Philosopher of Culture; this work, written by a man largely unsympathetic to Nietzsche’s views, one who indeed finds them morally repugnant, still treats them as worthy of serious consideration and analysis. So much so that Copleston thought it worthwhile to write a book about them. Maybe Pinker should confront some primary texts himself. He might understand the twentieth century better.

No, Aristotle Did Not ‘Create’ The Computer

For the past few days, an essay titled “How Aristotle Created The Computer” (The Atlantic, March 20, 2017, by Chris Dixon) has been making the rounds. It begins with the following claim:

The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.

Dixon then goes on to trace this ‘history of ideas,’ showing how the development–and increasing formalization and rigor–of logic contributed to the development of computer science and the first computing devices. Along the way, Dixon makes note of the contributions–direct and indirect–of: Claude Shannon, Alan Turing, George Boole, Euclid, Rene Descartes, Gottlob Frege, David Hilbert, Gottfried Leibniz, Bertrand Russell, Alfred Whitehead, Alonzo Church, and John Von Neumann. This potted history is exceedingly familiar to students of the foundations of computer science–a demographic that includes computer scientists, philosophers, and mathematical logicians–but presumably that is not the audience that Dixon is writing for; those students might wonder why Augustus De Morgan and Charles Peirce do not feature in it. Given this temporally extended history, with its many contributors and their diverse contributions, why does the article carry the headline “How Aristotle Created the Computer”? Aristotle did not create the computer or anything like it; he did make important contributions to a fledgling field, which took several more centuries to develop into maturity. (The contributions to this field by logicians and systems of logic of alternative philosophical traditions like the Indian one are, as per usual, studiously ignored in Dixon’s history.) And as a philosopher, I cannot resist asking, “What do you mean by ‘created’?” What counts as ‘creating’?
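(An aside, for readers who want the connection between logic and actual machines made concrete: the sketch below, in Python, illustrates the idea on which Dixon’s history turns, namely that Boolean connectives of the kind Boole formalized, and which Claude Shannon later showed could be realized by switching circuits, already suffice to build arithmetic. The function names and the bit-list representation are mine, invented purely for illustration; they come from no particular text.)

```python
# Illustrative sketch only: Boolean connectives suffice to build binary addition.
# Names and representation are invented for this example.

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit values using only Boolean operations."""
    return (a != b), (a and b)   # sum bit (XOR), carry bit (AND)

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """Chain two half-adders so an incoming carry can be absorbed."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, (c1 or c2)

def add_bits(x: list[bool], y: list[bool]) -> list[bool]:
    """Ripple-carry addition of two equal-length bit lists, least significant bit first."""
    result, carry = [], False
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 3 + 1 = 4: [True, True] is binary 11 (least significant bit first), [True, False] is 01.
print(add_bits([True, True], [True, False]))   # [False, False, True], i.e. 100 in binary
```

The toy makes only a modest point: the step from logical connectives to a computing primitive is short, and the credit for seeing and formalizing that step is exactly what Dixon’s history distributes across many thinkers and many centuries.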

The easy answer is that it is clickbait. Fair enough. We are by now used to the idiocy of the misleading clickbait headline, one designed to ‘attract’ more readers by making it more ‘interesting’; authors very often have little choice in this matter, and have to watch helplessly as hit-hungry editors mangle the impact of the actual content of their work. (As in this case?) But it is worth noting this headline’s contribution to the pernicious notion of the ‘creation’ of the computer and to the idea that it is possible to isolate a singular figure as its creator–a clear hangover of a religious sentiment that things that exist must have creation points, ‘beginnings,’ and creators. It is yet another contribution to the continued mistaken recounting of the history of science as a story of ‘towering figures.’ (Incidentally, I do not agree with Dixon that the history of computers is “better understood as a history of ideas”; that history is, instead, an integral component of the history of computing in general, which also includes a social history and an economic one; telling a history of computing as a history of objects is a perfectly reasonable thing to do when we remember that actual, functioning computers are physical instantiations of abstract notions of computation.)

To end on a positive note, here are some alternative headlines: “Philosophy and Mathematics’ Contributions To The Development of Computing”; “How Philosophers and Mathematicians Helped Bring Us Computers”; or “How Philosophical Thinking Makes The Computer Possible.” None of these are as ‘sexy’ as the original headline, but they are far more informative and accurate.

Note: What do you think of my clickbaity headline for this post?

Bertrand Russell On Deterrence By Making ‘Freedom More Pleasant’

In ‘What I Believe,’ an essay whose content–selectively quoted–was instrumental in his having his appointment at the City College of New York revoked¹, Bertrand Russell wrote:

One other respect in which our society suffers from the theological conception of ‘sin’ is the treatment of criminals. The view that criminals are ‘wicked’ and ‘deserve’ punishment is not one which a rational morality can support….The vindictive feeling called ‘moral indignation’ is merely a form of cruelty. Suffering to the criminal can never be justified by the notion of vindictive punishment. If education combined with kindness is equally effective, it is to be preferred; still more is it to be preferred if it is more effective….the prevention of crime and the punishment of crime are two different questions; the object of causing pain to the criminal is presumably deterrent. If prisons were so humanized that a prisoner got a good education for nothing, people might commit crimes in order to qualify for entrance. No doubt prison must be less pleasant than freedom; but the best way to secure this result is to make freedom more pleasant than it sometimes is at present.

Russell was a logician, so he could not resist making a simple logical point here: if you want prison to represent an uncomfortable alternative to ‘the world outside’ that constitutes an effective deterrent to crime, you have two choices: make prison conditions much worse, or make the state of ‘the world outside’ much better. Our reactions to the world we encounter rely on contrasts and conditioning; it took a princess used to the utter luxury of royal palaces to find the pea under the pile of mattresses unbearable; the parched wanderer in the desert finds the brackish water of a dusty oasis the sweetest nectar of all. It is not inconceivable that many who are used to endemic and grinding poverty, hunger, and violence might find prison not such a bad alternative, and find that its supposed terrors, when viewed from afar, are entirely lacking in deterrent effect. (That sad old saw about criminals committing crimes in order to get three square meals and a roof over their heads perhaps bears repeating here.)

Unsurprisingly, the vindictive and retributive mentality of societies informed at heart by the “theological conception of ‘sin’,” entirely unconcerned with the actual and effective amelioration of social ills, chooses the former of the options listed above. Moreover, the emphasis on retribution acts as a powerful distraction from clear thinking on what might have made criminals act the way they did–perhaps if ‘the world outside’ were improved, some of the causal chains leading to the commission of crime could be disrupted.

Note 1: The details of this shameful scandal and its gross violation of academic freedom are still worth reading after all these years (especially because, as the Steven Salaita affair reminds us, academic freedom remains under assault). Paul Edwards’ ‘Appendix’ in Why I Am Not A Christian (Allen and Unwin, New York, 1957) contains the sordid and infuriating details. Edwards’ essay is in turn based on The Bertrand Russell Case (eds. Horace Kallen and John Dewey, Viking Press, 1941).

Yertle The Turtle, Cosmological Anti-Foundationalism, And Political Change

Bertrand Russell and William James were informed, in rather arch fashion, we are told, that the solution to the age-old cosmological problem was that it was turtles all the way down. Another no less distinguished philosopher, Theodor Seuss Geisel, suggested, however, that the chain of turtles, rather than extending into the deepest recesses of a cosmic infinitude, might terminate in a particular turtle, one named Mack, with a proclivity for burping at inopportune moments. Especially if you happened to be at the top of the heap, gazing fondly over the vast reaches of your empire, as Yertle, Mack’s oppressor, was. This rather simple opposition, however, does not do justice to the political and metaphysical sophistication of Geisel’s vision of this world’s orderings and the motive powers that might disturb them.

Consider that Mack does not remove himself from the bottom of the stack by shrugging, which might be considered the correct way to displace from one’s shoulders the weight of an oppressive, individuality-destroying hierarchy (as Ayn Rand so memorably suggested in, er, Atlas Shrugged); rather, he burps. Why does Geisel choose the emission of the burp as the prime stack-toppling agent?

A burp is, as Wikipedia informs us, “the release of gas from the digestive tract (mainly esophagus and stomach) through the mouth. It is usually accompanied with a typical sound and, at times, an odor.” As the mentions of “gas,” “mouth,” “digestive,” “stomach,” and “odor” suggest, a burp is a lowly, bodily thing. There is little reason in it; no premeditation, no planning, no ratiocination; it is irredeemably sordid, with nary a hint of the sublime. It is all praxis and no theory. It is pure, elemental physicality, a surge of bodily power, a summoning up from the primeval depths of forces beyond our control. The murk stirs at the bottom of the pit, double bubble toil and trouble, and a gaseous charge is emitted, racing upwards for release at our oral orifice. (Modesty forbids me mention the alternate passageways that may be followed by such vaporous emissions when they race downwards instead.) The burp is the roiling forces of the Id, the dark Unconscious, made palpable and manifest. Especially to our olfactory and aural senses.

Thus does Geisel suggest that the greatest manifestations of power, the most towering reaches of human vanity, arrogance, and hubris, which seek to place themselves beyond the reach of grubby, earthly powers, will be displaced not by the high winds that blow at the summits, but by forces that reach up from below, from the deepest depths, from those most repressed, those that are the least visible. The political lesson here is clear and stark: do not expect change to come from the top; it must come from below, from those who are the most reviled, the most oppressed, the ones whose voices are all too easily ignored and shouted over. And when they do rise up, they will not do so in fancy, prettified ways; they will revel in the ugliness that was always ascribed to them. It will be their greatest weapon.

The Philosophical Education Of Scientists

Yesterday, in my Twentieth Century Philosophy class, we worked our way through Bertrand Russell’s essay on “Appearance and Reality” (excerpted, along with “The Value of Philosophy” and “Knowledge by Acquaintance and Knowledge by Description,” from Russell’s ‘popular’ work The Problems of Philosophy). I introduced the class to Russell’s notion of physical objects being inferences from sense-data, and then went on to his discussions of idealism, materialism, and realism as metaphysical responses to the epistemological problems created by such an understanding of objects. This discussion led to the epistemological stances–rationalism and empiricism–that these metaphysical positions might generate. (There was also a digression into the distinction between necessary and contingent truths.)

At one point, shortly after I had made a statement to the effect that science could be seen as informed by materialist, realist, and empiricist conceptions of its metaphysical and epistemological presuppositions, I blurted out, “Really, scientists who think philosophy is useless and irrelevant to their work are stupid and ungrateful.”  This was an embarrassingly intemperate remark to have made in a classroom, and sure enough, it provoked some amused twittering from my students, waking up many who were only paying partial attention at that time to my ramblings.

While I always welcome approving responses from my students to my usual lame attempts at humor, my remark was too harshly phrased. But I don’t think it is false in at least one sense. Too many scientists remain ignorant of the philosophical presuppositions of their enterprise, and are not only proud of this ignorance, but bristle when reminded of those presuppositions. Too many think it useless to examine claims of scientific knowledge for their foundations; too many assume metaphysics and physics don’t mix. And all too many seem to consider their scientific credentials burnished by a withering attack on the intellectual competence of philosophers and the intellectual sterility of their work. Of course, many will do so by making a philosophical argument of some sort: perhaps that philosophical questioning of the foundations of science is, in principle, irrelevant to scientific practice.

I get some of the scientists’ impatience. Who likes pedantry and hair-splitting? And yes, many philosophers are embarrassingly ignorant about actual scientific theory and practice. But I do not get most of that impatience. I wonder: did these scientists never take a class on the history of science? Do they never study the process by which theories come to be advanced, challenged, modified, rejected, formed anew?

I have long advocated–not in any particular public forum, but in some private conversations–that the Philosophy of Science class taught by philosophy departments should really be a History and Philosophy of Science class. You can’t study the history of science without ‘doing’ the philosophy of science, and you can’t study the philosophy of science without knowing something about its history. One can only hope that those who study science with an eye to becoming its practitioners would at least be exposed to a similar curricular requirement. (I made a similar point in a post that was triggered by the Lawrence Krauss-David Albert dispute a while ago.)

Incidentally, I’m genuinely curious: Is it just me or does it seem that this kind of ‘scientific’ rejection of the philosophical enterprise is a modern–i.e., late twentieth-century onwards–disease?

Bertrand Russell On Toddlers, The ‘Little Devils’

In ‘The Superior Virtue of the Oppressed’ (Unpopular Essays, 1950; Routledge Classics 2009, pp. 60-61), Bertrand Russell writes,

Children, after being limbs of Satan in traditional theology and mystically illuminated angels in the minds of education reformers, have reverted to being little devils–not theological demons inspired by the Evil One, but scientific Freudian abominations inspired by the Unconscious. They are, it must be said, far more wicked than they were in the diatribes of the monks; they display, in modern textbooks, an ingenuity and persistence in sinful imaginings to which in the past there was nothing comparable except St. Anthony.  [link added]

Lord Russell is here inclined to be skeptical of the notion of the ‘innocent monster’ that is suggested to us by the Freudian notion of the child being all Id and nothing but the Id–with no regulation by the Ego or the Superego–but I wonder if that was because he had little experience with toddlers, especially two-year-olds. (Russell had four children–two sons and two daughters–but I cannot recall if he spent much time rearing them.)

The ‘terrible twos’ is a modern child-rearing cliché; prospective parents are warned about it–with bloodcurdling tales–by those who have passed through its terrible gauntlet. My wife and I are almost there, for our daughter is almost two, but I’m inclined to think the Terror began a little earlier, around the eighteen-month mark. By then, our daughter had grown, and her increasing physical maturity brought in its wake many interesting embellishments of important behavioral patterns.

Her crying, for instance, became louder and lustier, reaching impressive decibel levels capable of alarming neighbors; she could now strike and scratch out with greater vigor; she could buck and convulse her body with greater force (one such bucking escapade, prompted by her reluctance to be changed out of her night-clothes–or perhaps it was a diaper change–resulted in her headbutting my wife and cutting her lip); and, of course, she had learned to say ‘no’ loudly and emphatically (and endlessly) to just about everything (including that perennially popular target of rejection, life-sustaining and growth-producing food).

My wife is far more patient and understanding, far more possessed of forbearance, than I. So it is with some wonder and considerable respect that I observe her interactions with my daughter, as she skilfully and gracefully negotiates the temperamental meltdowns that often occur these days. In contrast, all too often, I have to walk away from an encounter with my child, alarmed and apprehensive at the thought that I might be approaching an explosive outer expression of my inner feelings.

I should not overstate the monstrous aspects of my daughter, of course. She continues to amaze and astonish us every day; she is learning new words all the time; she has learned some habits that I hope will persist into her adult life (like sitting in her play space by herself, ‘reading’ her many books); and in her dealings with other toddlers, she is, by and large, not an aggressor or ‘snatcher.’

As I noted here a while ago, she will continue to change and acquire new identities; there will be a point in the not-so-distant future when we will look back, with the usual selective nostalgia, at even this often-trying stage of her continuing development.

 

Don’t Tell Me What You Think of Me

Over at the Anxiety blog at The New York Times, Tim Kreider gives voice to a common fear, that of finding out what other people really, really think of us:

I’ve often thought that the single most devastating cyberattack a diabolical and anarchic mind could design would not be on the military or financial sector but simply to simultaneously make every e-mail and text ever sent universally public. It would be like suddenly subtracting the strong nuclear force from the universe; the fabric of society would instantly evaporate, every marriage, friendship and business partnership dissolved. Civilization, which is held together by a fragile web of tactful phrasing, polite omissions and white lies, would collapse in an apocalypse of bitter recriminations and weeping, breakups and fistfights, divorces and bankruptcies, scandals and resignations, blood feuds, litigation, wholesale slaughter in the streets and lingering ill will…. Hearing other people’s uncensored opinions of you is an unpleasant reminder that you’re just another person in the world, and everyone else does not always view you in the forgiving light that you hope they do, making all allowances, always on your side. There’s something existentially alarming about finding out how little room we occupy, and how little allegiance we command, in other people’s heads.

Kreider is on the money here, of course. The thought of finding out how others refer to us in our absence, how even those who have most cause to adore us still do not do so unreservedly, is enough to fill any reasonable human’s heart with dread. That terror generally finds its grounding both in an overly optimistic assessment of our worth and in an unrealistic desire not to rest content till we have attained a suitably high position in the ranking of the ‘rest.’ As Bertrand Russell noted in opening his chapter on ‘Fear of Public Opinion’ in The Conquest of Happiness:

Very few people can be happy unless on the whole their way of life and their outlook on the world is approved by those with whom they have social relations, and more especially with whom they live.

As a personally memorable instance of a variant of this behavior, after I received a teaching evaluation in which twenty-four out of twenty-five students answered ‘Yes’ to the question ‘Would you recommend this instructor to other students?’, I spent a considerable amount of time during my next lecture tormenting myself by wondering about the identity of the exception to the rule. (And like Kreider, I’ve read emails not meant for my eyes in which friends of mine have expressed considerably unflattering opinions of me; some of those people are still my friends.)

I have long tried to insulate myself from the disappointment of the discovery that I’m not universally adored and the crushing horror of universal loathing by ceaseless repetition of the mantra that no one quite likes or dislikes me as much as I might imagine. This reminder of the ‘golden mean‘ of public opinion only has limited effectiveness; like the folks I refer to above, I retreat a little too easily into delusional comforts.

Addendum 6/24/2013: A discussion with David Post on Facebook suggests to me that my use of ‘reasonable’ in ‘…any reasonable human’s heart…’ is confusing. I’m going to leave it up there, but it really should just read ‘…human’s heart…’.