Talking About Natural Law With Children

Last Thursday, thanks to New York City public schools taking a ‘mid-winter break,’ my daughter accompanied me to Brooklyn College and sat in on two classes. My students, as might be expected, were friendly and welcoming; my daughter, for her part, conducted herself exceedingly well, taking a seat and occupying herself by drawing on a piece of paper and, often, just paying attention to the class discussion. She did not interrupt me even once; I only had to ask her to pipe down a bit when she began humming a little ditty to herself. After the second class–philosophy of law, which featured a discussion of St. Thomas Aquinas and natural law theory–had ended, I asked her what she thought the class was about. She replied, “It was about good and bad.” This was a pretty good answer, but things got better the next day.

On Friday, as we drove to the gym for my workout and my daughter’s climbing session, I picked up the conversation again, asking her what she had made of the class discussion and whether she had found it interesting. She said she had; so I pressed on, and the following conversation resulted:

“Let me ask you something. Would you always obey the law?”

“Yes.”

“What if the law told you to do something bad?”

“I would do it.”

“Why? Why would you do something bad?”

“Because I don’t want to go to jail.”

“You know, I’ve been to jail twice. For breaking the law.”

“Why?”

“Well, one time, I was angry with one country for attacking people and dropping bombs on them, so I went to their embassy and protested by lying down on the street. When the police told me to move, I didn’t, and so they arrested me and put me in jail for a day. Another time, I protested our university not paying the teachers enough money for their work, and I was arrested again for protesting in the same way.” [Strictly speaking this is a bad example of civil disobedience; I wasn’t breaking a law I thought unjust, rather, I was breaking a law to make a point about the unjustness of other actions.]

“Did they feed you in jail?”

“Yes, they did.”

“Oh, that’s good.”

“Well, so what do you think? Would you break the law if it told you to do something bad?”

“No.”

“Why not? The law is asking you to do something bad.”

“What if I was wrong?”

“What do you mean?”

“What if I was wrong, and it wasn’t bad, and the policeman put me in jail?”

“What if you were sure that you were being asked to do something bad?”

“Then I wouldn’t do it.”

“Why?”

“Because I don’t want to do bad things.”

“But isn’t breaking the law a bad thing?”

“Yes.”

“So, why are you breaking the law?”

“Because it’s asking me to do a bad thing.”

At this point, we were close to our turn-off for the gym and our parking spot, and so our conversation ended. A couple of interesting takeaways from it:

1. We see here the social construction of a legal order in the making; at the age of five, my daughter has already internalized the idea that breaking the law is a ‘bad thing’ and that bad things happen to those who break the law. She can also identify the enforcers of the law. This has already created a normative hold on her; she was inclined to obey the law even if it asked her to do something bad, because she was worried about the consequences.

2. My daughter displayed an interesting humility about her moral intuitions; she wasn’t sure whether her judgment that some act was ‘bad’ was infallible. What if she was wrong about that judgment?

Note: My reporting of the conversation above might be a little off; I’m reproducing it from memory.

Steven Pinker Should Read Some Nietzsche For Himself

Steven Pinker does not like Nietzsche. The following exchange–in an interview with the Times Literary Supplement–makes this clear:

Question: Which author (living or dead) do you think is most overrated?

Pinker: Friedrich Nietzsche. It’s easy to see why his sociopathic ravings would have inspired so many repugnant movements of the twentieth and twenty-first centuries, including fascism, Nazism, Bolshevism, the Ayn Randian fringe of libertarianism, and the American alt-Right and neo-Nazi movements today. Less easy to see is why he continues to be a darling of the academic humanities. True, he was a punchy stylist, and, as his apologists note, he extolled the individual superman rather than a master race. But as Bertrand Russell pointed out in A History of Western Philosophy, the intellectual content is slim: it “might be stated more simply and honestly in the one sentence: ‘I wish I had lived in the Athens of Pericles or the Florence of the Medici’.”

The answers that Pinker seeks–in response to his plaintive query–are staring him right in the face. To wit, ‘we’ study Nietzsche with great interest because:

1. If indeed it is true that Nietzsche’s ‘ravings…inspired so many repugnant movements’–and these ‘movements’ have not been without considerable import–then surely we owe it to ourselves to read him and find out why they did so. Pinker thinks it is ‘easy to see why,’ but surely he would not begrudge students reading Nietzsche for themselves to find out why? Moreover, Nietzsche served as the inspiration for a great deal of twentieth-century literature too–Thomas Mann is but one of the many authors to be so influenced. These connections are worth exploring as well.

2. As Pinker notes with some understatement, Nietzsche was a ‘punchy stylist.’ (I mean, that is like saying Muhammad Ali was a decent boxer, but let’s let that pass for a second.) Well, folks in the humanities–in departments like philosophy, comparative literature, and others–often study things like style, rhetoric, and argumentation; they might be interested in seeing how these are employed to produce the ‘sociopathic ravings’ that have had such an impact on our times. Moreover, Nietzsche’s writings employ many different literary styles; the study of those is also of interest.

3. Again, as Pinker notes, Nietzsche ‘extolled the individual superman rather than a master race,’ which then prompts the question of why the Nazis were nonetheless able to co-opt him in some measure. This is a question of historical, philosophical, and cultural interest, precisely the kind of thing folks in humanities departments like to study. And if Nietzsche did develop some theory of the “individual superman,” what was it? The humanities are surely interested in that topic too.

4. Lastly, for the sake of his credibility, Pinker should find a more serious history of philosophy than Bertrand Russell‘s A History of Western Philosophy, which is good only as a light read–it was written very quickly, as a popular work for purely commercial purposes, and was widely reviled in its time for its sloppy history. There is some good entertainment in there, but a serious introduction to the philosophers it covers can only begin with their own texts. If Pinker wants to concentrate on secondary texts, he can read Frederick Copleston‘s Friedrich Nietzsche: Philosopher of Culture; this work, written by a man largely unsympathetic to Nietzsche’s views, one who indeed found them morally repugnant, still treats them as worthy of serious consideration and analysis. So much so that Copleston thought it worthwhile to write a book about them. Maybe Pinker should confront some primary texts himself. He might understand the twentieth century better.

Dear ‘Fellow’ Indians, Please Spell My Fucking Name Correctly

It’s ‘Samir,’ not ‘Sameer.’ That, really, should be enough. Here is the correct spelling of someone’s name; please abide by it. But Indians will simply not comply. I’m a middle-aged man, about to hit fifty-one in a few weeks’ time, and my entire life, Indians have been systematically misspelling and butchering my name with this horrendous orthography. All are equally guilty: strangers, family, and friends. I can excuse those who have only heard my name and written to me–for, after all, the pronunciation of ‘Samir’ is ‘Sameer,’ and for those used to spelling phonetically, this might suggest itself as a plausible spelling. But what excuse do those have who have seen my name in print, who indeed are corresponding with me by email and have seen my name in the message header? Or, even worse, what excuse do members of my family and my many friends of many years have, who continue to misspell my name? Some of these folks have known me for over thirty years, some for over twenty. The prize must go to those who begin an email correspondence with me using the correct spelling and then, a few messages later, decide they have had enough and start using ‘Sameer’ instead. On the many occasions I’ve tried to issue corrections, my pleas have been greeted with some bemusement, and never have I been granted the courtesy of a simple mea culpa.

‘Samir’ is, of course, a common name in the Arab world (especially, I believe, in Egypt, Lebanon, and Palestine). There, it means “jovial, loyal or charming companion.” (I’ve rarely had this description applied to me.) The Arabic spelling is (سمير); the English spelling is as indicated (and preferred by me). In India, where it means “gust of wind or gentle breeze”–though my friends prefer to think of me as “hot air”–the Hindi spelling is (समीर), while both ‘Samir’ and ‘Sameer’ are used as English spellings. That is, in India, the spelling ‘Samir’ is not unknown, though perhaps just a little less common than ‘Sameer.’ To reiterate, Indians simply have no excuse for their misspelling of my name.

Americans cannot pronounce my name correctly; I’ve slowly grown used to this frustrating state of affairs, in which I’m referred to as ‘Shamir,’ ‘Smear,’ ‘Sameyer,’ and so on. (Pride of place, though, must go to the Irish lad who called me ‘Izmir.’ No, no, call me Ishmael. Please. It shares more vowels with my name.) I suppose it’s the price an immigrant must pay: lose your ‘homeland,’ lose your name, and so on. I’ll deal with it. (Though it will remain a mystery to me that people capable of mastering the pronunciation of ‘Arkansas’ and ‘Massachusetts’ cannot flex their linguistic muscles for a much simpler word; perhaps my ‘foreignness’ trips up their tongues.) With one rare, recent exception, Americans don’t misspell my name; once they see it in print, they spell it correctly. Indians pronounce my name correctly; how could they not? But they can’t spell it. I wonder if those Indian kids who win the spelling bees in the US year after year could pull it off. Or perhaps their parents’ sins have been visited on them, and they, too, would mangle my name.

I will make sure, in my will, to include the provision that no Indian should be allowed anywhere near the writing of my epitaph; I have no faith they will get the spelling right.

US Elections Invite External Intervention, As They Well Might

The Robert Mueller indictment of thirteen Russians for ‘interfering’ in the American elections of 2016 confirms the bad news: those elections were ‘influenced’–in some shape or form–by non-Americans. The extent of this ‘influence’ is unclear–whether it decisively swung the election to Donald Trump or not–but be that as it may, one fact remains established: among the various forces aiming to influence American voters’ minds as they exercised their electoral franchise were non-American ones. It is unclear whether the Internet Research Agency coordinated with the Kremlin or with the Trump campaign, but it did ‘participate’ in the American electoral process.

One might well ask: why not? The entire world looks on with bated breath as an American president is elected; some wonder whether their country will benefit from US largess, yet others whether they will need to scurry for cover as cruise missiles, drones, and aircraft carriers are sent their way. Russians are not immune to such concern; they, like many of the world’s citizens, are keen to see their national interests protected by the new US administration. They too have favorites: they would rather see one candidate elected than another. This is as true for American ‘friends’ as it is for ‘foes,’ precisely because those nations, too, have varied interests and inclinations, which line up in varied and interesting ways behind different American candidates. Those ‘interests and inclinations,’ too, jostle for representation in American elections.

The US involves and implicates itself in the affairs of many sovereign nations; it places conditions on the aid it sends them; it, too, is interested in who gets elected where (or who comes to power through a coup); the American record of influencing elections and the choice of political leaders and administrations the world over is well known. (Consider just Iraq and Iran as examples.) The US cannot reasonably expect that such involvement and implication will remain unilateral; it especially cannot expect that the rest of the world will not express its interest in American elections by attempting to influence American voters’ choices. For instance, it is not at all unreasonable to expect that leading newspapers like the Guardian or Der Spiegel might write editorials endorsing particular American candidates and expressing sentiments like “We hope the American people will elect X; X‘s policies speak to the establishment of world peace, something that we here in country Y are most eager for.”

American elections have, by virtue of their increased prominence in the American political calendar, also become worldwide entertainment events; they invite punters to lay bets; they drive up the ratings of television stations and websites worldwide on the nights of the presidential debates and the election results. Americans are proud of this: look, the whole world is watching as we elect our leaders. Well, those folks want to participate too; they know the people getting elected could make them lose their jobs, or worse, their lives. American election campaigns are conducted on the Internet, a global platform for communication and information transfer. This invites participation of a kind not possible in yesteryear, when non-Americans could only look on from afar as Americans debated among themselves about whom to vote for; now, on Facebook and Twitter and many other internet forums, those same folks can converse with Americans and participate in the American electoral process. Americans are used to this kind of participation and influence on an informal basis: our European and South American and Asian and African friends often exclaim loudly how they hope we will elect X, not Y.

A global player, one as powerful and important as the US, one used to ‘participating’ in the affairs of the world, invites a corresponding participation in its policies; the world has long thought it would be nice to have a say in electing the American president, because of the reach and extent of American power. With American elections now ‘opened’ to the world–thanks to the Internet–that participation has begun.

Gide’s Immoralist And The Existential Necessity Of The Colony

The immoralist at the heart of André Gide‘s The Immoralist, Michel, does not travel just anywhere; he travels to French colonies like Algeria and Tunisia. The boys whom he meets, is attracted to, and falls in love with are not just any boys; they are Muslim Arab boys. He is old; they are young. He is white; they are brown. He is sick and tubercular; they are young and exuberant, bursting at the seams with health and vitality. Their blood is redder and flows more freely; Michel’s blood is black, and hideous, and disgusting. He is diseased, but as he spends time among his new companions, whose bodies and nakedness underneath their clothes he cannot take his eyes off, his health improves, and he begins to describe the arc of a journey to greater health and well-being, away from disease; he begins a journey from flirting with death to welcoming life in all its fullness. The language that Gide uses to describe Michel’s journey or passage is richly symbolic and metaphorical, and invites multiple interpretations, mingling, as it does, these descriptions of the physical with those of the mental, so that we are tempted to see Michel’s journey from bad to good health as his journey from being ‘a lost soul’ to being ‘a found self’; that much is straightforward.

But why place this journey in colonized lands? Why make the vehicles of Michel’s transformation and self-discovery the colonized, the subjugated, the colonial subject? For one, we can see the colonizer use both the land and the peoples of the colony as his experiential space for self-discovery; this becomes one more of the services or functions that the colonized provides; besides markets, the colony provides an avenue and domain for self-construction; it becomes one more of the means by which the colonizer comes to realize himself. Because the colonized inhabits a world in which the colonizer has been, as it were, ‘marketed,’ Michel finds in the colonies, and in the gaze of the colonial subject, one component of his identity: how a Frenchman is understood by those he has colonized. If the colonial identity is an indissoluble part of what it means to be a Frenchman in the twentieth century, then Michel has done the right thing by traveling to a French colony; it is there that he will find out what a Frenchman truly is.

But this salvation need not be individual; all of French culture and Western civilization may be redeemed in the colonies; it is there that a decadent, dying civilization looks to be revitalized, to be, quite literally, brought back to life. French and Western civilization has become old and tubercular; its blood is polluted. The Muslim Arab world is younger, even if immature; it promises a new vision of life to a culture on its deathbed and drags it back from its flirtation with death.

The colony is a material and spiritual and existential necessity; it extends the life of the colonizer; the journey to a new form of life for the colonizer begins there.

Neuroscience’s Inference Problem And The Perils Of Scientific Reduction

In “Science’s Inference Problem: When Data Doesn’t Mean What We Think It Does,” his review of Jerome Kagan‘s Five Constraints on Predicting Behavior, James Ryerson writes:

Perhaps the most difficult challenge Kagan describes is the mismatching of the respective concepts and terminologies of brain science and psychology. Because neuroscientists lack a “rich biological vocabulary” for the variety of brain states, they can be tempted to borrow correlates from psychology before they have shown there is in fact a correlation. On the psychology side, many concepts can be faddish or otherwise short-lived, which should make you skeptical that today’s psychological terms will “map neatly” onto information about the brain. If fMRI machines had been available a century ago, Kagan points out, we would have been searching for the neurological basis of Pavlov’s “freedom reflex” or Freud’s “oral stage” of development, no doubt in vain. Why should we be any more confident that today’s psychological concepts will prove any better at cutting nature at the joints?

In a review of Theory and Method in the Neurosciences (Peter K. Machamer, Rick Grush, Peter McLaughlin (eds), University of Pittsburgh Press, 2001), I made note¹ of related epistemological concerns:

When experiments are carried out, neuroscientists continue to run into problems. The level of experimental control available to practitioners in other sciences is simply not available to them, and the theorising that results often seems to be on shaky ground….The localisation techniques that are amongst the most common in neuroscience rely on experimental methods such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and magnetoencephelography (MEG). [In PET] a radioactive tracer consisting of labelled water or glucose analogue molecules is injected into a subject, who is then asked to perform a cognitive task under controlled conditions. The tracer decays and emits positrons and gamma rays that increase the blood flow or glucose metabolism in an area of the brain. It is now assumed that this area is responsible for the cognitive function performed by the subject. The problem with this assumption, of course, is that the increased blood flow might occur in one area, and the relevant neural activity might occur in another, or in no particular area at all….this form of investigation, rather than pointing to the modularity and functional decomposability of the brain, merely assumes it.

The fundamental problem–implicit and explicit in Kagan’s book and in my little note above–is the urge to ‘reduce’ psychology to neuroscience, to reduce mind to brain, to eliminate psychological explanations and language in favor of neuroscientific ones, which would introduce precise scientific language in place of imprecise psychological descriptions. This urge to eliminate one level of explanation in favor of a ‘better, lower, more basic, more fundamental’ one is, to put it bluntly, scientistic hubris, and the various challenges Kagan outlines in his book bear out the foolishness of this enterprise. It results in explanations and theories that rest on unstable foundations: optimistic correlations and glib assumptions are the least of it. Worst of all, it contributes to a blindness: what is visible at the level of psychology is not visible at the level of neuroscience. Knowledge should enlighten us, not render us myopic.
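To make the localisation worry in the passage quoted above concrete, here is a minimal toy sketch in Python; every region label and number in it is invented for illustration, and it models only the logic of the subtraction inference, not PET or fMRI physics. By construction, the ‘neural activity’ is stipulated to sit in one region while the measured signal rises in another; the standard analysis still confidently ‘localises’ the function.

# Toy sketch of the subtraction inference (all labels and numbers invented).
# The analysis attributes a cognitive function to whichever region shows the
# largest task-related signal change, even though the stipulated neural
# activity lies elsewhere.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_trials = 10, 200
neural_site, signal_site = 3, 7        # stipulated activity vs. where the signal rises

task = rng.integers(0, 2, n_trials)    # 1 = task condition, 0 = rest
signal = rng.normal(size=(n_trials, n_regions))
signal[:, signal_site] += 2.0 * task   # measured signal tracks the task, but elsewhere

# Subtraction analysis: mean signal under task minus mean signal at rest, per region
contrast = signal[task == 1].mean(axis=0) - signal[task == 0].mean(axis=0)
print("function 'localised' to region", int(np.argmax(contrast)))   # region 7
print("stipulated neural activity was in region", neural_site)      # region 3

The point is the one made above: the subtraction method assumes that where the measured signal rises is where the relevant processing happens; the toy example merely makes that assumption visible by violating it.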

Note: In Metascience, 11(1): March 2002.