Dear ‘Fellow’ Indians, Please Spell My Fucking Name Correctly

It’s ‘Samir’, not ‘Sameer.’ That, really, should be enough. Here is the correct spelling of someone’s name; please abide by it. But Indians will simply not comply. I’m a middle-aged man, about to hit fifty-one in a few weeks’ time, and my entire life, Indians have been systematically misspelling and butchering my name with this horrendous orthography. All are equally guilty: strangers, family, and friends. I can excuse those who have only heard my name and written to me–for after all, the pronunciation of ‘Samir’ is ‘Sameer’ and for those used to spelling phonetically, this might suggest itself as a plausible spelling. But what excuse do those have who have seen my name in print, who indeed are corresponding with me by email and have seen my name in the message header? Or even worse, what excuse do members of my family and my many friends of many years have, who continue to misspell my name? Some of these folks have known me for over thirty years, some for over twenty. The prize must go to those who begin an email correspondence with me using the correct spelling and then, a few messages later, decide they have had enough and start using ‘Sameer’ instead. On the many occasions I’ve tried to issue corrections, my pleas have been greeted with some bemusement, and never have I been granted the courtesy of a simple mea culpa.

‘Samir’ is, of course, a common name in the Arab world (especially, I believe, in Egypt, Lebanon, and Palestine). There, it means “jovial, loyal or charming companion.” (I’ve rarely had this description applied to me.) The Arabic spelling is (سمير); the English spelling is as indicated (and preferred by me). In India, where it means “gust of wind or gentle breeze”–though my friends prefer to think of me as “hot air”–the Hindi spelling is (समीर), while both ‘Samir’ and ‘Sameer’ are used as English spellings. That is, in India, the spelling ‘Samir’ is not unknown, though perhaps just a little less common than ‘Sameer.’ To reiterate, Indians simply have no excuse for their misspelling of my name.

Americans cannot pronounce my name correctly; I’ve slowly grown used to this frustrating state of affairs where I’m referred to as ‘Shamir,’ ‘Smear,’ ‘Sameyer’ and so on. (Pride of place, though, must go to the Irish lad who called me ‘Izmir.’ No, no, call me Ishmael. Please. It shares more vowels with my name.) I suppose it’s the price that an immigrant must pay: lose your ‘homeland,’ lose your name, and so on. I’ll deal with it. (Though it will remain a mystery to me that people capable of mastering the pronunciation of ‘Arkansas’ and ‘Massachusetts’ cannot flex their linguistic muscles for a much simpler word; perhaps my ‘foreignness’ trips up their tongues.) With one rare, recent exception, Americans don’t misspell my name; once they see my name in print, they spell it correctly. Indians pronounce my name correctly; how could they not? But they can’t spell it. I wonder if those Indian kids who win the spelling bees year after year in the US could pull it off. Or perhaps their parents’ sins have been visited on them, and they, too, would mangle my name.

I will make sure, in my will, to include the provision that no Indian should be allowed anywhere near the writing of my epitaph; I have no faith they will get the spelling right.

US Elections Invite External Intervention, As They Well Might

The Robert Mueller indictment of thirteen Russians for ‘interfering’ in the American elections of 2016 confirms the bad news: those elections were ‘influenced’–in some shape or form–by non-Americans. The extent of this ‘influence’ is unclear–whether it decisively swung the election to Donald Trump or not–but be that as it may, one fact remains established: among the various forces aiming to influence American voters’ minds as they exercised their electoral franchise were non-American ones. It is unclear whether the Russian Internet Research Agency coordinated with the Kremlin or with the Trump campaign, but it did ‘participate’ in the American electoral process.

One might well ask: why not? The entire world looks on with bated breath as an American president is elected; some wonder whether their country will benefit from US largess, yet others whether they will need to scurry for cover as cruise missiles, drones, and aircraft carriers are sent their way. Russians are not immune to such concern; they, like many of the world’s citizens, are keen to see their national interests protected by the new US administration. They too have favorites: they would rather see one candidate elected than another. This is as true for American ‘friends’ as it is for ‘foes,’ precisely because those nations, too, have varied interests and inclinations, which line up in varied and interesting ways behind different American candidates. Those ‘interests and inclinations,’ too, jostle for representation in the American elections.

The US involves and implicates itself in the affairs of many sovereign nations; it places conditions on the aid it sends them; it, too, is interested in who gets elected and where (or who comes to power through a coup); the American record of influencing elections and the choice of political leaders and administrations the world over is well known. (Consider just Iraq and Iran as examples.) The US cannot reasonably expect that such involvement and implication will remain unilateral; it especially cannot expect that the rest of the world will not express its interest in American elections by attempting to influence American voters’ choices. For instance, it is not at all unreasonable to expect that leading newspapers like the Guardian or Der Spiegel might write editorials endorsing particular American candidates and expressing sentiments like “We hope the American people will elect X; X’s policies speak to the establishment of world peace, something that we here in country Y are most eager for.”

American elections have, by virtue of their increased prominence in the American political calendar, also become worldwide entertainment events; they invite punters to lay bets; they drive up the ratings of many television stations and websites–worldwide–on the nights of the presidential debates and the election results. Americans are proud of this: look, the whole world is watching as we elect our leaders. Well, those folks want to participate too; they know the folks getting elected could make them lose their jobs, or worse, their lives. American election campaigns are conducted on the Internet, a global platform for communication and information transfer. This invites participation of a kind not possible in yesteryear, when non-Americans could only look on from afar as Americans debated among themselves about whom to vote for; now, on Facebook and Twitter and many other internet forums, those same folks can converse with Americans and participate in the American electoral process. Americans are used to this kind of participation and influence on an informal basis: our European and South American and Asian and African friends often exclaim loudly how they hope we will elect X, not Y.

A global player, one as powerful and important as the US, one used to ‘participating’ in the affairs of the world, invites a corresponding participation in its policies; the world has long thought it would be nice to have a say in electing the American president because of the reach and extent of American power. With American elections now ‘opened’ to the world thanks to the Internet, that participation has begun.

Gide’s Immoralist And The Existential Necessity Of The Colony

The immoralist at the heart of André Gide’s The Immoralist, Michel, does not travel just anywhere; he travels to French colonies like Algeria and Tunisia; the boys whom he meets, is attracted to, and falls in love with are not just any boys; they are Muslim Arab boys. He is old; they are young. He is white; they are brown. He is sick and tubercular; they are young and exuberant, bursting at the seams with health and vitality. Their blood is redder, and flows more freely; Michel’s blood is black, and hideous, and disgusting. He is diseased, but as he spends time among his new companions, whose bodies and nakedness underneath their clothes he cannot take his eyes off of, his health improves and he begins to describe the arc of a journey to greater health and well-being, away from disease; he begins a journey from flirting with death to welcoming life in all its fullness. The language that Gide uses to describe Michel’s journey or passage is richly symbolic and metaphorical, and invites multiple interpretations, mingling as it does these descriptions of the physical with those of the mental, so that we are tempted to see Michel’s journey from bad to good health as his journey from being ‘a lost soul’ to being ‘a found self’; that much is straightforward.

But why place this journey in colonized lands? Why make the vehicles of Michel’s transformation and self-discovery the colonized, the subjugated, the colonial subject? For one, we can see the colonizer use both the land and the peoples of the colony as his experiential space for self-discovery; this becomes one more of the services or functions that the colonized provides: besides markets, the colony provides an avenue and domain for self-construction; it becomes one more of the means by which the colonizer comes to realize himself. Because the colonized inhabits a world in which the colonizer has been, as it were, ‘marketed,’ Michel finds in the colonies, and in the gaze of the colonial subject, one component of his identity: how a Frenchman is understood by those he has colonized. If the colonial identity is an indissoluble part of what it meant to be a Frenchman in the twentieth century, then Michel has done the right thing by traveling to a French colony; it is there that he will find out what a Frenchman truly is.

But this salvation need not be individual; all of French culture and Western civilization may be redeemed in the colonies; it is there that a decadent, dying civilization looks to be revitalized, to be, quite literally, brought back to life. French and Western civilization has become old and tubercular; its blood is polluted. But the Muslim Arab world is younger, even if immature; it promises a new vision of life to a culture on its death-bed and drags it back from its flirtation with death.

The colony is a material and spiritual and existential necessity; it extends the life of the colonizer; the journey to a new form of life for the colonizer begins there.

Neuroscience’s Inference Problem And The Perils Of Scientific Reduction

In Science’s Inference Problem: When Data Doesn’t Mean What We Think It Does, while reviewing Jerome Kagan’s Five Constraints on Predicting Behavior, James Ryerson writes:

Perhaps the most difficult challenge Kagan describes is the mismatching of the respective concepts and terminologies of brain science and psychology. Because neuroscientists lack a “rich biological vocabulary” for the variety of brain states, they can be tempted to borrow correlates from psychology before they have shown there is in fact a correlation. On the psychology side, many concepts can be faddish or otherwise short-lived, which should make you skeptical that today’s psychological terms will “map neatly” onto information about the brain. If fMRI machines had been available a century ago, Kagan points out, we would have been searching for the neurological basis of Pavlov’s “freedom reflex” or Freud’s “oral stage” of development, no doubt in vain. Why should we be any more confident that today’s psychological concepts will prove any better at cutting nature at the joints?

In a review of Theory and Method in the Neurosciences (Peter K. Machamer, Rick Grush, Peter McLaughlin (eds), University of Pittsburgh Press, 2001), I made note¹ of related epistemological concerns:

When experiments are carried out, neuroscientists continue to run into problems. The level of experimental control available to practitioners in other sciences is simply not available to them, and the theorising that results often seems to be on shaky ground….The localisation techniques that are amongst the most common in neuroscience rely on experimental methods such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and magnetoencephalography (MEG). [In PET] a radioactive tracer consisting of labelled water or glucose analogue molecules is injected into a subject, who is then asked to perform a cognitive task under controlled conditions. The tracer accumulates where blood flow or glucose metabolism increases, and its decay emits positrons and gamma rays that can be detected. It is then assumed that the area of increased activity is responsible for the cognitive function performed by the subject. The problem with this assumption, of course, is that the increased blood flow might occur in one area, and the relevant neural activity might occur in another, or in no particular area at all….this form of investigation, rather than pointing to the modularity and functional decomposability of the brain, merely assumes it.

The fundamental problem–implicit and explicit in Kagan’s book and my little note above–is the urge to ‘reduce’ psychology to neuroscience, to reduce mind to brain, to eliminate psychological explanations and language in favor of neuroscientific ones, which will introduce precise scientific language in place of imprecise psychological descriptions. This urge to eliminate one level of explanation in favor of a ‘better, lower, more basic, more fundamental’ one is, to put it bluntly, scientistic hubris, and the various challenges Kagan outlines in his book bear out the foolishness of this enterprise. It results in explanations and theories that rest on unstable foundations: optimistic correlations and glib assumptions are the least of it. Worst of all, it contributes to a blindness: what is visible at the level of psychology is not visible at the level of neuroscience. Knowledge should enlighten, not render us myopic.

Note: In Metascience, 11(1): March 2002.

An Ode To The Semicolon

I discovered semicolons in the fall of 1992. I had asked–on a lark of sorts–to read a term paper written by my then-girlfriend, who was taking a class in literary theory at New York University. In it, I noticed a ‘new’ form of punctuation; I had seen the semicolon before, but I had not seen it pressed so artfully into service. Here and there, my girlfriend had used it to mark off clauses–sometimes two, sometimes three–within a sentence; her placement turned one sentence into two, with a pause more pronounced than that induced by a comma. The two separated clauses acquired a dignity they did not previously possess; there was now a dramatic transition from one to the other as opposed to the blurring, the running on, induced by the comma. I had not read writing like this before; it read differently; it spoke to a level of sophistication in expression that seemed aspirational to me. I immediately resolved to use the semicolon in my own writing.

And so I did; I plunged enthusiastically into the business of sprinkling semicolons over my writing; they sprang up like spring wildflowers all over my prose, academic or not. Like my girlfriend, I did not stop at a mere pair of clauses; triplets and sometimes quadruplets were common. Indeed, the more the merrier; why not just string all of them along?

Needless to say, my early enthusiasm for semicolon deployment necessitated a pair of corrections. (My girlfriend herself offered one; my ego was not too enlarged to make me reject her help.) One was to use the semicolon properly. That is, to use it as a separator only when there were in fact separate clauses to be separated, and not just when a mere comma would have sufficed. The other, obviously, was to cut down just a tad on the number of clauses I was stringing together. Truth be told, there was something exhilarating about adding on one clause after another to a rapidly growing sentence, throwing in semicolon after semicolon, watching the whole dramatic edifice take shape on the page. Many editors of mine have offered interventions in this domain; I’ve almost always disagreed with their edits when they delete semicolons I’ve inserted in my writing. To my mind, their revised versions ran the clauses together too much and produced clunkier sentences in the process.

I don’t think there is any contest; the semicolon is my favorite piece of punctuation. The period is depressing; it possesses too much finality. The comma is a poser; it clutters up sentences, and very few people ever become comfortable with, or competent in, using it. (I often need to read aloud passages of prose I’ve written in order to get my comma placement right.) The colon is a little too officious. (My ascription of personalities to punctuation marks comes naturally to a synesthete like me.) The semicolon combines the best of all three, typographically and syntactically. It looks good; it works even better. What’s not to like?

A Conversation On Religious Experience

A couple of summers ago, a friend and I waited at a parking lot by Cottonwood Pass in Colorado for a ride back to Buena Vista. Bad weather had forced us off the Colorado Trail, and we now needed transportation to the nearest lodging venue. A pair of daytrippers, a middle-aged couple, appeared, walking back to their parked vehicle, done with their viewing and photography for the day. We walked over and made our request: could we please hitch a ride with them? They were not going back all the way to Buena Vista, but only to a campground nearby; would that work? We said it would; we would find another ride from there. In we hopped, and off we went.

As we drove, introductions were made; we were from Brooklyn; our benefactors were visiting from Texas, sans children and grandchildren. When asked what I ‘did,’ I said I was a professor of philosophy at the City University of New York. Intrigued, they asked which classes I taught; when I listed philosophy of religion as one of them, they asked me if I was religious. I said that I wasn’t religious in belief or practice, but was very interested in the problems and questions that arose in that domain of philosophy. I asked my newly made friends if they were religious; they both said they were devout Christians. I asked them if their faith had been a matter of being born into a devout family, and they both replied that while they had been born into churchgoing families, each had had an individual ‘experience’ that had cemented their faith, given it a new dimension, and indeed caused them to say they ‘found Christ’ only later in life–in each case, following a profound ‘crisis’ of one sort or another. When Christ ‘had come’ to them, they had ‘felt’ it immediately, and had never had any doubt ‘in their hearts’ from that point on. I listened, fascinated. I made note of the fact that I taught a section on ‘religious experiences’ in my class, and mentioned William James and St. Teresa of Avila’s theoretical and personal accounts of its phenomena.

When my new friends were done recounting the story of their journey to their faith, they asked me again if I was sure I wasn’t religious. I said that I was quite sure I had no theistic belief–even as I remained fascinated by religion as a social, cultural, psychological, and intellectual phenomenon and by the nature of religious feeling–which is why, of course, I had inquired into the nature of their religious belief and how they had come by it. In response, my friends said they were ‘relieved’ to hear of my attitude, that they frequently skirted the subject in conversation with ‘strangers’ because they didn’t want anyone to feel they were proselytizing; I assured them I didn’t think they were, and that I had found the conversation singularly illuminating.

We had driven on past the campground that was supposed to be our destination; our friends said they found our conversation worth continuing; they would drop us off in Buena Vista. Rain clouds were still threatening, so this offer was most welcome. Soon, we found ourselves on Buena Vista’s Main Street, our destination for the day. We alighted, grabbed our backpacks, posed for a photograph or two, and then bade them farewell; I asked for their names, but did not write them down, and so have forgotten them. But not that conversation; there was a warmth and openness on display that was refreshing and, in the context of the US and its endless ‘religious wars,’ genuinely novel.

Death Of A Password

Time to bid farewell to an old, dear, and familiar friend, a seven-character one whose identity was inscribed, as if by magic, on my fingertips, which flew over the keyboard to bring it to life, time and time again. The time has come for me to lay it to rest, after years and years of yeoman service as a gatekeeper and sentry sans pareil. For years it guarded my electronic stores, my digital repositories of files and email messages. It made sure no interlopers trespassed on these vital treasures, perhaps to defile and destroy, or worse, to embarrass me by firing off missives signed by me to all and sundry, invoking the wrath of the offended and displeased upon my head. Its ‘design’ was simple: the artful placement of a special character between a pair of triplet letters that served to produce a colloquial term referring to a major rock band. (Sorry for being coy, but I have hopes of resurrecting this password at some point in the future, when the madness over overly secure yet utterly useless passwords has died down.) Once devised, this password worked like magic; it was easy to remember, and I never forgot it, no matter how dire the circumstances.

Once my life became sufficiently complicated to require more than one computer account, as an increasing number of aspects of my life moved online, this password was pressed into double and, later, triple and quadruple duty: email clients, utilities billing accounts, mortgage payments, online streaming sites, and all the rest. I knew this was a security risk of sorts, but I persisted; like many other computer users, I dreaded having to learn new, increasingly complicated passwords, and of course, I was just plain lazy. And yet, I was curiously protective of my password; I never shared it with anyone, not even a cohabiting girlfriend. My resistance broke down once I got married; my life was now even more intertwined with another person, our affairs messily tangled up; we often needed access to each other’s computer accounts. And so it came to be: I shared my password with my wife. I wondered, as I wrote it down for her, whether she’d notice my little verbal trick, my little attempt to be clever. Much to my disappointment, she did not; she was all business; all she wanted was a string of letters that would let her retrieve a piece of information that she needed.

The end, when it came, was prompted by a series of mishaps and by increasingly onerous security policies: my Twitter account was hacked, and many new online accounts imposed password requirements that my old password could not meet. With some reluctance, I began adopting a series of new passwords, slowly consolidating them into a pair of alphanumeric combinations. My older password still worked, but on fewer and fewer accounts. Finally, another security breach was the last straw; I had been caught, and found wanting; the time had come to move on. So I did. But not without the odd backward glance or two, back at an older and simpler time.