Chemical Weapons and the ‘Unnecessary Roughness’ Rule

Fans of the NFL will be familiar with the unnecessary roughness rule; it’s one of those features of America’s most popular game that sometimes causes bemusement, even among those who consider themselves long-time devotees. In a game memorably described as ‘young men running around risking spinal injury’ or ‘an endless series of head-on collisions’, there is something rather charming about a rule that attempts–with only limited success–to circumscribe its mayhem.

I am reminded of that rule’s strictures as I observe the anguished reaction to the use of chemical weapons in the ongoing conflict in Syria. A ‘red line’–a moral one–has been crossed, and punishment for the offenders is in the offing. But I do not yet understand fully what marks this line out for us, how its boundaries are to be observed.

It cannot just be the associated violence or the loss of life. Conventional weapons also flay flesh, gouge out eyes, destroy limbs; they drive red-hot metal into your abdomen, your skull, tear out your intestines; they will kill your children just as effectively as chemical weapons will. Do chemical weapons cause more pain? Reading reports of children ‘writhing on the ground and foaming at the mouth’ might make one think so, but this is delusional. If your entrails are ripped out by the shrapnel from an artillery shell, and no aid is forthcoming, your death is very likely to be a prolonged, miserable affair. Moving on from guns and cannons, shells and bullets, how about a knife to the eye or a bayonet in the neck? (Ever heard that gurgling sound someone stabbed in the neck makes?) Are we against slow deaths and for quick ones? How slow is problematic? And why should that make a difference? And clearly, the loss of civilian life does not seem to bother too many; if the rationale for the positioning of the ‘red line’ is to be taken seriously, killing civilians with conventional weaponry–laser-guided bombs? drones?–would be just fine.

You can ban the use of napalm but you cannot stop civilians burning to death when a high-explosive shell sets their house on fire. That same ban will not prevent children seeing their parents die in front of them, or parents their children. Being buried alive should sound like a pretty horrible death to us; that can easily be brought about by those conventional weapons whose use is approved of by our laws and morality.

Our civilization’s qualms about the use of chemical weapons betray a certain inconsistency: if the death of innocents, and the gratuitous pain caused to them, is our central concern, then all forms of armed warfare should constitute a ‘red line’ not to be crossed. But like the NFL and its endless concussions and spinal injuries, we like to keep the dogs of war handy, straining at their leashes, ready to run out and dispatch their prey, and we reserve the right to turn up our delicate noses when they cross some arbitrary bounds of savagery.

War, apparently, is inevitable. We just don’t want to get our hands too dirty with it. We are a fastidious species.

I’d Rather Be ‘Working’?

A New Yorker cartoon shows us a car careening down the street; from the rear, we can make out the silhouettes of a mother and three children in their car-seats; a ball is being thrown up in the air; and on the back of the car, a bumper sticker reads ‘I’d rather be working.’ Parents and non-parents alike chuckle; kids are a pain in the ass, aren’t they? So bothersome that we’d rather return to the workplace, and its sundry oppressions, its dreaded co-workers, its bosses and meetings, its resident bullies and clowns. Yes sir, taking care of kids is no walk in the park, and certainly not even a leisurely drive around the block. We’d rather be dealing with the Dilbertian stupidities of our fellow sufferers in employment than engaging with the exhausting follies of our offspring. (Once this hilarity recedes, some misgivings set in. We see that old pernicious classification at play, the one that says ‘work’ happens in the ‘workplace’ and not at ‘home’, the one that renders the labor of stay-at-home parents invisible. It is a categorization that has been internalized by the caretakers themselves, of course: ‘Are you working today?’ ‘No, I’m staying at home to look after the kids.’)

I was reminded of this cartoon yesterday as I made some tentative inquiries at Brooklyn College about suspending my sabbatical in the spring semester and returning to teaching duties. (I would return to my sabbatical either in the fall of 2014 or the spring of 2015.) My first mention of this to a colleague–and a fellow parent–prompted a guffaw and the rejoinder, ‘You’d rather be back at work, right?’ Other reactions were more incredulous. Why would you want to suspend a sabbatical and return to teaching? Why would you want to return to grading, meetings, committee work, and dealing with university administrators?

Well, for one thing, I’ve been home-bound too long. I was on paternity leave in the spring semester with no teaching duties and spent most days of the week attending to my daughter and taking care of college administrative work–from home. Over the summer, I’ve been a full-time caretaker, a situation that has only changed in the past couple of weeks with the addition of a babysitter for a couple of hours a day and, even more recently, some daycare. I’ve come down with a version of cabin fever, and, as I noted in a post a while ago, I do miss teaching.

Curing myself means leaving home, heading to a library or two, and consequently, seeking more time at the daycare center for my daughter. But daycare is expensive, very expensive; the bills for it can easily rise to a staggering twenty thousand dollars a year. And my sabbatical entails a twenty percent pay cut. (This is still a radical improvement from the situation of a few years ago, when the sabbatical meant a fifty percent decrease.) That’s not great news in a city like New York.

So, perhaps a return to teaching and a full-time salary is on the cards. Perhaps next year, with some savings in the bank, I’ll try the sabbatical again. Decisions, decisions.

Is Economics a Science?

Eric Maskin, 2007 Nobel Prize winner in Economics, responds to Alex Rosenberg and Tyler Curtain’s characterization of economics:

They claim that a scientific discipline is to be judged primarily on its predictions, and on that basis, they suggest, economics doesn’t qualify as a science.

Prediction is certainly a valuable goal in science, but not the only one. Explanation is also important, and there are plenty of sciences that do a lot of explaining and not much predicting. Seismology, for example, has taught us why earthquakes occur, but doesn’t tell Californians when they’ll be hit by “the big one.”

And through meteorology we know essentially how hurricanes form, even though we can’t say where the next storm will arise.

In the same way, economic theory provides a good understanding of how financial derivatives are priced….But that doesn’t mean that we know whether the derivatives market will crash this year.

Perhaps one day earthquakes, hurricanes and financial crashes will all be predictable. But we don’t have to wait until then for seismology, meteorology and economics to become sciences; they already are.

Maskin’s examples should really indicate that seismology and meteorology do make predictions; they just happen to be probabilistic ones, like ‘there will almost certainly be an earthquake measuring 7 on the Richter scale in California in the next hundred years’ or ‘this summer’s Atlantic hurricane season will most likely see more hurricanes in the Caribbean than last year’. It is on the basis of these rough-and-ready predictions and the historical record (and, of course, the extra-scientific assumption that the laws of physics will endure) that building codes in the relevant regions have been changed.

Still, Maskin is on to something: most careless characterizations of science attribute far too many essential features to it.

Consider, for instance, a definition of science that says a scientific discipline necessarily relies on experimentation and produces law-like statements about nature. The former requirement would exclude cosmology, the latter biology. (Rosenberg and Curtain have been careful enough not to talk about laws or experimentation in their description of the ‘essential’ features of science.)

The model of science that Rosenberg and Curtain work with is, unsurprisingly enough, based on physics. Furthermore, the examples they use–predicting the orbit of a satellite around Mars, the explanation of chemical reactions in terms of underlying atomic structure, predictions of eclipses and tides, the prevention of bridge collapses and power failures–are derived from the same terrain. In general, there seems to be much consensus that a putative candidate for scientific status succeeds the more closely its description matches that of paradigmatic theoretical and experimental physics. As this similarity fades, more work has to be done to include the discipline in the scientific cluster.

It is still not clear to me that economics is a science. But that is not because it fails to meet some ‘essential feature’ of science; rather, it is because we still lack a complete understanding of what makes a discipline a science. The characterization problem is a persistent difficulty in the philosophy of science: most definitions of science–as any undergraduate in a philosophy of science class quickly comes to realize–fail to do justice to scientific practice through history and to the actual content of scientific knowledge.

The debate about whether economics is a science is most interesting because it shows the prestige associated with scientific knowledge; a successful classification as a science entails greater acceptance and entrenchment of its claims, and concomitantly, greater support–possibly financial–for its continued practice.

In the marketplace of competing knowledge claims, this is the truly important issue at hand.

Ambition, the ‘Dangerous Vice’ and ‘Compelling Passion’

In reviewing William Casey King’s Ambition, a History: From Vice to Virtue (‘Wanting More, More, More’, New York Review of Books, 11 July 2013), David Bromwich writes:

Machiavelli thought ambition a dangerous vice…for Machiavelli ambition was also a compelling passion—a large cause of the engrossing changes of fortune that happen because “nature has created men so that they desire everything, but are unable to attain it.” All men, the grandees and the populace alike, are implicated in the “nature” that created this unreasoning desire….Francis Bacon was deeply influenced by both Machiavelli and Montaigne….a useful “means to curb” the ambitious, says Bacon, “is to balance them by others as proud as they.” The dry realism of that suggestion would be echoed by Madison in Federalist Number 51: “Ambition must be made to counteract ambition.”…Bacon made his acutest observations on ambition in another essay, “Of Great Place.” Men in great places, he writes, are servants of the state, of fame, and of business:

They have no freedom; neither in their persons, nor in their actions; nor in their times. It is a strange desire, to seek power, and to lose liberty; or to seek power over others, and to lose power over a man’s self.

The path to great place may involve base actions and so “by indignities, men come to dignities.” They buy their power at the price of their own liberty. There is a freedom of the spirit, Bacon seems to say, that has nothing to do with political leverage or social success.

The terrible irony of ambition, which Machiavelli and Bacon so perspicuously capture, is that the same drive that can make us happy by spurring us on to great, hopefully fruitful, effort can be the source of the greatest unhappiness as well. Not for nothing is it said that the time of greatest melancholia in our lives is when we come to realize we must downsize our ambitions, cease our endless prospecting, give up our illusions, and look around for a suitable bower on which to rest our heads and begin the process of reconciling ourselves to a life unfulfilled. The greater the original ambition, the steeper the fall into the darkest recesses of gloom.

Ambition does not just make the ambitious unhappy, of course. All those singed by its flame suffer: sometimes those who support the ambitious and are then cast aside; sometimes those whose ambitions must give way in the face of a greater one.

If ambition is to be a virtue, then it must be infected by yet another one, that of moderation. But the balancing of ambition with realism, the tempering of our drives, the recognition of the presence of the reality principle in our lives, is not an easy task. For we remain haunted by the worry that we might have simply fallen prey to weakness of the will, to laziness and indolence, and sought the easy way out. Homilies like ‘obstacles are what you see when you take your eyes off the goal’ don’t help. This cognitive dissonance might be even more painful than that caused by the dousing of the flames of ambition.

Bacon and Madison’s remarks about balancing ambition suggest a possible means of amelioration: when giving up one ambition, replace it with another, just as great. The ambitious artist may then, for instance, look elsewhere, perhaps inward, considering himself a work in progress, or perhaps outward, finding in some other work a potential reward as great as the ones that drove him previously.

So, there might be no getting rid of ‘ambition,’ but that might be because it may only be a compound description of a host of other, necessary, life-sustaining drives.

Of First and Second Languages – I

Costica Bradatan’s essay ‘Born Again in a Second Language’ made me think of my own homes in the two languages I speak: English and Hindi/Urdu/Hindustani.

Because I grew up in India, English is often termed my ‘second language.’ I, however, describe English as my ‘first language’ because it is the language in which I possess the greatest fluency, vocabulary, and reading and writing proficiency. My reading and writing fluency in Hindi/Urdu/Hindustani is in sharp decline; I have not read a book in Hindi, nor written more than a line in it, for over thirty years now. As I noted in a post here some time ago, one of my reading projects is to read three novels in Hindi by the great Indian novelist Premchand; they sit there on my shelf, waiting for me to muster up the courage to approach them.

I grew up in a mixed-language household; my parents spoke a mixture of English and Hindi to each other; my father spoke predominantly in English with my brother and me; my mother, who had a graduate degree in English literature, spoke in both English and Hindi with us, but the latter often took precedence. The language of the streets around us was Hindi/Urdu/Hindustani, but our social milieu, made up of Air Force officers drawn from all over polyglot India, relied on English. The language of instruction in the schools I attended was English; we learned Hindi as a language in a separate class. The movies we watched in theaters were in English; the weekly Sunday movie was in Hindi/Urdu/Hindustani.

So I grew up bilingual, but the combinatorial explosion of language that takes place in a child occurred, for me, in English, because it was the language of instruction in school, the language in which I was introduced to bookish knowledge, and, as such, the language in which I began to read outside of school. It became the language in which I dreamed, fantasized, speculated, wondered, and schemed. I spoke Hindi with some family members and English with others; I spoke Hindi with some friends and English with others; but when I was by myself with my books, which was a great deal of the time, I thought and imagined in English. It became, very quickly, my ‘first language.’

I stopped studying Hindi in the tenth grade. I had, through sheer tenacity, improved my Hindi reading and writing skills to the point that I secured, after years of embarrassingly bad performances, a decent grade in my last school exam. It was my last hurrah; from then on, I stopped reading Hindi, other than signage and the occasional newspaper.

Over the years, I have taken a semester of German (the Grundstufe Eins, or ‘basic level one’), picked up a smattering of Spanish (how could you not, living in the US?), and acquired some proficiency in the language of my ‘home state’, Punjabi. I dream of attaining fluency in all three and will describe my struggles with them in future posts.

In the meantime, I continue to speak Hindi/Urdu/Hindustani with a certain colloquial fluency (I can certainly curse in it with some elan). But my primary language for communication remains English; it’s what I speak, teach, read, and write in. I enjoy switching back and forth between the two, but I know where my home is.

More on these languages, and my relationships with them, soon.

Edward Mendelson on Anthony Hecht and the Palliations of Poetry

In writing on Anthony Hecht’s poetry (‘Seeing is Not Believing’, The New York Review of Books, 20 June 2013), Edward Mendelson remarks:

In a familiar paradox of art, Hecht’s poems got their structure and strength from his irrational judgments and defensive vulnerability. But Hecht did something deeper and more complex than finding compensations in the perfections of art for the faults of life. What is uniquely unsettling about his poetry is his insistence that its aristocratic poise is helpless against the inner terror that gave rise to it. As he suggests in ‘A Birthday Poem,’ he finds in art ‘a clarity that never was,’ a clarity outside of time that offers only an illusion of escape from the tangled misery of actual and specific moments, naming as an example ‘that mid-afternoon of our disgrace.’

These statements need some untangling.

First, I think Mendelson means to say that it is ‘a familiar irony of art.’ There is no contradiction here.

Second, it is not entirely clear what Mendelson has in mind when he talks about ‘finding compensations in the perfections of art’. Does he mean the compensations are found in the acts and processes of creation, or in the contemplation of the finished work of art? (It is also not clear what Mendelson means by the ‘perfections of art’, but I’ll let that slide for a moment.) To use the taxonomy of palliative measures providing relief from life’s miseries that Freud suggested in Civilization and its Discontents, the former might be considered a deflection, the latter a substitutive satisfaction. A deflection is a re-channeling of our desires into domains where their satisfaction is more tractable, more easily attained; Freud included in this category scientific activity and other avenues of professional achievement (the world of business and finance, for instance). By its grounding in the everyday and its engagement with other forms of human activity, this method of escape from the trials and tribulations of life retains the most connection with reality. A substitutive satisfaction, by contrast, is a form of compensation for lack of pleasure elsewhere; Freud included in this category all forms of illusion or fantasy: religious fervor, day-dreaming, the enjoyment of artistic products such as music, sculpture, and painting.

It should be clear why the former is a deflection; even in the act of creation, the artist may be confronted with the familiar frustrations of life and the unblinking presence of the reality principle–the blank page or canvas, the long torturous path from conception to final product–but expressed in a form that he has the means and resources to combat. In the latter case, we are merely consumers of art–we may not be artists ourselves–giving ourselves over to the enjoyment of the work before us.

Mendelson, of course, is suggesting that Hecht’s poetry makes the claim that these palliations do not work, that their relief is illusory. But this should not be ‘uniquely unsettling’; such a notion is present in the very idea of a palliative measure itself: it does not cure, it merely provides temporary relief.

Skyler White, The Anti-Muse?

Yesterday I wrote a short response to Anna Gunn’s New York Times Op-Ed about the negative reaction to the Skyler White character on Breaking Bad. I want to add a couple of points to that today.

Some of the adverse reaction to Skyler finds its grounding in her instantiation of an archetype that I alluded to yesterday: the domestication, and hence taming, of the artist. Walter White is an auteur, a maven who marries science and art to produce the purest crystal meth possible, who worries incessantly, and proudly, about the quality of his ‘product.’ This is a man obsessed, like all good artists, with whether his vision has been realized, who is capable of endless ‘revision’ and ‘drafting’ to get things to come out just right. His pride may be his downfall, as when he cannot stop himself from bragging to Hank about how Gale was a mere child compared to Heisenberg, but it is a justified pride: his work is just that damn good. Skyler, however, is no such thing. Remember that in the first season we are told she writes short stories and sells items on eBay. The former activity marks her not as creative but as delusional, like all those people who imagine they will write the next Great American Novel; the latter, as a not particularly edifying combination of hustler and parasite. Later she becomes a bookkeeper for Beneke Fabricators.

The contrast is clear: in one corner, creativity, innovation, and enterprise; in the other, dull, stodgy, mundane bean-counting. And more significantly, the brilliant male artist, brought to heel by the cackling, nagging domesticity of home and hearth, his rising star dragged back to earth by the dead weight of the home. An old joke has it that one mathematician wrote to a colleague after the latter’s marriage, ‘Congratulations, you can do more mathematics now’, but in general, the received wisdom is that the artist’s work suffers after marriage. He is called away from his easel, his desk, by the calls from the kitchen and the nursery. Skyler is thus the sand in the wheels of Walter’s artistry; she gets in the way of his work, she prevents him from realizing his potential. We are invited to see her as a millstone and a barrier.

There is an interesting visual grammar to the contrast drawn between Skyler and Walter. As the show progresses, Walter becomes sharper: he loses his hair, starts dressing in black, speaks through gritted teeth, delivers his lines with barely controlled violence, and his actions follow a trajectory of decreasing compromise (like all good artists’!). His rough edges are smoothed; he becomes menacing, not just in his deeds, but in his appearance as well. Compared to him, Skyler appears rooted in the ordinary. Indeed, for a while, she is visibly weighed down by pregnancy, viewed here not as fertility but rather as a symbol of the artist’s enslavement.

It is little wonder Skyler provokes such visceral reactions; her character carries the burden of many pernicious tropes.