The Convenient Construction Of The Public-Private Distinction

Revolutions are public affairs; revolutionaries bring them about. They fight in the streets, they ‘man’ the barricades, they push back the forces of reaction. And then, they go home for the night, to a meal and a warm bed. There, they rest and recuperate, recharging the batteries of uprising, ready to battle again the next day. Revolutionaries are men, doing the real work, out in the public sphere; their home fronts are staffed by women, whose job is to sustain the revolution’s domestic aspects.

In The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, Chicago, 1985, pp. 57-58), David P. Jordan writes:

Although Robespierre was most at his ease in the midst of bourgeois domesticity, he depended upon others to create such an environment for him. Left to himself, he would have perpetuated his solitude in bleak rented rooms. It is worth noting that he fought the Revolution from the comfort of a bourgeois home. His passivity, his willingness to have others look after him, bespeaks an indifference to the mundane. He knew nothing of the marketplace; in Paris, as it had been in Arras, food awaited him at table, including the fruits he adored. Similarly, he knew nothing of the conditions of the desperately poor, with whom he never fraternized extensively. And there is no record that he ever went next door at the Duplays’ to talk to the carpenters in the shop. [citation added]

“An indifference to the mundane.” The home is the site of the mundane, the ordinary, the dull and dreary. Outside, the public sphere, where the non-domestic happens, is where the extraordinary takes place. That is the zone of men, the revolutionaries; the home is where women (and perhaps some servants), like a pit-stop crew, get the smooth machine of revolution up and running again with an oil and tire change for the body and mind. The revolutionary, from his lofty perch, can look down on and disdain these mundane offerings, whose underlying labor is not deemed worthy of recognition in manifestos intended to stir the masses to action.

The excerpt above is drawn from a book published in 1985, two years after Carole Pateman’s classic feminist critique of the public-private dichotomy appeared in print.¹ It shows, in paradigmatic form, the standard (male and patriarchal) construction of the public-private distinction in political theory. The ancient Aristotelian understanding of the polis as the sphere for politics and civic life, and the home as the venue for a much lower form of life, persists here. Jordan does not note Robespierre’s detachment from the domestic with approval, but he does not find anything problematic in it either; instead, it appears as the sort of bemused indifference that we associate, quite romantically, with artists, writers, poets, and others too intent on cultivating their creativity to be bothered with the ‘mundane’ particulars of life. In this history, the public fray rises above domestic scurrying; the men hover above the women below.

Note 1: Carole Pateman, ‘Feminist Critiques of the Public/Private Dichotomy,’ in Public and Private in Social Life 281, 281 (S. I. Benn & G. F. Gaus eds., 1983).

RIP Muhammad Ali: Once And Always, The Greatest

Muhammad Ali was the first Black Muslim American I heard of. Before his name entered my immature consciousness, I did not know Americans could be Black or Muslim. (This revelation came to me during a classroom trivia quiz; ‘Muhammad Ali’ was the answer to the question ‘Who is the world heavyweight champion?’) It is hard now, more than forty years later, to adequately describe the presence that Muhammad Ali had in the lives of young boys like me in the 1970s. Ali was the Greatest; there was no disputing it. He went down, and he came back up. He had his jaw broken; he lost his title; he faced prison. But he kept fighting, literally and figuratively. I read his The Greatest as a young boy and quickly memorized its details: his Louisville childhood, his Golden Gloves bouts, his 1960 Olympic gold medal, his throwing the medal into a river in response to Jim Crow experiences at a local restaurant, his turning professional, the precocious career, the fights with Sonny Liston, the draft resistance, the surrender of the title, the loss to Ken Norton, the epic bouts with Joe Frazier and George Foreman, the comebacks. (The Greatest ends with the Zaire fight; a postscript mentions the Thrilla in Manila with Joe Frazier.)

It was a story that, once again, introduced me to a side of America I did not know about; it was an awakening and an enlightenment.

I think I dimly understood as I read Ali’s autobiography that I was not reading the story of an ordinary sportsman, that there was no way to make sense of Ali’s life without thinking about the racial politics in which it was embedded. You just could not. ‘Nigger’ is a very common word in The Greatest; you hear it when Cassius Clay wants to be served at a whites-only restaurant; you hear it in the story Ali is told by a black man about how he was castrated by the Ku Klux Klan in the American South; you hear it in the dismay over his conversion to Islam, as he changed his name to ‘Cassius X’ and then, ‘Muhammad Ali’; you hear it in the epithets hurled Ali’s way after he refused to participate in the war crime known as ‘Vietnam’; you hear it in the glee of those cheering for his opponents; you could, if you cocked your ear at the right angle, hear it in the recurring fantasy of the Great White Hope who would show up to whip this upstart black man’s ass and teach him some manners.

Because that’s what Ali didn’t have. He didn’t have manners. He was rude; he spoke about things people didn’t want sportsmen to talk about: racism, apartheid, white supremacy, an immoral foreign policy. He gatecrashed a party in which sports champions, especially black ones, were expected to be polite and deferential and grateful to their white backers for having been lifted out of the poverty that was otherwise their birthright. Ali would not settle for such handouts; he wanted nothing less than a full seat at the table.

Ali was a very good boxer too. He was the nimblest heavyweight of all; he had great footwork; he threw a mean jab. He was never famous as a big puncher, but he still knocked out many of his opponents. His most incredible achievement remains his beating of George Foreman in 1974. It is worth remembering that Foreman had knocked out–in very early rounds–Joe Frazier and Ken Norton, two men who had taken Ali the distance in long, brutal bouts of battering; Ali was expected to lose comprehensively to him. Instead, Ali knocked Foreman out in the eighth round with a straight right. An astonishing result in an astonishing fight.

There are people today who still imagine that sports can, and should be, divorced from politics. Muhammad Ali married the two; he was a sportsman who was a politician. He fought political battles every time he stepped into a ring and dropped into his fighting crouch; he fought them every time he answered a question at a press conference, knowing that reporters wanted copy that would confirm stereotypes of dumb, hulking brutes who directed primeval force at their civilized white opponents. Ali walked away from fame and fortune when he was at the height of his powers; he could have simply taken up a cushy military job behind the front lines, visited some troops, performed the modern equivalent of a minstrel show, and done his bit to ‘keep the troops happy’ with a few witty lines. He would have come back to safety soon enough, and could have fought every fight from that point on under the banner of ‘American soldier’ or ‘war veteran.’ He could have kissed the collective ass of national self-righteousness, and asked the nation to shower its kind blessings on him; instead he handed out generous helpings of stubborn defiance.

A couple of years after I had arrived in the US, I spent an afternoon drinking with a friend in a bar in New Jersey. As the evening crept up, an old man at the counter went on a rant about Ali, about how he could have been the greatest, but he threw it all away: “all he had to do was to serve in the military just like every other young man in his time did.” Yes, that’s ‘all’ he had to do. And he didn’t. He knew the simplicity and ease of the path not taken; he knew the difficulties of the fork he did choose. It is crucial to the Ali legend that we understand his greatest bravery lay not in his ability to take a beating, to withstand a punch; it lay in his defiance of common sense and consensus, in his refusal to seek out easy popularity, to swim with the tide.

Ali was a black man in America; that fact alone made him a fighter. He knew that every time he stepped into the ring; and he knew his fights didn’t end when he stepped back out. His participation in that continuing struggle, and his awareness of it, made him the Greatest, once and always.

RIP Muhammad Ali.

The ‘Ideal Marriage’ And Its Painful Sexual Ignorance

In Making Love: An Erotic Odyssey (Simon and Schuster, New York, 1992, pp. 32-33), Richard Rhodes writes:

Somehow I acquired a copy of Dutch physician T. H. van de Velde’s Ideal Marriage, published in the United States in 1926, the most popular marriage manual in America until The Joy of Sex came along. Ideal Marriage was wise and tender about love but euphemistically vague and sometimes criminally misinformed about sex. Van de Velde promulgated the sexist conviction that both partners in an act of intercourse should come to orgasm at the same time. “In normal and perfect coitus,” I read in his book and believed for years afterward, “mutual orgasm must be almost simultaneous; the usual procedure is that the man’s ejaculation begins and sets the acme of sensation in train at once.” Impossible to measure how much pain that single ignorant sentence caused. It must have baffled hundreds of thousands of men and agonized hundreds of thousands, at least, of women. I took it for God’s truth when I read it–wasn’t it printed in a book? How did Van de Velde arrive at such a bizarre conclusion? From his own experience? From unsupported theory?

Color me baffled too, even if I cannot, like Rhodes, blame Van de Velde for this state of affairs. I did lay my hands on Van de Velde’s book as a pre-teen boy–a furtive glance or two at a copy that my parents owned, tucked away in some secret hiding place, which I had miraculously uncovered. My heart racing as I realized I was dealing with an illicit text that purported to reveal the secrets and mysteries of an increasingly intriguing and alluring zone of human interaction, I quickly leafed through its pages before hastily replacing it in its sanctum sanctorum and backing away. I promised to return when I had more time, when I was less worried about being caught, but that moment never came again.

But the myth that Van de Velde sought to perpetuate made the rounds anyway; perhaps in the softcore pulp fiction that I read like a maniac in my pre-teen and teen years, or perhaps in the way that sex was depicted on screen, where matters proceeded smoothly between two equally competent partners with nary a touch of awkwardness, anxiety, insecurity, clumsiness, or dissatisfaction. An education–in many dimensions–awaited me in my sexually mature years. Euphemisms and bravado would count for little; only the right kind of hand waving would do.

Note: I own a copy of Alex Comfort’s The Joy of Sex; a girlfriend and I bought it as a giggle many years ago, and we took turns snickering at its artful pencil drawings and sometimes purple prose. It had dated a little too quickly and now seemed corny (and sexist in all too many of its recommendations and observations). As I browsed its pages, I was reminded of the computer nerd’s response to Comfort’s catchy title: a guidebook to the X Window System titled The Joy of X. I don’t own a copy of that, but I wish I did.

India’s IIT Graduates Go Mainstream: Via Campus Shooting, The American Way

Graduates of the prestigious Indian Institutes of Technology (IIT) are part of American life: professors, technology officers, and scientists at Ivy League universities, Silicon Valley start-ups, and industrial research and development laboratories. But these are rarefied environs, exclusive precincts for the technocratic elite; the IIT graduate’s presence here places his cultural achievements in a fringe zone visible only to a select minority. Now, with news of an IIT graduate’s participation in a campus shooting, IIT graduates might finally have gone mainstream in the most American of ways: by using a firearm to settle a dispute.

The man who fatally shot a UCLA professor in his office before turning the gun on himself Wednesday has been identified as Mainak Sarkar. He was a former doctoral student who had once called his victim, William Klug, a “mentor,” but in recent months he had written angry screeds accusing him of stealing his computer code.

Police have identified Sarkar as the gunman in yesterday’s murder-suicide that locked down the UCLA campus…Sarkar submitted his doctoral dissertation in 2013, and in the 2014 doctoral commencement booklet, Klug, a mechanical engineering professor, is listed as his advisor…Sarkar had previously earned a master’s degree at Stanford University and an undergraduate degree in aerospace engineering at the Indian Institute of Technology in Kharagpur. Until August of last year, he had worked as an engineering analyst for a rubber company called Endurica LLC.

The academic CV follows a standard template: an IIT, a top-flight American institution, some technical professional experience. And then, things go wrong: a personal relationship deteriorates:

In his acknowledgements, he wrote to Klug, “Thank you for being my mentor.” A source told the Times that Klug bent over backwards to help Sarkar on his dissertation and to graduate, even though Sarkar’s work wasn’t always high-quality. This source is appalled that Sarkar would later accuse Klug of stealing his code to give to another student: “The idea that somebody took his ideas is absolutely psychotic.”

On March 10, Sarkar wrote on a blog, now archived:

William Klug, UCLA professor is not the kind of person when you think of a professor. He is a very sick person. I urge every new student coming to UCLA to stay away from this guy. […] My name is Mainak Sarkar. I was this guy’s PhD student. We had personal differences. He cleverly stole all my code and gave it another student. He made me really sick. Your enemy is your enemy. But your friend can do a lot more harm. Be careful about whom you trust. Stay away from this sick guy.

Sarkar resolved his personal crisis with his former mentor and adviser with a gun. Admittedly, it was only an unglamorous 9mm semi-automatic pistol (perhaps even legally owned and registered), not one of those devastating ‘assault rifles’ that normally gets everyone’s ire up after the latest mass shooting. And Sarkar didn’t go for the full-fledged massacre; he settled for a ‘one and done’ deal. But in his cleaving to the Way of the Gun, he made his pledge of allegiance, his desire to be All-American, his assimilation strategy of choice, all too clear.

Does Donald Trump’s ‘Pragmatism’ Mean Pragmatism Is Incoherent?

A devastating accusation is making the rounds in America: Donald Trump is a pragmatist; therefore pragmatism is an incoherent ethical and political philosophy. This breathtakingly simple argument establishes its solitary premise by making note of Trump’s assertions that he will do what it takes to fix America’s problems. His supposed inconstancy–his curious admixture of populism, authoritarianism, and the occasional progressive standpoint–furnishes the best possible proof: Trump is no ideologue, committed to a rigid manifesto; there are no sacred cows; all is at play when it comes to devising solutions for whatever ails America. (This claim underwrites assessments by Joe Dan Gorman, Mychal Massie, P. M. Carpenter, and John Porter, and achieves its condemnatory form in a Washington Post article by Christopher Scalia.)

The picture of pragmatism that is implicit here is that of a toolkit of solutions geared to solving problems. So far, so good. Things get terrible, as Scalia seems to assert, when those means and ends are divorced from values:

Ultimately what sets pragmatists apart from traditional conservatives or liberals is not their faith in the effectiveness of their ideas, it’s their originality — the whatever, not the works….there’s nothing in the Pragmatist’s Playbook that forbids mocking a rival’s face, height, footwear, eating habits, energy level or spouse, or even encouraging supporters to physically assault protesters. And although it’s certainly reprehensible to promote absurd conspiracy theories — like Trump’s suggestion that my father, Justice Antonin Scalia, was assassinated — it’s not necessarily unpragmatic.

The condemnation of pragmatism now immediately follows: because there are no abiding values to guide the pragmatist–all is up for contestation and revision–the pragmatist is as likely to flirt with fascist principles as he is with social democratic ones. The pragmatist is at heart unprincipled, committed to a brutally reductive and desiccated means-end, cost-benefit, outcomes-oriented analysis.

You say that like it’s such a bad thing.

This critique of pragmatism glibly commits two fallacies. First, it assumes that such an outcome-oriented analysis is devoid of values; au contraire, the choice of ends–which guides the choice of solutions–is very much informed by values. For instance: Which ends should we concentrate on first? Which ones are most ‘important’ for us? Which ones can we ‘afford’ to ignore for now? Ends are not so easily divorced from values.

Second, the critic of pragmatism assumes that he or she has at hand a set of values which will ‘correctly’ guide the supposedly amoral, purely instrumentalist pragmatist in problem-solving; moreover, these values will be the ‘right ones’ to set the offending pragmatist back on the path of moral rectitude. Add some values–the ones I have in mind–and all will be well. The problem is that disagreement about values is the most interesting part of being evaluative and normative; what if the value-guided non-pragmatist happens to be inspired by ‘wrong’ values like racism or sexism?

Pragmatism did not aim to banish values from ethical and moral discourse; it only bid us examine ours more closely to see why we hold them, and under what circumstances we would be willing to relinquish them. Our values reflect our ends, and our ends reflect our values; this is the inseparability that lies at the heart of pragmatism, and which the facile claims and critiques above all too easily elide.

Donald Trump might be a pragmatist, but that does not mean he escapes normative political critique; that option remains as open for the pragmatist as it does for anyone else.

CUNY And The Public University That Couldn’t

In the fall of 2015 I taught my philosophy of law class in a hostile environment: my classroom. With windows and doors open, it was too noisy for me to be heard; with windows and doors closed and the air conditioner turned on, it was still too noisy. With the air conditioner turned off, it was too hot. We–my students and I–struggled with this state of affairs into November, until it finally became cool enough for us to conduct the class with the door and windows closed. Till then, sometimes we shouted, sometimes we sweated; mostly we fretted and fumed, irate and vexed by this latest evidence of the City University of New York’s inability to provide a working infrastructure to facilitate its educational mission.

Over the weekend, the New York Times finally brought to this city’s attention a state of affairs at CUNY that has been a grim reality for its students and staff for too long: a severely underfunded educational institution that has gone from being an ‘engine of mobility’ to the little public university that couldn’t. A crumbling physical infrastructure; no contracts for its staff and faculty; an overpaid administration; reliance on underpaid contingent labor; all the pieces for eventual failure are here. A strike might yet happen in the fall.

It is common, among progressives, to bewail the continued underfunding of public education as an act of class warfare, one animated by racist prejudice. It is worth making that claim explicit: public education is a threat to established social, economic, and political orders; it threatens to bring education–not just textual knowledge, but critical thinking, reading, and writing–to the disenfranchised and politically dispossessed; that fact, on its own, paints a bullseye on public education’s back, inviting pointed assaults by a surrounding neo-liberal order. Make no mistake about it: public education is under attack because it is seen as serving the wrong communities for the wrong reasons.

New York City’s financial health is considerably better than it was during those periods when the university was fully funded by the city and the state, when it was able to educate the children of immigrants and send them out to work the engines of the nation’s economy and move themselves and their families up the rungs of American life. But priorities have changed over the years. Now city and state budgets must attend to other things: university administrators and their desires for bigger salaries and plusher offices; management consultants and their latest pie-charted dreams of ‘process’ and ‘best practices’ and ‘unique selling propositions’; capital projects that do not advance core educational missions; and a host of other diversions that have nothing to do with learning. Run education like a business: shortsightedly, with an eye to the next quarter’s profits; learning be damned.

A nation that denies the value of public education, that makes it into the privileged property of a few, to be paid for under severely usurious terms, is not a republic any more; it has dynamited the wellsprings of its social and political orders.


The ‘But The Supreme Court’ Argument For Hillary Clinton

One ‘hold-your-nose-and-vote-for-the-lesser-evil’ argument currently making the rounds for the Hillary Clinton candidacy–ostensibly intended to address the ‘schism’ in the Democratic Party, among the ‘Left’ and ‘progressives’–goes something like this. Vote for Hillary Clinton, even if you disagree with many of her policies, do not consider her entirely trustworthy, and would much rather vote for Bernie Sanders–because she will nominate the right person, the right Justice, to the US Supreme Court. (The Senate will not confirm a nominee put up by President Obama, so this will be one of the first tasks awaiting the new President next year.) No matter what you think, you cannot allow a President Trump to nominate a right-wing ideologue to the Supreme Court, who would then roll back years of hard-won legal victories in many domains: perhaps abortion rights, perhaps voting rights, perhaps the power of regulatory administrative agencies to keep our workplaces safe and our drinking water clean.

It is worth noting how much this argument presumes and concedes.

First, and most importantly, the American political system is broken. There is no separation of powers; the judiciary and the executive branch are the new legislatures. The Supreme Court is now a full-blown political institution. Political change will not come about because people’s representatives will legislate their desires and demands into existence; rather, an unelected group of Yale and Harvard educated lawyers will respond directly to petitioners who seek to address some perceived injustice. Persuade the justices; do not bother with the ballot box. Unless you are voting for President.

Second, it places too much faith in the ability of the Supreme Court to drive substantive social and political change. The poster child for this sort of claim is Brown v. Board of Education, which, despite its ruling, left segregation largely intact; and as a vigorous debate among professional court watchers–a motley crew of legal scholars and political scientists–confirms, supporting examples can be found quite easily. Despite the expressive impact of the courts and their rulings, political change does not happen because courts direct the polity to change; rather, it occurs because citizens organize and exert pressure at the right places and on the right actors–in a variety of political domains and institutions.

Third, it suggests–as if acknowledging the unprecedented obstruction of a sitting President by the Republicans over the last eight years–that the President is a lame duck from the moment he or she drops his or her right hand on being sworn in. No substantive legislation can be driven by that office; the Constitution offers no escape; a recalcitrant House of Representatives and Senate cannot be forced to perform their legal duties. The President can merely nominate a Supreme Court Justice; and that too only before the final year of office (apparently the new normal, given what has transpired since Justice Scalia’s death).

It is into this impoverished and diminished political landscape that we are steered by the ‘but the Supreme Court’ argument for Hillary Clinton. We are being asked to settle for an immensely diminished Republic.

Gerard Manley Hopkins’ Mountains Of The Mind

A few years ago, while visiting my brother in India, I browsed through his collection of mountaineering books (some of them purchased by me in the US and sent over to him). In Robert Macfarlane’s Mountains of the Mind: Adventures in Reaching the Summit, I found the following epigraph:

O the mind, mind has mountains – Gerard Manley Hopkins, c. 1880

It wasn’t the first time I had read Hopkins’ immortal line. And my first reaction to it, and to its embedding in the poem in which it features, made me question Macfarlane’s deployment of it as an epigraph to his book, and indeed, in its title.

Macfarlane’s book is, as an excellent critical review on Amazon notes, “a series of essays following the development and transitional phases of Western European conceptions of the ‘mountains’ and exploring the mountains.” Man is fascinated by the mountains; bewitched and bewildered, we seek to climb them, hoping to find on their slopes and summits nothing less than our true selves, brought forth and revealed by adversity. Or perhaps mountains will grant us access to the key to this world’s mysteries; visions will be induced in our journeys that will pull back the curtains and reveal what lies beneath the surface and appearance of reality. Mountains have many roles to play in our projects of self-imagination and construction–in Macfarlane’s narrowly conceived Anglocentric sphere, at least. (This last critical point is the primary focus of the review linked above.)

But what is Hopkins’ line doing, serving as an epigraph to such a book? Hopkins’ poem is about melancholia; indeed, it might be one of the most powerful and moving explorations of the mind’s travails. Here is how I read his line: our mind is capable of entertaining thoughts and feelings which contain within them chasms of despair, points at which we stare into a dark abyss, an unfathomable one, with invisible depths. These are our own private hells, glimpses of which we catch when we walk up to the edge and look. The effect on the reader–especially one who has been to the mountains–is dramatic; you are reminded of the frightening heights from which you can gaze down on seemingly endless icy and windswept slopes, the lower reaches of which are shrouded in their own mysterious darkness; and you are reminded, too, of the darkest thoughts you have entertained in your most melancholic moments.

In Macfarlane’s book, the fear that mountains evoke in us is a prominent feature of man’s fascination with them (this suggests, too, the interplay between terror and beauty that Rilke wrote about in the Duino Elegies). But melancholia does not feature in Macfarlane’s analysis. Macfarlane seems to read the line as saying that our fascination with mountains stems from the fact that our mind itself contains mountains, that some part of our primeval sense responds to them. This is not what Hopkins was writing about. He uses mountains as an image to convey the depths visible from their heights, as a symbol of how far we may fall in our melancholia. Fear is present for Hopkins, but in a wholly different manner; we dread the depths to which we may sink in our ruminations. That is not the kind of fear Macfarlane addresses; it is related only peripherally to his concerns.

Robespierre On The Iraq War

In 1792, Revolutionary France debated, and prepared for, war. It was surrounded by monarchies that cared little for this upstart viper in the nest; and conversely, a sworn “enemy of the ancien regime” could not but both despise and fear what lay just beyond its borders: precisely the same entity in kind as was being combated at home. War often seemed inevitable in those days, and it would come, soon enough, on April 20th, when France declared war on Austria. But Revolutionary France, true to its spirit, devoted considerable time and energy to debating the decision to continue politics by such means.

Some, like the war’s most “categorical…passionate [and] persuasive” proponent, Jacques Pierre Brissot, “imagined war rallying the country behind the Revolution and forcing the duplicitous King [Louis XVI] either to support the war and the Revolution or reveal his counterrevolutionary intentions.” But just as important was the “crusading” aspect of the war:

War would carry liberation to the oppressed peoples of Europe, groaning still under the despotism France had thrown off. [Jordan, p. 84]

The most passionate opponent of the war was Maximilien Robespierre. His denunciation of the war plans had many dimensions, and they remain remarkably prescient and insightful.

First, Robespierre noted that “during a war the people forget the issues that most essentially concern their civil and political rights and fix their attention only on external affairs.” Because war is conducted by the “executive authority” and the military, during war, the people direct “all their interest and all their hopes to the generals and the ministers of the executive authority.” Such slavish devotion to those in power–especially since in conducting the war, Revolutionary France would have been “fighting under the aegis of the Bourbon Monarchy”–results in a characteristically eloquent denunciation: War is “the familiar coffin of all free peoples.”

But it was the “crusading” and “utopian” aspect of the war that seemingly most troubled Robespierre, for in it he could detect an internal incoherence. The vision of lands and peoples invaded by Revolutionary France welcoming their conquerors struck him as risible:

The most extravagant idea that can be born in the head of a political thinker is to believe that it suffices for a people to enter, weapons in hand, among a foreign people and expect to have its laws and constitution embraced. [It is] in the nature of things that the progress of reason is slow [and] no one loves armed missionaries; the first lesson of nature and prudence is to repulse them as enemies.

Robespierre did not waver from this conviction. A year after France had entered that period of its history which would be characterized by almost incessant outbreaks and declarations of war, Robespierre wrote that “One can encourage freedom, never create it, by an invading force.”

The historical illiteracy of those who declared war on Iraq is oft-commented on; here is yet more evidence for that claim.

Note: This post is cribbed from David P. Jordan’s The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, 1985), pp. 82-86; all quotes and citations originate there.

The Acknowledgments Section As Venue For Disgruntlement

In The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, 1985), David P. Jordan writes in the ‘Acknowledgments’ section:

With the exception of the Humanities Institute of the University of Illinois at Chicago, whose fellowship gave me the leisure to rethink and rewrite, no fund or foundation, agency or institution, whether public or private, local or national, thought a book on Robespierre worthy of support. [pp. xi-xii; citation added]

Shortly after I had defended my doctoral dissertation, I got down to the pleasant–even if at times irritatingly bureaucratic–process of depositing a copy with the CUNY Graduate Center’s Mina Rees Library. The official deposited copy of the dissertation required the usual accouterments: a title page, a page for the signatures of the dissertation committee, an abstract page, an optional page for a dedication, and lastly, the acknowledgments. The first four of these were easily composed–I dedicated my dissertation to my parents–but the fifth, the acknowledgments, took a little work.

In part, this was because I did not want to be ungracious and fail to make note of those who had tendered me considerable assistance in my long and tortuous journey through the dissertation. I thanked the usual suspects–my dissertation adviser, various members of the faculty, many friends, and of course, family. I restricted myself to a page–I continue to think multi-page acknowledgments are a tad self-indulgent–and did not try too hard to be witty or too effusive in the thanks I expressed.

And then, I thought of sneaking in a snarky line that went as follows:

Many thanks to the City University of New York which taught me how to make do with very little.

I was still disgruntled by the lack of adequate financial support through my graduate studies: fellowships and assistantships had been hard to come by; occasional tuition remissions had somewhat sweetened the deal, but I had often had to pay full resident tuition for a semester; and like many other CUNY graduate students, I had found myself teaching too many classes as an underpaid adjunct over the years. I was disgruntled, too, by the poor infrastructure that my cohort contended with: inadequate library and computing resources were foremost among these. (During the last two years of my dissertation, I taught at NYU’s School of Continuing and Professional Studies and so had access to the Bobst Library and NYU’s computing facilities; these made my life much easier.)

In the end, I decided against it; my dissertation was over and done with, and I wanted to move on. A parting shot like the one above would have made it seem as if I still harbored resentments, unresolved business of a kind. More to the point, the Graduate Center, by generously allowing me to enroll as a non-matriculated student eight years previously, had taken a chance on me, and kickstarted my academic career. For that, I was still grateful.

I deleted the line, and deposited the dissertation.

Note: An academic colleague who finished around the time I did dedicated his dissertation to his three-year-old son as follows:

Dedicated to ‘T’ without whom this dissertation would have been finished much earlier.

Fair enough.