Paying Attention To The Muses’ Visits

In The Year of Magical Thinking–a book on which I will write a bit more anon–Joan Didion quotes her late husband, John Gregory Dunne, as saying that having a notebook handy–to write down a thought or an idea, to be filed away for future reference and deployment–was the difference between being able to write and not.

There is much truth in this utterance of Dunne’s.

A couple of years ago, when I started blogging here, I would find myself thinking about blogging topics as I walked to and from work (or my gym). On those occasions, I would wait till I got home to scribble my thoughts on a notepad on my desk. But sometimes, those thoughts were too fleeting to survive; I would, with some dismay and, often, mounting panic, rummage in my memory stores, seeking desperately to find that little flash of inspiration that had suggested itself as such a fertile avenue of written exploration. Bizarrely enough, it took a few months before I started to do something about this state of affairs.

Unsurprisingly enough, I relied on a technical aid: the ubiquitous smartphone. I began making tiny notes on a ‘scratchpad’ on my phone, quickly writing down, misspellings and all, the fragments of whatever thought had crossed my mind as I rode the subway and read a book. I hoped to return to these later. Sometimes I did, and found the seed was still a viable one, and I would turn it into a full post. Sometimes, on re-inspection, I found a mere incoherent ramble, a passing fancy that would not bear the weight of writing on it.

I did not just write down ideas for blog posts, of course. On some occasions, a tactic for resolving a sticky section of writing in a book project would suggest itself to me–‘get rid of the section on X’ or ‘move the bit about Y to the end of the chapter’–and a way out of an impasse would become crystal clear. Again, here too, on actually sitting down and confronting the text, my assessment of the worth of the putative brainwave could change; my visit from the muse had not been as fruitful as I had previously imagined.

There are times, and I always pay for them, when I forget the wisdom of Dunne’s observation, and I am too lazy to pull out my phone to write down my supposed inspiration. I cannot be bothered to put down my book; my phone is in my backpack; the subway is too crowded. Whatever the reason, I reassure myself I will make notes when I arrive at my destination. But I almost never do. And thanks to a peculiar transience associated with such thoughts, they do not survive and persist. Irate at my lack of attention, they move on to more attentive and grateful minds. I call out again and again, but they are gone, leaving not even a wispy trace in their wake. There is no way to call them back again, except perhaps to get back to work.

Naguib Mahfouz On Forgetting And Habit

In Naguib Mahfouz’s Autumn Quail, Isa, the corrupt bureaucrat whose long, slow, and painful decline after a purge following the 1952 revolution in Egypt the novel tracks, brings Riri, a woman of the night, back to his home. The next day,

He woke up about noon and looked with curiosity at the naked girl sleeping next to him. Recollections of the previous night came back and he told himself that as long as oblivion and habit still existed, everything remained possible. [Anchor Books, 2000, p. 374]

Many a student of human nature is puzzled by the subject of his study. Why do humans behave as irrationally and self-destructively as they do? Why do they hurt the ones they love? Why can they not learn from their mistakes?

Perhaps because they forget; perhaps because they are compelled.

If memory can be so constitutive of identity, then forgetting gives us the chance to make ourselves anew. Forgetting ensures pain and joy once experienced are consigned to the past, and cannot act as guides to the future: they cannot inhibit us, they cannot impel us to repeat a once-experienced pleasure, they cannot make us shrink from the venue of an older disaster. The burnt child fears the fire, but the forgotten fire provides no instruction for action when a flame is encountered again. Then the child becomes the moth; our expectation of its behavior is reconfigured. We, too, may pass by a previously experienced domain of pleasure, now strangely reluctant to sample its joys again; we have already buried, pushed out into oblivion, memories of our times within its confines. We carry on, unaware that we have foregone an opportunity to experience that which once enthralled us so. This blithe ignorance may take us elsewhere, toward experiences and interactions that provoke novel responses from us.

Habits, conditioned and impressed, sent deep into the innermost recesses of our being, continue to drive us on too. They may become more than mere regularities in behavior; they may appear as instincts, innate and congenital. We may mark out zones of catastrophe, and expect no one will venture into their precincts, but the persona habitually conditioned to push open their doors and enter will continue to do so. And habits may send us, again and again, long after the shocks and the pleasures of the new have worn off, seeking old exaltations and ecstasies, hoping they will be as productive of joy as they once were. The resultant inevitable disappointment, so clearly visible to the observer, and which might seem to act as an inhibitor, does not have that effect; the habitual commissioner of acts carries on long after all assessments of his actions as reasonable have ceased.

Forgetting and habit send us on into this world as a curious mixture of the new and the archaic. Such a creature is curiously open to possibility while simultaneously entrenched in older ways of being. The interaction between the two can be productive of much that might appear initially implausible. ‘Everything remains possible’ indeed.

A Most Irritating Affectation

The most irritating affectation of the modern intellectual is to pretend to be technically incompetent. I exaggerate, of course, but I hope you catch my drift. Especially if you’ve encountered the specimen of humanity that I have in mind. (Mostly on social media, but often in person too.)

The type is easily identified: an intellectually accomplished individual–perhaps by dint of academic pedigree, perhaps by a body of public work, or just plain old visible ‘smarts’–claims that they are incompetent in modern technology, that they simply cannot master it, that they cannot wrap their heads around the tools so many of their friends and colleagues seem to have so effortlessly mastered. (‘Oh, I have no idea how to print double-sided’; ‘Oh, I have no idea what you mean by hypertext’.) They are just a little too busy, you see, with their reading–good ol’ dead-tree books, no Kindles or Nooks here!–and their writing–well, not on typewriters, sadly, but word processors, for some change really cannot be resisted. Rest assured, though, that they have to call for help every time they need to change the margins or fonts or underline some text.

This absorption in old-fashioned methodologies and materials of learning thus marks them as gloriously archaic holdovers from an era which we all know to have been characterized by a greater intellectual rectitude than ours. While the rest of us are slaves to fashion, scurrying around after technology, desperately trying to keep up with the technical Joneses, our hero is occupied with the life of the mind. So noble; such a pristine life, marked by utter devotion to the intellect and free of grubby mucking around with mere craft.

Why do I find this claim of incompetence to be an irritating affectation? My suspicion is easily provoked because I find posturing in all too many places–as I did above, in expressions of faux modesty, sometimes called humblebrags in the modern vernacular–but here, I think, is the rub. Those who profess such incompetence merely outsource to us the work of learning the tools we all learn to do our work. They are unwilling to put in the time to learn; they are too busy with their important work; we are not, for we have, after all, shown that we have time to spare. We should help–it is now our duty to aid their intellectual adventures.

A claim to incompetence should not be occasion for cheer, but it is. We are, after all, ambivalent about the technology that so dominates, regulates, and permeates our life; we are, all too often, willing to cheer on evidence that not all is well in this picture of utter and complete absorption in technique. We applaud this disdain; we wish we were so serene, so securely devoted to our pursuit of knowledge. We are also, of course, clapping wildly for a rebellion of sorts, a push-back against the creeping march of technology into every corner of our lives.

I think we can find better heroes.

Note: I’m willing to make some concessions for those over the age of fifty, but anyone younger than that bragging about their technical klutziness needs a rhetorical kneecapping.

The Fall Of Norman (And Norma) Bates

We know the story of Norman Bates:

Norman had been excessively dominated by his mother since childhood, and when she took a lover, he became insanely jealous that she had “replaced” him, then murdered his mother and her lover. Later, he developed a split personality to erase the crime of matricide from his memory and “immortalize” his mother by stealing and “preserving” her corpse. When he feels any sexual attraction towards someone, as was the case with Marion [Crane], the “Mother” side of his mind becomes jealous and enraged. At times, he is able to function as Norman but other times, the “Mother” personality completely dominates him….Norman, in his “Mother” state, had killed two missing girls prior to Marion [Crane].

Sometimes you can know the ending, and not worry about spoilers. This is certainly the case when watching Bates Motel–the television series that supplies a prequel to Psycho. We are well aware we are about to view the descent into madness, into full-blown psychosis, of a young man who, well before we came to know him as a cross-dressing, knife-wielding, homicidal maniac, was sweet and shy. This knowledge, this dissipation of suspense, does not diminish the tragedy unfolding before us; it makes it all the more tragic, because we see the characters inexorably moving toward their pre-determined fates. (This fact, this eventual degradation and decline, caused in part by the psychological trauma our personal relationships can inflict on us, makes for difficult watching at times; I suspect some of my sensitivities are particularly acute because I’m a parent now.)

Bates Motel is not an ordinary prequel; it is a reboot, for it displaces the original Psycho in both space and time (from California to Oregon, and from the 1960s to the present era). But by retaining its central characters and pathologies, it ensures it does not stray too far from the original’s creepiness. And indeed, it might be that it supplies what was always the most intriguing and understated aspect of the original, one only touched upon briefly in its resolution: How did Norman become Norman? What was his relationship with his mother like? Who was his mother? What was she like?

Psycho is sometimes described as the first psychoanalytical thriller; fittingly, Bates Motel’s primary virtue is that it enables an archaeology of its story. It fills out, and makes available for inspection, the contours of Norman and Norma’s pre-history; it tells the story that Norman might have recounted on a therapist’s couch. Bates Motel clearly considers the task of supplying a full-blown causal story to account for Norman’s psychosis one that lies beyond its competence, for its writers assume some pre-existent pathology; but even then, we are promised some development and ‘progression’–hopefully, an artful blend of responses to both innate dysfunction and environmental abuse.

Bates Motel can only end with Marion Crane’s car pulling into the parking lot. In this series, at least, we know what the finale will be; we can now get back to clucking over the details that get us there.

Book Release Announcement: Eye on Cricket: Reflections On The Great Game

I’m pleased to announce the release of my second book on cricket–‘the game, not the animal, or the cartoon character’: Eye on Cricket: Reflections on the Great Game (HarperCollins, 2015; online sale point in India here). It brings together a collection of essays based on my blogging at ESPN-Cricinfo over the past six years. (These essays are not mere reproductions of those posts but significant extensions, revisions, and reworkings.)

Here is the cover:

[Cover artwork]

Here is a foreword by Gideon Haigh, cricket’s pre-eminent historian:

I suspect that Samir Chopra and I were born to at least correspond. His blog profile lists as his interests ‘cricket, free software, military history, military aviation, hiking, tattoos, industrial music, travelling’. I don’t necessarily share those interests—although the military and musical tastes overlap—but they are interesting, and he brings to his cricket writing the sort of well-stocked and free-ranging mind ever in short supply.

Cricket bloggers have a tendency to come and go, say what they have to say, and move on. Samir, I think, gets better and better. I enjoy his style of taking a stray or miscellaneous pensée, then comparing and contrasting, unpicking and elaborating, until a surprisingly rigorous argument has been constructed and a provocative conclusion reached, whether it’s that Andy Flower had a nerve asking India to withdraw its run-out appeal for Ian Bell at Trent Bridge in 2011, or that Fire in Babylon was frankly overpraised—views I happen to share, although that is less the point than that Samir makes such trenchant yet civil cases. They are like watching cricket with a thoughtful and challenging companion. Perhaps these are the conversations Samir would like to have had with someone at a Test match, but, alas, has had to conduct with himself in his self-imposed east-coast American exile. If so, we’re fortunate that he’s condemned to partake of his cricket by the interwebs in splendid isolation, as he describes in another lovely cameo here.

There was a lot that set me nodding in Eye on Cricket, in recognition and assent. Yes, sport is grossly overstuffed with martial imagery; yes, I also tend to appraise every library by what is on the shelves at 796.358. Being one himself, Samir understands the ‘playing fan’ and the vernacular cricketer with great acuity. ‘No game, no physical or cultural endeavour, can survive or be sustainable if held aloft only by the efforts of those most proficient at it,’ should hang in a gilded frame in the office of every cricket administrator. I revelled in cricket-nerdish references to a light appeal by Sew Shivnarine, and to ‘Kirti Azad’s Finest Hour’ too. I’d read many of these pieces previously, yet was struck by how well they cohered in this collection—it was almost as though Samir had been unconsciously working towards this totality all along. I am surprised only that he has never written anything about Jade Dernbach. After all, it would bring together two of his interests. Over to you, Samir.

Here is the jacket description:

In Eye on Cricket, Samir Chopra, a professor of philosophy and a long-time blogger at ESPNcricinfo, offers us a deeply personal take on a game that has entranced him his entire life in the several lands he has called home.

In these essays, Chopra reflects on a childhood centred on cricket, the many obsessions of fandom, the intersection of the personal and the political, expatriate experiences of cricket, historical regrets and remembrances, and cricket writing and media.

Nostalgic, passionate and meditative, Eye on Cricket is steeped in cricket’s history and its cultural significance, and reminds the most devoted spectators of the game that they are not alone. It shows how a game may, by offering a common language of understanding, bring together even those separated by time and space and culture.

On Becoming Canadian

I’ve become Canadian. By that, I don’t mean that I’ve acquired Canadian citizenship, begun enjoying universal healthcare and ice hockey, started bragging about how much bigger Canadian grizzlies are than American ones or how much better Molson’s is than Miller’s. And so on. Rather, it’s just that I have become blasé about the cold weather that has been gripping the US East Coast this winter. And saying things like “This is such a beautiful day” on days when the temperature is just above freezing point.

There was a time, not so long ago, when temperatures below the freezing point were conversation-worthy and worth dressing up for. The thermometer would drop below 32F–or 0C, as we Canadians like to put it–and I would hasten to wear a pair of long-johns before heading out for the day. Hat and gloves were, needless to say, de rigueur. And on arriving at my destination, I would make sure to say something like “Damn, it’s freezing out there.” The roaring twenties induced this sort of reaction in me all too easily; the teens, ever so rare, provoked adjectives that were rather more extravagant.

But this winter, the twenties and the teens have been all too common, almost as common as the many, many snowflakes that have come drifting down from the heavens. And indeed, so have single-digit temperatures. (Dropping as low as 2F–or about -17C–at one point.) I know residents of the American Midwest and the great Canadian plains will snicker at this city slicker dropping these piddling temperatures about him like badges of pride. But trust me: I now know why you feel that way.
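For those keeping score at home, the conversion is simple arithmetic:

$C = \frac{5}{9}(F - 32), \qquad \text{so } F = 2 \text{ gives } C = \frac{5}{9}(2 - 32) \approx -16.7,$

which rounds to the -17C quoted above.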

For now, I find myself increasingly unfazed by the cold. I don’t wear long-johns any more; I’ve just become used to a pair of frozen lower extremities. (Please don’t be distracted by the double entendre.) Hats and gloves, common accessories for the twenties, are now only so for the teens. And I hardly ever talk about the weather. (I just blog about it. The fact that the weather has made it to this blog should perhaps indicate that I’ve run out of things to say. That may be so.)

In this new, complacent-about-the-cold state, many deep thoughts occur to me: Is it true that cold is just relative? That man can get used to just about anything? (Nietzsche did say once that man could tolerate anything so long as he knew the ‘why’ of it. I have to admit that the technical details of this year’s cold snap, which involve depressing news about the melting of the Arctic ice cap, its effect on ocean currents, the jet stream, and masses of cold air sitting on Siberia, have certainly made this year’s cold comprehensible.) Will privation make me appreciate abundance? (That is, will this year’s spring or summer seem especially salubrious?)

I’ll admit that I don’t know the answers to these profound inquiries. I do know that Niagara Falls is prettier from the Canadian side, that Wayne Gretzky is the greatest, and that everything tastes better with maple syrup.

A Bad Teaching Day

Yesterday, I had a bad teaching day.

First, I was scattered and disorganized in my Twentieth Century Philosophy class; I repeated a great deal of material we had already covered; I offered only superficial explanations of some important portions of the assigned reading; I did not answer questions from students satisfactorily. (It was pretty clear to me by the end of the class that I did not know how to explain Wittgenstein’s argument against private languages to a novice.)

Then, fifteen minutes later, I walked into my Philosophical Issues in Literature class–where we were scheduled to discuss Jose Saramago’s Blindness–and floundered again. (Though not as badly.) Here, I largely failed to satisfy myself that I had covered all the bases I wanted to. For instance, I was unable to bring the class discussion around to a consideration of Saramago’s satirical tone, his view of humanity, the novel’s take on technology, and the state’s reaction to sudden catastrophe–all important in studying Blindness. Instead, the discussion ran in several different directions, and I felt entirely unsure that I had done a good job of keeping it coherent.

Later, after a break of a couple of hours, I traveled to Manhattan to teach my graduate Nature of Law seminar. Now, I struggled because of faulty syllabus design. My fifth and sixth weeks of the class were ostensibly to be devoted to studying legal realism. For the first of these two weeks, I assigned three essays by Justice Oliver Wendell Holmes; for the second, a selection of articles from an edited anthology. There were two problems with this choice. First, the readings were disproportionately assigned to the two weeks–the first required the students to read a mere forty-five pages, the second, approximately two hundred and twenty. Second, and more seriously, some of the readings for the second week should really have been assigned as companions to the Holmes essays. This poor design almost immediately manifested itself in the class discussion.

It was quite difficult to discuss the Holmes essays without the surrounding context–historical and legal–that the additional readings would have provided. As a result, my students and I found ourselves either listening to me lecture about that missing component, or returning, again and again, to the same central theses of Holmes with which the class session had begun. (Indeed, I found myself repeating some points ad nauseam.) As the class wore on, I could not fail to notice that my students were losing interest; perhaps the assigned readings hadn’t been substantive or provocative enough. Perhaps.

That expression, of students fading out, is a killer. I almost ended the class early–one normally scheduled to run for two hours–but not wanting to admit surrender, hung on for dear life. With ten minutes to go, my students were packing up. I desperately sought to show them the reading at hand had more depth in it, looking for a money quote that would illustrate, brilliantly, a point I had just been trying to make. I didn’t find the one I was looking for, and had to settle for a lame substitute.

Which is how the class ended, lamely.

Hours later, after I had reached home, had dinner, and begun to settle down for the night, I was still fuming. This morning, it continued. And here I am, writing a blog post about the whole day.

Teaching can be a wonderfully invigorating experience; it can also be painfully demoralizing.

Mary McCarthy On Henry Mulcahy’s Selfishness

In Mary McCarthy’s The Groves of Academe, John Bentkoop, a faculty member at Jocelyn College, offers his take on his beleaguered colleague, Henry Mulcahy, who has set in motion schemes of varying deviousness in his bid to hang on to his precious position after receiving a dismissal notice from the college president:

Hen has a remarkable gift, a gift for being his own sympathizer. It’s a rare asset; it could be useful to him in politics or religion….He’s capable of commanding great loyalty because he’s unswervingly loyal to himself….Very few of us have that. It’s a species of self-alienation. He’s loyal to himself, objectively, as if he were another person, with that feeling of sacrifice and blind obedience that we give to a leader or a cause. In the world today, there’s a great deal of free-floating, circumambient loyalty that fixes itself on such people, who seem to offer, by their own example, the possibility of a separation from the self that will lead to a higher union with the self objectified in an idea. It’s Hen’s fortune or his fate to have achieved this union within his own personality; he’s foregone his subjectivity and hypostatized himself as an object.

There is no doubt Mulcahy’s ‘gift’ speaks to what could be a great and valuable skill: it enables the kind of fidelity and commitment to a greater purpose that is so often conducive to desirable forms of self-disciplining and to a channeling of personal energies towards a sought-after goal. (This goal will be, in all probability, one only of interest to Mulcahy.) Indeed, it is Mulcahy’s greatest strength–such as it is–that he is so utterly dedicated to himself and his life’s projects. He knows, with little self-doubt, who is number one. Bentkoop does not invoke narcissism here but there is no doubt the loyalty he refers to flirts with such notions.

Bentkoop’s suggestion that Mulcahy’s self-loyalty would be of most use in politics and religion is thus entirely appropriate: a determined politician or preacher needs to sound–most of all, to himself or herself–entirely sure of his or her political or moral rectitude. Only someone with utter loyalty to themselves could be so convinced.

Mulcahy thus seems to have achieved what many others seek so desperately: some cause, some leader, some channeling of our otherwise all-too-disparate energies toward a coherent objective. Fidelity and commitment to something–if only we knew what it was! Mulcahy has the answer: first, engage in a psychological maneuver–unspecified by Bentkoop–to transcend one’s own subjectivity, and then regard oneself–and one’s goals–as a distant other to be approached with loyalty and desire. Thus, perhaps, who knows, we might even find the desirable balance between narcissism and self-abnegation.

As The Groves of Academe shows, the problem with Mulcahy’s loyalty to himself is that he does not find this balance: he is all too quick to sacrifice others to his cause. His colleagues, his family, his students are all merely pawns, incidentals in a larger enterprise. McCarthy’s view of Mulcahy’s moral failings–forced upon him by the news of his possible firing–is acutely unsparing.

The readers of her novel are neither the first, nor the last, to discover that self-loyalty is sometimes just an exalted name for selfishness.

Of Therapy And Personal And Academic Anxieties

Reading some of the discussion sparked by Peter Railton’s Dewey Lecture has prompted me to write this post.

In the fall of 1996, I began studying for my Ph.D. qualifier exams. I had worked full-time at a non-academic job for the previous year, saving up some money so that I could take a month or two off and study for my exams. I had notes; I had copies of the previous years’ exams. I was set. I began reading my way through an unofficial reading list.

As I worked, my mood swung between extreme anxiety and over-confidence. There were times I felt I would breeze through my pair of inquisitions; on other occasions, I would fight a rising tide of panic at the thought of sitting in a classroom, an empty blue-book in front of me. Sometimes, I would rise early, drink two cups of coffee, smoke a few cigarettes, look through my notes, and decide I could not read any more, just because the reading was making me anxious. Sometimes, I would check out, smoking pot all day before returning to work again the next day. Sometimes I wondered what the point was of a long, endless pursuit of a degree that would only guarantee unemployment at the end of it all. I was lonely and isolated in my apartment; my girlfriend returned home late at night from her corporate job.

One day, I worked out in the morning, returned to my apartment, stared long and hard at the papers in front of me and burst into tears, sobbing on and off for about thirty minutes. The next day, I called a friend to ask for help.

Three years previously, shortly after I had begun graduate school, I had met my friend at a student party. Over a beer, she had told me she was in ‘therapy.’ I was surprised to hear her talk about it openly, as something she ‘needed’, which ‘kept her from going nuts.’ Then, in the fall of 1993, it had not been even six months since my mother had passed away after a long struggle with breast cancer, and I knew I was still mourning. I had often felt, in the months that had passed, a melancholia that was not easily dispelled by the immersion in school and off-campus work and the long hours of drinking in bars that were my primary modalities for treating it. I had flirted with the idea of seeking help for a mood that was stubbornly resistant to being lightened, sensing that I was not in the grip of a garden-variety change in mental disposition.

But therapy seemed like a cop-out. Many of my male friends spoke disparagingly of it, of the culture of whining it seemingly created, the endless childish blaming of parents for adult pathologies. Therapy seemed wimpy, not manly enough; it seemed like a solution for those not strong enough to deal with life’s adversities, who wanted to wallow instead in self-indulgent pity parties on therapists’ couches.

So I had held back, hoping I would just ‘deal with it’ and get better. But I noticed little change; I easily descended into gloom and doom; I struggled with sleep, with drinking too much, with staying in romantic relationships; I found anxiety and panic to be constant companions. I never used the d-word to describe myself, but I often suspected I was depressed.

In the fall of 1996, with my qualifier exams creating many new opportunities for questioning my self-worth, and thus further compromising my fragile sense of being held together, I had finally broken down. I went looking for help.

My friend directed me to the Institute for Contemporary Psychotherapy in Manhattan where, after intake interviews, I began therapy twice a week. A year later, I considered taking anti-depressant medication and consulted a psychiatrist for an evaluation. The good doctor told me he could prescribe one of the most popular medications at the time–Prozac or Serzone. I agreed, but then panicked and said I didn’t want to start. I continued with my talk therapy. But it was a secret; I told no one, and continued to feel like I had ‘copped out.’ Sometimes this secrecy would require elaborate subterfuge; I would tell friends I had to leave them to ‘run an errand’, sometimes walking in the wrong direction, away from my intended train station.

A year later, I changed therapists. I had felt like I was going in circles. Much had changed; I had passed my qualifiers, passed my oral exam with distinction, and also ended my older relationship and begun a new one. It was time for a new therapist too.

I found a therapist and resumed therapy twice a week. I continued to keep my therapy a secret (from everyone except my girlfriend and my friend). I finished my dissertation and, for the semester that I was in the US after completion, stayed with the same therapist. My move to Australia meant my therapy would be interrupted. I took this break in stride, telling myself that perhaps I could move on now, a new person in a new land.

But a few months after I had moved to Sydney, I was looking for help again. I found a therapist–a Kleinian, interestingly enough–and began visiting him twice a week. I was struggling with the usual anxieties academics suffer from; these seeming ephemera jostled with my struggles with a long-distance relationship, with subterranean feelings of fear and non-belonging, and an anxiety that never vacated the basement. I crossed an important barrier when I told some good friends–including a particularly near and dear male friend–that I was in therapy; that openness felt liberating.

After I returned to New York to take up my current position, my therapy was interrupted again. Two years later, I called up my old therapist to find out if he would take me back as patient; he was agreeable, but he had moved. I gave up looking for therapists, unwilling to go through the process of finding a compatible one. Over the years, on several occasions, I would go searching for therapists, look through web pages, and even make a few phone calls. But I never went all the way. I stayed hesitant; finding a good therapist had been hard work, and I seemed unwilling to do it all over again. I wondered if a cognitive behavioral therapist might not work better for me, compared to the analytical types I had previously worked with. Some good friends of mine urged me to resume therapy, sensing from some of my pronouncements that I might need it. (My career moved along; I was tenured and became full professor, but I never stopped doubting that I belonged in this profession, never stopped suspecting that I was simply not smart enough, hard-working enough. And I never stopped missing my long-departed parents.)

I haven’t started therapy again. Perhaps I dread its ‘ramping up’ phase too much; perhaps I have convinced myself my ‘workarounds’ are adequate; perhaps I’m ‘cured.’ I’m not sure but whatever the answer, I’m glad my graduate school friend helped me out when she did, that she urged me to overcome my hesitancy and discomfort about seeking professional help, that I was able to speak openly and frankly with my friends that I had done so. I am now a father and my anxieties have not diminished; if anything, they have increased. Perhaps I will seek help again. I won’t be shy about telling my friends I’ve done so.

Steven Weinberg’s History Of Science Syllabus

A few years ago, on reading–perhaps in the New York Review of Books–Steven Weinberg mention his teaching an undergraduate history of science class at the University of Texas, I wrote to Weinberg:

Professor Weinberg,

[…] I believe you teach a class on the history of science at UT-Austin. I would be very interested in perusing your reading list for this class. Would it be possible for you to send me an electronic copy?

Weinberg wrote the following brief email:

My reading list consists of a set of collections of original sources, such as Heath’s Greek Astronomy, and Matthews’ The Scientific Background to Modern Philosophy, and Plato’s Timaeus. I include some handouts, like xeroxed copies of pages from Ptolemy’s Almagest, etc. I add Kuhn’s The Copernican Revolution. [links added]

Because Weinberg did not send me an electronic copy of his syllabus, and because I thought he would have if he had had one, I did not persist in asking him for it. And I did not ask for any more detail about the unspecified portions of his syllabus. Weinberg seemed like a pretty busy guy, and I didn’t think he’d be interested in entertaining curricular inquiries from a perfect stranger. (Yes, I know, I was being a little star-struck by a Nobel laureate.)

It is hard to evaluate this syllabus in its incomplete state. Still, there is certainly philosophy in it, as well as some interesting original sources. (Matthews’ collection, for instance, makes available some brief excerpts from the writings of Copernicus, Galileo, Descartes, Newton, etc.) Some of Weinberg’s selections–because of their archaic language–are likely to be challenging for the modern undergraduate; the readings are definitely non-trivial and not light. The list is, as might be expected, biased towards physics; there is little biology to be seen. Of primary interest to me is that there is almost no ‘modern’ history of science in it: that is, no work by contemporary academic historians of science. Rather, Weinberg relies on ‘classics’ like the Heath collection, Kuhn, and primary works from the periods of interest (the Almagest, the Matthews, the Timaeus, etc.). I wonder if this disregard is because Weinberg distrusts modern academic treatments of the history of science, suspecting them of smuggling illicit philosophical speculation into their ‘critical histories.’ Weinberg’s own skeptical attitude toward modern philosophy of science might inform such a selection.

Unsurprisingly, almost all of the readings above would function well in many philosophy of science classes. The Heath alone might be considered a historical supplement in a straight philosophy class, but it, too, could feature in a more comprehensive ‘History and Philosophy of Science’ class–like the kind I suggested in an earlier post on the philosophical education of scientists. The inclusion of the Timaeus is quite intriguing. It remains a rarely taught Platonic dialogue, in part because of its style, which makes it not a particularly easy read, but also because of its subject–cosmology.

I’m not going to take the liberty of suggesting additions to this syllabus, largely because I don’t want to speculate about what might be in those unnamed ‘handouts’, but it does seem to me that some supplementation with philosophy of science could turn this into a useful history and philosophy of science class.