The Academic’s Peculiar Dissonance

The academic state of mind is distinguished, I think, by a peculiar kind of dissonance; the academic is able to entertain two conflicting states of being simultaneously; each informs the other and brings to it its peculiar intensity and torment.

At one end of its affective and emotional spectrum lies the well-known impostor syndrome: the academic worries that they are a fraud, unsuited to the rigorous demands of the profession their life’s choices have brought them to; they are beside themselves with anxiety that one day they will be ‘found out’ or, worse, that they will go through the rest of their lives living out this charade, one in which they have somehow managed to convince others–by a toxic combination of lies and artifice and outright dishonesty–that they are purveyors of knowledge, skilled and educated beyond the imaginings of most. They are shocked and surprised and intimidated by the blustering displays of knowledge their fellow academics subject them to; they examine their own achievements and find them wanting in every dimension when compared with those of their colleagues and other contemporaries; they find that academic life, rather than providing occasions on which their knowledge will be on display, instead provides one forum after another in which they discover just how much they don’t know; they enter a bookstore and retreat, intimidated by the talents on display; they are convinced their abilities will never match those of all who seem to effortlessly master domains of knowledge they themselves can only nibble at.

At the other end of the spectrum lies what I will call the ‘frustrated and unrecognized genius syndrome’: the academic is convinced that the world has failed to adequately recognize their unique and distinctive talents and knowledge, all the while paying obeisance to charlatans of all stripes and elevating them to the highest reaches of the profession. They look on with barely contained frustration and anger as accolades and recognition are funneled and channeled to those they consider unworthy; they consider themselves cheated by the vagaries of academic fortune; their books and articles go unread, unremarked, uncited, falling stillborn from the press to be embalmed on the dusty shelves of libraries, while those of utter nincompoops are elevated to the status of icons; they look back on their intellectual careers and remark on the many contingent occurrences that could have, with a slight twist or two, catapulted them into those very zones whose air they yearn to breathe. They are always on the cusp of ‘making it’; but they never do; and they remain convinced that had the chips only fallen the right way, they would reside where those they consider unworthy now do. Fate and fortune have been cruel; accursed is this world and its ways. A prophet is never recognized in his day and age.

This is an uncomfortable state of affairs at best; it afflicts students and professors alike. It infects the life of the mind with its own distinctive anxieties and neuroses; it may account for some of the depressing statistics pertaining to mental health in the profession.

Hug an academic today. Or not.

The Acknowledgments Section As Venue For Disgruntlement

In The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, 1985) David P. Jordan writes in the ‘Acknowledgments’ section:

With the exception of the Humanities Institute of the University of Illinois at Chicago, whose fellowship gave me the leisure to rethink and rewrite, no fund or foundation, agency or institution, whether public or private, local or national, thought a book on Robespierre worthy of support. [pp xi-xii; citation added]

Shortly after I had defended my doctoral dissertation, I got down to the pleasant–even if at times irritatingly bureaucratic–process of depositing a copy with the CUNY Graduate Center’s Mina Rees Library. The official deposited copy of the dissertation required the usual accouterments: a title page, a page for the signatures of the dissertation committee, an abstract page, an optional page for a dedication, and lastly, the acknowledgments. The first four of these were easily composed–I dedicated my dissertation to my parents–but the fifth, the acknowledgments, took a little work.

In part, this was because I did not want to be ungracious and fail to make note of those who had tendered me considerable assistance in my long and tortuous journey through the dissertation. I thanked the usual suspects–my dissertation adviser, various members of the faculty, many friends, and of course, family. I restricted myself to a page–I continue to think multi-page acknowledgments are a tad self-indulgent–and did not try too hard to be witty or too effusive in the thanks I expressed.

And then, I thought of sneaking in a snarky line that went as follows:

Many thanks to the City University of New York which taught me how to make do with very little.

I was still disgruntled by the lack of adequate financial support through my graduate studies: fellowships and assistantships had been hard to come by; occasional tuition remissions had somewhat sweetened the deal, but I had often had to pay full resident tuition for a semester; and like many other CUNY graduate students, I had found myself teaching too many classes as an underpaid adjunct over the years. I was disgruntled too, by the poor infrastructure that my cohort contended with: inadequate library and computing resources were foremost among these. (During the last two years of my dissertation, I taught at NYU’s School of Continuing and Professional Studies and so had access to the Bobst Library and NYU’s computing facilities; these made my life much easier.)

In the end, I decided against it; my dissertation was over and done with, and I wanted to move on. A parting shot like the one above would have made it seem that I still harbored resentments, unresolved business of a kind. More to the point, the Graduate Center, by generously allowing me to enroll as a non-matriculated student eight years previously, had taken a chance on me, and kickstarted my academic career. For that, I was still grateful.

I deleted the line, and deposited the dissertation.

Note #1: An academic colleague who finished his dissertation around the time I did dedicated his dissertation to his three-year-old son as follows:

Dedicated to ‘T’ without whom this dissertation would have been finished much earlier.

Fair enough.

My First Academic Conference

The first academic conference I attended was the 1999 Annual Meeting of the Association for Symbolic Logic, held at the University of California at San Diego. I submitted an abstract for a presentation, which was accepted, and so off I went, hoping to gain ‘experience’ and ‘exposure.’ My paper was based on part of my then in-progress dissertation; to be more precise, it presented the first model of belief revision I was then working on with my thesis adviser.

I had applied for, and received, some limited funds for travel–these barely covered the flight to San Diego and did not help with car rental fees. (I had arranged housing with a philosophy graduate student at UCSD.) I arrived in San Diego, picked up my rental car, and drove to my host’s place. The next morning the conference began, and so did my disorientation.

First, I was at the wrong conference. This meeting’s attendees were mostly mathematical logicians (set theorists, model theorists, proof theorists, recursion theorists, complexity theorists, and the like) – no one was likely to be interested in the model of belief revision I was presenting. It was simply not interesting enough, at the formal and mathematical level, for this crowd. And its philosophical underpinnings and motivations were hardly likely to be of interest either; those were not the sorts of things mathematical logicians looked for in the formal work being presented that weekend.

Second, as a related consequence, I knew no one. This was an academic community I had no previous contact with–I knew no faculty or graduate students in it. I wandered around the halls and rooms, occasionally striking up brief conversations with students, sometimes introducing myself to faculty. My thesis adviser was known to some of the faculty I introduced myself to; this fact allowed for some useful ice-breaking in conversations. (I also managed to embarrass myself by pushing copies of my paper into some hands.) But mostly, I stayed on the periphery of these social spaces.

Third, the subject matter of the talks was utterly unfamiliar and incomprehensible. I had studied some logic, but I was still an amateur. And the concerns of the mathematical logicians who made up the primary attendance at this conference were pitched entirely differently from those of the philosophical logic I had been exposed to: their work was almost entirely concerned with the mathematical properties of the frameworks they worked on. I attended a couple of talks, but all too soon, bewildered and bored, I gave up.

I did not feel I belonged. Not here, not at any academic conference. I was intimidated and made diffident; my doubts about my choice of career and dissertation topic grew. By the second day of the conference, this feeling had grown worse–not ideal preparation for my talk. I quaked in my boots at the thought of being subjected to a grilling by a heavy hitter in the audience; my nervousness knew few bounds. Fortunately, the worst case did not eventuate; I put up my slides, described the work underway, answered a perfunctory question or two, and walked off the ‘stage,’ relieved.

That year, the final year of my dissertation work, I attended three more conferences–a graduate student meeting at Brown, and international professional conferences in Sweden and Greece. By the end of the summer, I was a little more comfortable in my own skin in these spaces. One such attendance almost certainly helped secure me a post-doctoral fellowship. (Yet another saw me lost again among mathematical logicians.)

Over the years, I’ve attended many more. But I never became really comfortable with conferences; I never felt I fit in. Now, I don’t go to conferences any more; the travel sounds interesting, but the talks, the question-and-answer sessions, the social schmoozing, the dinners (and the conference fees!) don’t sound enticing. I prefer smaller-scale, more personally pitched interactions with my fellow academics. But perhaps a suitable conference venue–with mountains close by–will overcome this reticence.

Artificial Intelligence And Go: (Alpha)Go Ahead, Move The Goalposts

In the summer of 1999, I attended my first ever professional academic philosophy conference–in Vienna. At the conference, one titled ‘New Trends in Cognitive Science’, I gave a talk titled (rather pompously) ‘No Cognition without Representation: The Dynamical Theory of Cognition and The Emulation Theory of Mental Representation.’ I did the things you do at academic conferences as a graduate student in a job-strapped field: I hung around senior academics, hoping to strike up conversation (I think this is called ‘networking’); I tried to ask ‘intelligent’ questions at the talks, hoping my queries and remarks would mark me out as a rising star, one worthy of being offered a tenure-track position purely on the basis of my sparkling public presence. You know the deal.

Among the talks I attended–a constant theme of which was the prospect of the mechanization of the mind–was one on artificial intelligence. Or rather, more accurately, the speaker concerned himself with evaluating the possible successes of artificial intelligence in domains like game-playing. Deep Blue had just beaten Garry Kasparov in an unofficial human–machine chess world championship in 1997, and such questions were no longer idle queries. In the wake of Deep Blue’s success the usual spate of responses–to news of artificial intelligence’s advance in some domain–had ensued: Deep Blue’s success did not indicate any ‘true intelligence’ but rather pure ‘computing brute force’; a true test of intelligence awaited in other domains. (Never mind that beating a human champion at chess had always been held out as a kind of Holy Grail for game-playing artificial intelligence.)

So, during this talk, the speaker elaborated on what he took to be artificial intelligence’s true challenge: learning and mastering the game of Go. I did not fully understand the contrasts drawn between chess and Go, but they seemed to come down to two vital ones: human Go players relied a great deal–indeed, had to–on ‘intuition’, and on a ‘positional sizing-up’ that could not be reduced to an algorithmic process. Chess did not rely on intuition to the same extent; its board assessments were more amenable to algorithmic calculation. (Go’s much larger state space was also a problem.) Therefore, roughly, success at chess was not so surprising; the real challenge was Go, and that was never going to be mastered.

Yesterday, Google’s DeepMind AlphaGo system beat the South Korean Go master Lee Se-dol in the first of an intended five-game series. Mr. Lee conceded defeat in three and a half hours. His pre-game mood was optimistic:

Mr. Lee had said he could win 5-0 or 4-1, predicting that computing power alone could not win a Go match. Victory takes “human intuition,” something AlphaGo has not yet mastered, he said.

Later, though, he said that “AlphaGo appeared able to imitate human intuition to a certain degree,” a fact borne out for him during the game, when AlphaGo made a move so unexpected and unconventional that he thought “it was impossible to make such a move.”

As Jean-Pierre Dupuy noted in his The Mechanization of Mind, a very common response to the ‘mechanization of mind’ is that such attempts merely simulate or imitate, mere fronts for machinic complexity–but these responses almost never consider the possibility that the phenomenon they deem genuine, the model for imitation and simulation, can retain that status only so long as the simulations and imitations remain flawed. As those flaws diminish, the privileged status of the ‘real thing’ diminishes in turn. A really good simulation, indistinguishable from the ‘real thing,’ should make us wonder why we grant the original so distinguished a station.

Fearing Tenure: The Loss Of Community

In ‘The Clouded Prism: Minority Critique of the Critical Legal Studies Movement’, Harlan L. Dalton wrote:

I take it that everyone drawn to CLS is interested in specifying in concrete terms the dichotomy between autonomy and community. If so, talk to us. Talk TO us. Listen to us. We have lots to say, out of the depths of our own experiences. For many of us, our sense of community is a strength, a resource, something we struggle to hang onto, sometimes in the most peculiar ways, especially when the pull of autonomy is strongest. The day that I am awarded tenure, should that happy event occur, any pleasure that I experience will be more than offset by the extreme panic that I’m sure will set in; I will worry that I have been propelled (or more  honestly that I have wittingly, selfishly and self-destructively propelled myself) two steps further away from so much that has nurtured me for so long. Even for those of us who have revelled in the sense of connectedness that, paradoxically, racial oppression has conferred upon us, there is a kicker: we don’t have any choice in the matter. We can’t choose to be a part of the community; we can’t choose not to be a part of the community.

When I first read these lines, I was reminded of a conversation that used to recur in some of my therapeutic sessions: Why would you shrink from that which you most–supposedly–desire?

Some insight may be found in Dalton’s confession. Tenure would mean not being part of a ‘community’, membership in which, while a reminder of exclusion from another, was also a belonging in a very particular way. It meant the enjoyment of a very distinctive camaraderie, the dwelling in a state of being that had its own rewards.

I will not attempt to speak for Dalton’s experiences, so let me just briefly address my own. Gaining tenure meant the end of a ‘struggle’; it meant the end of a state in which I had a very ‘clear and distinct’ goal, a terminus of achievement, one that had established yardsticks and baselines for me, calibrating my ‘progress’ and reminding me of how far I had come and how far I still had to go. I saw myself as a member of a group marked by its presence in the margins, by its distance from the center, by a vaguely heroic air of struggle against economic, intellectual, and even political barriers. We were the untenured, the ‘assistant professors’; we had secured the prize of a tenure-track position, but we were still ‘battlers.’ I had trajectories to follow, and I had fellow-travelers. My lot was sympathized with; many were solicitous of the state of my journey, my distance from its destination. I was assured of celebrations and revelries were I to cross the finish line. I could look ahead and see the goal; I could feel my cohort around me, propping me up.

In the midst of all this, even as I desired that onward and upward movement, I knew what I would leave behind: a time and a place in which I was in possession of that dearest of things, a clear and unstinting purpose.

I am well aware that a reflection like this, in the context of today’s job market, is an extremely self-indulgent one. I write it only to highlight the ironic and puzzling nature of the situations that Dalton and those in therapy might find themselves in, and of the artfully hidden blessings of even those portions of our lives that we might find oppressive and worth delivering ourselves from.

Meritocracies, Rankings, Curricula: A Personal Take On Academic Philosophy

Some six years ago, shortly after I had been appointed to its faculty, the philosophy department at the CUNY Graduate Center began revising its long-standing curriculum; part of its expressed motivation for doing so was to bring its curriculum into line with those of “leading” and “top-ranked” programs. As part of this process, it invited feedback from its faculty members. As a former graduate of the Graduate Center’s Ph.D program, I thought I was well-placed to offer some hopefully useful feedback on its curriculum, and so I wrote to the faculty mailing list, doing just that. Some of the issues raised in my email are, I think, still relevant to academic philosophy. Not everybody agreed with its contents–some of my cohort didn’t–but in any case, perhaps it might provoke some discussion here.

Here, reproduced almost verbatim, is that email:

Perhaps I can throw my tuppence in the pile, by offering some remarks based on my experiences as a graduate of this Ph.D program, and by commenting on whether changing the curriculum to bring it into line with “leading” or “top-ranked” programs is a worthwhile motivation or not.

Firstly, I question the need to bring our curriculum into line with that of “leading” programs. I remain unconvinced that rankings of philosophy programs are a serious indicator of the education they provide. In the bad old days, rankings of philosophy programs seemed to be a set of indices that reflected the star power of the faculty. When NYU’s Ph.D program went “live”, its ranking magically jumped to 2 or 3, without that department having produced a single Ph.D, or having given any indicator whatsoever that its graduates were “better philosophers” than the ones produced by ours.

While the Leiters of this world have made their Reports more comprehensive, it is still not clear to me that the rankings say anything worthwhile about how well the ranked programs *prepare* their students. If we had some solid data showing that a particular curriculum is a significant causal factor in the philosophical acumen of its graduates, then I’m all for major change. Without that, I’m a little reluctant to tinker so extensively.

A significant part of the reason why graduates of XYZ University (please replace with your favorite top-ranked department) are able to get good jobs is that they have had:

a) better financial support and are able to concentrate more on coursework and writing projects;

b) more institutional support for research activities like attending conferences and building up a solid professional network;

c) more ‘star faculty’ at their disposal who are then able to tap into their rich network of professional contacts, write the important letters, make the important phone calls after the APA and ensure things like invited chapters in edited collections and the like.

The academy, like most other institutions in this world of ours, follows the Matthew Principle: those that have, get more.

I attended classes at NYU and Columbia, and interacted with graduate students from many of the programs in this region. My cohort was second to none in their philosophical chops. I never thought, “If only our curriculum was structured differently, then we’d be the ones with eight interviews at the APA’s Eastern Division Meeting.”

What we lacked the most perhaps was some sense of professionalization in our discipline. We spent most of our time wondering how we would graduate given our financial situation, how we would clean up those incompletes that had accumulated, and so on. Many of us were not bold enough to send papers to professional conferences or journals. We started to think about publications a little late in the game. This is what needs to change the most in my opinion.

I have a feeling some of this already has. I see more students from this program publishing in professional journals and conferences, learning the vagaries of the reviewing process, and most fundamentally, starting to see themselves as professors in training. May this process continue.

We can most help our graduates by making sure they produce scholarly output by the time they graduate: a publication in a top-ranked journal or two, possibly as the result of a semester-long mentored study with a faculty member. Done right, this could be of considerable value to the faculty member as well. It seems this idea (or some variant thereof) is on the table, and I’m all for it.

My experience with the Grad Center‘s curriculum was largely positive. I enjoyed the Core courses and the broad grounding they provided in the central issues of the discipline. If I had a complaint–and this was echoed by many of my cohort–it was that the classes were often quite ahistorical. Some or most of the reading lists/syllabi were almost exclusively 20th century in content. I would be in favor of standardizing core reading lists so as to make them more comprehensive and rigorous, but I’m not overly optimistic that any sort of consensus would be reached.

My exam experiences were mixed. I enjoyed studying for the written and oral exams because, again, I felt I gained a synoptic perspective on the discipline. Of the exams, the oral exam was the most useful. I felt one of the written exams had become a little silly because its questions had become predictable. And the other exam was so far out in left field that I felt blindsided by the lack of a definitive reading list. But this problem has been taken care of–I believe–thanks to structured reading lists. I’m not against getting rid of the comprehensives, because the education they aim to impart can be provided by other requirements.

I did my 60 credits for coursework as follows: six cores (Metaphysics, Epistemology, Philosophy of Language, Ethics, Logic, Social and Political Philosophy); one independent study in Mathematical Methods for Physicists at NYU; one class on Space and Time at Columbia; one class on Film and the City at the GC; and eleven other classes from our Departmental offerings. I felt my education was well-rounded, and that I had numerous opportunities to specialize in many different fields. At no stage in my Ph.D or during the job hunt, did I feel the curriculum had been a problem.

I wished more professors had urged me to convert my term papers into conference presentations, or to take the germ of an idea in there and explore it further, possibly for a conference presentation or a journal article. That’s what I felt was missing.

As always, I would be very interested in comments.

‘Don’t Call Me A Philosopher’

I cringe, I wince, when I hear someone refer to me as a ‘philosopher.’ I never use that description for myself. Instead, I prefer locutions like, “I teach philosophy at the City University of New York”, or “I am a professor of philosophy.” This is especially the case if someone asks me, “Are you a philosopher?” In that case, my reply begins, “Well, I am a professor of philosophy…”. Once, one of my undergraduate students asked me, “Professor, what made you become a philosopher?” And I replied, “Well, I don’t know if I would go so far as to call myself a philosopher, though I did get a Ph.D in it, and…”. You get the picture.

I’m not sure why this is the case. I think folks who have Ph.Ds in mathematics or physics or economics, and who teach those subjects and produce academic works in those domains, have no hesitation in calling themselves mathematicians or physicists or economists.

Part of the problem, of course, is that in our day and age, in our culture, ‘philosopher’ has come to stand for some kind of willful pedant, a non-productive member of society, content to not contribute to the Gross Domestic Product but to merely stand on the sidelines and take potshots at those who actually produce value. The hair-splitter, the boringly didactic drone. (Once, shortly after a friend and I had finished watching Once Were Warriors, we began a discussion of its merits. As I began pointing out that the director’s explicit depiction of violence toward women might have been necessary to drive home a broader point about the degradation of Maori culture, my friend interrupted, “There you go, being philosophical again! Can’t you just keep things simple?”).

But this modern disdain for the ‘philosopher’, this assessment of her uselessness, her unemployability, is not the only reason that I shrink from being termed one. There is another pole of opinion that I tend toward: ‘philosopher’ sounds a little too exalted, a little too lofty; it sounds insufferably pompous. It comes packaged with too many pretensions, too many claims to intellectual rectitude and hygiene. Far too often, that title has served as cover for too many sorts of intellectual prejudice. To describe myself thus, or to allow someone else to do so, would be to permit a placement on a pedestal of sorts, one I’m not comfortable occupying. (This situation has not been helped by the fact that when someone has described me thus in company, others have giggled and said “Oh, you’re a philosopher now?” – as if I had rather grandiosely allowed such a title to be assigned to me.)

This discomfort arises in part from my self-assessment of intellectual worth, of course. I do not think I am sufficiently well-read in the philosophical literature; there are huge, gaping gaps in my education. I remain blithely unaware of the contours of many philosophical debates and traditions; the number of classics that I keep reminding myself I must stop merely quoting and citing, and actually read, just keeps on growing. I do not write clearly or prolifically enough. And so on. (Some of these feelings should be familiar to many of my colleagues in the academic world.)

For the time being, I’m happy enough to make do with the opportunity that I’ve been offered to be able to read, write, and teach philosophy. The titles can wait.