Brave Analytic Philosophers Use Trump Regime To Settle Old Academic Scores

Recently, Daniel Dennett took the opportunity to, as John Protevi put it, “settle some old academic scores.” He did this by making the following observation in an interview with The Guardian:

I think what the postmodernists did was truly evil. They are responsible for the intellectual fad that made it respectable to be cynical about truth and facts. You’d have people going around saying: “Well, you’re part of that crowd who still believe in facts.”

Roughly, postmodernism brought you Donald Trump. If only Trump voters hadn’t read so much Deleuze or Derrida or Spivak, we wouldn’t be in the mess we are now. Dennett has now been joined in this valiant enterprise of Defending Truth and Knowledge by Timothy Williamson who makes the following remarks in an interview with The Irish Times:

No philosophical manoeuvre can stop politicians telling lies. But some philosophical manoeuvres do help politicians obscure the distinction between truth and falsity.

When I visited Lima, a woman interviewed me for YouTube. She had recently interviewed a ‘postmodernist’ philosopher. When she pointed at a black chair and asked ‘Is that chair black or white?’ he replied ‘Things are not so simple’.

The more philosophers take up such obscurantist lines, the more spurious intellectual respectability they give to those who try to confuse the issues in public debate when they are caught out in lies. Of course, many things in public affairs are genuinely very complicated, but that’s all the more reason not to bring in bogus complexity….

Obviously it wasn’t mainly postmodernism or relativism that won it for Trump, indeed those philosophical views are presumably more widespread amongst his liberal opponents than amongst his supporters, perhaps most of whom have never heard of them. Still, those who think it somehow intolerant to classify beliefs as true or false should be aware that they are making it easier for people like Trump, by providing them with a kind of smokescreen.

In the course of an informal Facebook discussion, I made the following responses to Dennett’s remarks (which I described as ‘very silly’):

[We] could just as well lay the blame on the very idea of truth. Perhaps if truth weren’t exalted so much, we wouldn’t have so many people claiming that they should be followed just because what they said was the truth. Especially because many lies really are better for us than some truths. Perhaps we would have been better off seeing what worked for us, rather than obsessing about naming things as true or false.

Fascist insurgencies like the ones here in our country are not relying on post-modern critiques of truth and fact to prop up their claims; they need only rely on something far simpler: the fact that talking of truth and facts grants them an aura of respectability. To elevate (or demote) this political debate to a matter of metaphysics and epistemology is to play their game, because we will find these pillars of ours actually rest on sand. Far better to point out to proponents of ‘alternative facts’ that these facts will not help them send their kids to school or cure their illnesses. Let us not forget that these ‘facts’ help them in many ways now: they find them a community, make them secure, give vent to their anger, and so on. I’ve never liked the way everyone is jumping up and down about how some great methodological crisis is upon us in this new era, as if it were entirely unprecedented. People have used ‘fake news’ and ‘alternative facts’ all through history to achieve political ends.

On a related note, Ali Minai responds to another set of claims against ‘relativism’ made in an article in The Chronicle of Higher Education by Alan Jay Levinovitz:

In fact, it is non-relativism that has generally been the weapon of choice for authoritarians. The weaponization of “alternative facts” may be aided by relativism but its efficacy still relies on the opposite attitude. It works only when its targets accept “alternative facts” as actually true.

What these responses to the Defenders of Truth Against Relativism make quite clear is the following:

  1. So-called ‘postmodern’ critiques are, more often than not, the political weapons of choice for those critiquing authoritarian regimes: they serve as a theoretical grounding for claims against ‘dominant’ or ‘totalizing’ narratives that issue from such regimes.
  2. Non-relativism or absolutism about truth is the preferred theoretical, argumentative, and rhetorical platform for authoritarians. Let us not forget that science was a challenge to the absolutism about truth that revealed religions claimed to profess; the Enlightenment brought many ‘alternative facts’ in its wake. Those worked; their ‘truth’ was established by their working. All the mathematical proofs and telescope gazings would have been useless had the science and technology built on them not ‘worked.’
  3. When fascists and authoritarians present ‘alternative facts’ and reject established knowledge claims, they do not present their alternative claims as ‘false’ because ‘truth’ is to be disdained; rather, they take an explicitly orthodox line in claiming ‘truth’ for their claims. ‘Truth’ is still valuable; it is still ‘correspondence’ to facts that matters.

The target of the critiques above, then, is misplaced several times over. (Moreover, Williamson’s invocation of the philosopher who could not give a ‘straightforward’ answer to his interlocutor is disingenuous at best. What if the ‘postmodernist’ philosopher wanted to make a point about colorblindness, or primary and secondary qualities? I presume Williamson would have no trouble with an analytic philosopher ‘complicating’ matters in such fashion. What would Williamson say to Wilfrid Sellars, who might, as part of his answer, say, “To call that chair ‘black’ would be to show mastery of the linguistic concept ‘black’ in the space of reasons”? Another perfectly respectable philosophical answer, one Williamson would presumably not find objectionable. Williamson’s glib answer to the question of whether the definition of truth offered by Aristotle is correct is just that: glib; surely he would not begrudge the reams of scholarship produced in exploring the adequacy of the ‘correspondence theory of truth,’ what it leaves out, and indeed, the many devastating critiques leveled at it? The bogus invocation of ‘bogus complexity’ serves no one here.)

Critiques like Williamson’s and Dennett’s are exercises in systematic, dishonest misunderstanding of the claims made by their supposed targets. They refuse to note that it is the valorization of truth that does all the work for the political regimes they critique, that it is disagreement about political ends that leads to the retrospective hunt for the ‘right, true, facts’ that will enable the desired political end. There is not a whiff of relativism in the air.

But such a confusion is only to be expected when the epistemology that Williamson and Dennett take themselves to be defending rests on a fundamental confusion itself: an incoherent notion of ‘correspondence to the facts’ and a refusal to acknowledge that beliefs are just rules for actions–directed to some end.

Cioran on Academic Writing’s ‘Forms of Vulgarity’

In ‘The Addict of Memoirs’ (from Drawn and Quartered, Arcade Publishing, New York, 1983/2012), E. M. Cioran writes:

Is there a better sign of “civilization” than laconism? To stress, to explain, to prove–so many forms of vulgarity.

Bergson is said to have said–somewhere–that time spent in refutation is time wasted¹. There is, evidently, a sympathy between these pronouncements that I make note of. Cioran suggests laconic forms of expression avoid three undignified sins that burden our writing. For in stressing, in explaining, in striving to prove–requirements placed upon us by the need to persuade, to change minds, to engage dialectically–we may stray far afield of our original intentions in writing. We write to communicate and only secondarily to persuade; what matter if we don’t?

Cioran’s assessment of the dialectical aspects of writing is harsh, but rare is the academic writer who would not heave a sigh of relief were he or she spared their burdens. Consider, for instance, the trappings of academic writing: the elaborate piling on of references in opening sections to indicate–to referees, almost never to readers–that adequate scholarship is on display; the careful invocation of select objections and their refutation to shore up the central thesis; the footnoting to account for shades of meaning or to point to subsidiary debates; the careful setting of the stage for the presentation of the thesis; and so on. Such is the overhead these place on any piece of academic writing that a common complaint made by readers–even if not always verbally articulated–is that a little too much fluff obscured the author’s central points. (I’m not discounting the importance of the references the author-researcher provides to future scholars in the field; my point is merely that these accouterments are, at one level, entirely peripheral to the points made by the writer and are only present because of the location of the writing within a particular social structure of inquiry. And as my nod to referees above indicates, within the context of a system of peer review these considerations can quickly fade into insignificance.)

These ‘forms of vulgarity’ may also force writers into forms of expression at which they are not competent. Not everyone can explain or refute or prove as well as they can state a bold or original thesis; to indulge in these can weaken the work presented, which is both the writer’s and the reader’s loss. The provision of a claim or thesis in splendid isolation may be as productive of thought–if not more so–than the provision of an elaborate package of arguments, objections, counter-objections, refutations, and so on; if we are to suggest further avenues of inquiry or cut to the heart of the matter, a concise, powerful, laconic statement may work best. Perhaps the reader can construct objections and see if the thesis presented stands up to them. Perhaps the writer and the reader–together–may then bring the writer’s work to fruition. Which is how it normally goes, in any case.

Note #1: I would appreciate a reference so I can reassure myself I’ve not made up this line.

The Acknowledgments Section As Venue For Disgruntlement

In The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, 1985) David P. Jordan writes in the ‘Acknowledgments’ section:

With the exception of the Humanities Institute of the University of Illinois at Chicago, whose fellowship gave me the leisure to rethink and rewrite, no fund or foundation, agency or institution, whether public or private, local or national, thought a book on Robespierre worthy of support. [pp xi-xii; citation added]

Shortly after I had defended my doctoral dissertation, I got down to the pleasant–even if at times irritatingly bureaucratic–process of depositing a copy with the CUNY Graduate Center’s Mina Rees Library. The official deposited copy of the dissertation required the usual accouterments: a title page, a page for the signatures of the dissertation committee, an abstract page, an optional page for a dedication, and lastly, the acknowledgments. The first four of these were easily composed–I dedicated my dissertation to my parents–but the fifth, the acknowledgments, took a little work.

In part, this was because I did not want to be ungracious and fail to make note of those who had tendered me considerable assistance in my long and tortuous journey through the dissertation. I thanked the usual suspects–my dissertation adviser, various members of the faculty, many friends, and of course, family. I restricted myself to a page–I continue to think multi-page acknowledgments are a tad self-indulgent–and did not try too hard to be witty or too effusive in the thanks I expressed.

And then, I thought of sneaking in a snarky line that went as follows:

Many thanks to the City University of New York which taught me how to make do with very little.

I was still disgruntled by the lack of adequate financial support through my graduate studies: fellowships and assistantships had been hard to come by; occasional tuition remissions had somewhat sweetened the deal, but I had often had to pay full resident tuition for a semester; and like many other CUNY graduate students, I had found myself teaching too many classes as an underpaid adjunct over the years. I was disgruntled too, by the poor infrastructure that my cohort contended with: inadequate library and computing resources were foremost among these. (During the last two years of my dissertation, I taught at NYU’s School of Continuing and Professional Studies and so had access to the Bobst Library and NYU’s computing facilities; these made my life much easier.)

In the end, I decided against it; my dissertation was over and done with, and I wanted to move on. A parting shot like the one above would have made it seem like I still harbored resentments, unresolved business of a kind. More to the point, the Graduate Center, by generously allowing me to enroll as a non-matriculated student eight years previously, had taken a chance on me, and kickstarted my academic career. For that, I was still grateful.

I deleted the line, and deposited the dissertation.

Note #1: An academic colleague who finished his dissertation around the time I did dedicated his dissertation to his three-year-old son as follows:

Dedicated to ‘T’ without whom this dissertation would have been finished much earlier.

Fair enough.

A Most Irritating Affectation

The most irritating affectation of the modern intellectual is to pretend to be technically incompetent. I exaggerate, of course, but I hope you catch my drift. Especially if you’ve encountered the specimen of humanity that I have in mind. (Mostly on social media, but often in person too.)

The type is easily identified: an intellectually accomplished individual–accomplished perhaps by dint of academic pedigree, perhaps by a body of public work, or just plain old clearly visible ‘smarts’–who claims that they are incompetent in modern technology, that they simply cannot master it, that they cannot wrap their puny minds around the tools that so many of their friends and colleagues seem to have so effortlessly mastered. (‘Oh, I have no idea how to print double-sided’; ‘Oh, I have no idea what you mean by hypertext.’) They are just a little too busy, you see, with their reading–good ol’ dead-tree books, no Kindles or Nooks here!–and their writing–well, not on typewriters sadly, but on word processors, for some change really cannot be resisted. Rest assured, though, that they have to call for help every time they need to change the margins or fonts or underline some text.

This absorption in old-fashioned methodologies and materials of learning thus marks them as gloriously archaic holdovers from an era which we all know to have been characterized by a greater intellectual rectitude than ours. While the rest of us are slaves to fashion, scurrying around after technology, desperately trying to keep up with the technical Joneses, our hero is occupied with the life of the mind. So noble; such a pristine life, marked by utter devotion to the intellect and free of grubby mucking around with mere craft.

Why do I find this claim of incompetence to be an irritating affectation? My suspicion is easily provoked because I find posturing in all too many places–as I did above, in expressions of faux modesty, sometimes called humblebrags in the modern vernacular–but here, I think, is the rub. Those who profess such incompetence merely outsource to us the work of learning the tools we all learn in order to do our work. They are unwilling to put in the time to learn; they are too busy with their important work; we, it seems, are not, for we have, after all, shown that we have time to spare to learn. We should help–it is now our duty to aid their intellectual adventures.

A claim to incompetence should not be occasion for cheer, but it is. We are, after all, ambivalent about the technology that so dominates, regulates, and permeates our lives; we are, all too often, willing to cheer on evidence that not all is well in this picture of utter and complete absorption in technique. We applaud this disdain; we wish we were so serene, so securely devoted to our pursuit of knowledge. We are also, of course, clapping wildly for a rebellion of sorts, a push-back against the creeping march of technology into every corner of our lives.

I think we can find better heroes.

Note: I’m willing to make some concessions for those over the age of fifty, but anyone younger than that bragging about their technical klutziness needs a rhetorical kneecapping.

Meritocracies, Rankings, Curricula: A Personal Take On Academic Philosophy

Some six years ago, shortly after I had been appointed to its faculty, the philosophy department at the CUNY Graduate Center began revising its long-standing curriculum; part of its expressed motivation for doing so was to bring the curriculum into line with those of “leading” and “top-ranked” programs. As part of this process, it invited feedback from its faculty members. As a graduate of the Graduate Center’s Ph.D program, I thought I was well placed to offer some hopefully useful feedback on its curriculum, and so I wrote to the faculty mailing list, doing just that. Some of the issues raised in my email are, I think, still relevant to academic philosophy. Not everybody agreed with its contents–some of my cohort didn’t–but in any case, perhaps this might provoke some discussion.

Here, reproduced almost verbatim, is that email:

Perhaps I can throw my tuppence in the pile, by offering some remarks based on my experiences as a graduate of this Ph.D program, and by commenting on whether changing the curriculum to bring it into line with “leading” or “top-ranked” programs is a worthwhile motivation or not.

Firstly, I question the need to bring our curriculum into line with that of “leading” programs. I remain unconvinced that rankings of philosophy programs are a serious indicator of the education they provide. In the bad old days, rankings of philosophy programs seemed to be a set of indices that reflected the star power of the faculty. When NYU’s Ph.D program went “live”, its ranking magically jumped to 2 or 3, without that department having produced a single Ph.D, or having given any indicator whatsoever that their graduates were “better philosophers” than the ones produced by ours.

While the Leiters of this world have made their Reports more comprehensive, it is still not clear to me that the rankings are saying anything worthwhile about how well they *prepare* their students. If we had some solid data for saying that a particular curriculum is a significant causal factor in the philosophical acumen of its graduates, then I’m all for major change. Without that I’m a little reluctant to tinker so extensively.

A significant set of reasons why graduates of XYZ University (please replace with your favorite top-ranked department) are able to get good jobs is because they have had:

a) better financial support and are able to concentrate more on coursework and writing projects;

b) more institutional support for research activities like visiting conferences and building up a solid professional network;

c) more ‘star faculty’ at their disposal who are then able to tap into their rich network of professional contacts, write the important letters, make the important phone calls after the APA and ensure things like invited chapters in edited collections and the like.

The academy, like most other institutions in this world of ours, follows the Matthew Principle: those that have, get more.

I attended classes at NYU and Columbia, and interacted with graduate students from many of the programs in this region. My cohort was second to none in their philosophical chops. I never thought, “If only our curriculum was structured differently, then we’d be the ones with eight interviews at the APA’s Eastern Division Meeting.”

What we lacked most, perhaps, was some sense of professionalization in our discipline. We spent most of our time wondering how we would graduate given our financial situation, how we would clean up those incompletes that had accumulated, and so on. Many of us were not bold enough to send papers to professional conferences or journals. We started to think about publications a little late in the game. This is what needs to change the most, in my opinion.

I have a feeling some of this already has. I see more students from this program publishing in professional journals and conferences, learning the vagaries of the reviewing process, and most fundamentally, starting to see themselves as professors in training. May this process continue.

We can most help our graduates by making sure they produce scholarly output by the time they graduate: a publication in a top-ranked journal or two, possibly as a result of a semester-long mentored study with a faculty member. Done right, this could be of considerable value to the faculty member as well. It seems this idea (or some variant thereof) is on the table, and I’m all for it.

My experience with the Grad Center‘s curriculum was largely positive. I enjoyed the Core courses and the broad grounding they provided in the central issues of the discipline. If I had a complaint–and this was echoed by many of my cohort–it was that the classes were often quite ahistorical. Some or most of the reading lists/syllabi were almost exclusively 20th century in content. I would be in favor of standardizing core reading lists so as to make them more comprehensive and rigorous, but I’m not overly optimistic that any sort of consensus would be reached.

My exam experiences were mixed. I enjoyed studying for the written and oral exams because, again, I felt I gained a synoptic perspective on the discipline. Of the exams, the oral exam was the most useful. I felt one of the written exams had become a little silly because its questions had become predictable. And the other exam was so out in left field that I felt blindsided by the lack of a definitive reading list. But this problem has been taken care of–I believe–thanks to structured reading lists. I’m not against getting rid of the comprehensives because the education they aim to impart can be provided by other requirements.

I did my 60 credits for coursework as follows: six cores (Metaphysics, Epistemology, Philosophy of Language, Ethics, Logic, Social and Political Philosophy); one independent study in Mathematical Methods for Physicists at NYU; one class on Space and Time at Columbia; one class on Film and the City at the GC; and eleven other classes from our Departmental offerings. I felt my education was well-rounded, and that I had numerous opportunities to specialize in many different fields. At no stage in my Ph.D or during the job hunt, did I feel the curriculum had been a problem.

I wished more professors had urged me to convert my term papers into conference presentations, or to take the germ of an idea in there and explore it further, possibly for a conference presentation or a journal article. That’s what I felt was missing.

As always, I would be very interested in comments.

‘Don’t Call Me A Philosopher’

I cringe, I wince, when I hear someone refer to me as a ‘philosopher.’ I never use that description for myself. Instead, I prefer locutions like “I teach philosophy at the City University of New York” or “I am a professor of philosophy.” This is especially the case if someone asks me, “Are you a philosopher?” In that case, my reply begins, “Well, I am a professor of philosophy…”. Once, one of my undergraduate students asked me, “Professor, what made you become a philosopher?” And I replied, “Well, I don’t know if I would go so far as to call myself a philosopher, though I did get a Ph.D in it, and…”. You get the picture.

I’m not sure why this is the case. I think folks who have Ph.Ds in mathematics or physics or economics, and who teach those subjects and produce academic work in those domains, have no hesitation in calling themselves mathematicians or physicists or economists.

Part of the problem, of course, is that in our day and age, in our culture, ‘philosopher’ has come to stand for some kind of willful pedant, a non-productive member of society, content to not contribute to the Gross Domestic Product but to merely stand on the sidelines and take potshots at those who actually produce value. The hair-splitter, the boringly didactic drone. (Once, shortly after a friend and I had finished watching Once Were Warriors, we began a discussion of its merits. As I began pointing out that the director’s explicit depiction of violence toward women might have been necessary to drive home a broader point about the degradation of Maori culture, my friend interrupted, “There you go, being philosophical again! Can’t you just keep things simple?”).

But this modern disdain for the ‘philosopher’, this assessment of her uselessness, her unemployability, is not the only reason I shrink from being termed one. There is another pole of opinion I tend toward: ‘philosopher’ sounds a little too exalted, a little too lofty; it sounds insufferably pompous. It comes packaged with too many pretensions, too many claims to intellectual rectitude and hygiene. Far too often, that title has served as cover for too many sorts of intellectual prejudice. To describe myself thus, or to allow someone else to do so, would be to permit a placement on a pedestal of sorts, one I’m not comfortable occupying. (This situation has not been helped by the fact that when someone has described me thus in company, others have giggled and said, “Oh, you’re a philosopher now?”–as if I had rather grandiosely allowed such a title to be assigned to me.)

This discomfort arises in part from my self-assessment of intellectual worth, of course. I do not think I am sufficiently well-read in the philosophical literature; there are huge, gaping gaps in my education. I remain blithely unaware of the contours of many philosophical debates and traditions; the number of classics that I keep reminding myself I have to stop merely quoting and citing, and actually read, just keeps on growing. I do not write clearly or prolifically enough. And so on. (Some of these feelings should be familiar to many of my colleagues in the academic world.)

For the time being, I’m happy enough to make do with the opportunity that I’ve been offered to be able to read, write, and teach philosophy. The titles can wait.

An Old Flame (No, Not That Kind)

Writing about the adversarial disputation styles present in academic philosophy reminded me of the time I lost my temper at someone who worked in the same department as me. (I avoid the term ‘colleague’ advisedly. This dude was anything but.) Then, I was in the computer science department at Brooklyn College, and had long been the subject of a series of personal attacks by a senior professor in the department. He made insulting remarks at department meetings about my research and my work on the curriculum committee, attacked me during my promotion interview, and of course, made many, many snide, offensive remarks over the departmental mailing list. (I was not alone in being his target; many other members of my department had been attacked by him as well.)

Finally, after he had made yet another crude comment on the mailing list about my work, matters came to a head. I lost my temper and wrote back:

Ok, it’s been mildly diverting for a while. But I’ve tired of dealing with your sub-literate philistine self.

First, I don’t care what your middle name is. I made one up; you want me to be careful in how I address you? When all I am subjected to is more of the stinking piles of meshugna hodgepodge that is periodically deposited in my inbox?

Secondly, you bore me. You are excessively pompous, and your actions and pronouncements reek of a disturbing misanthropy. You are a legend in your own mind, and nowhere else. You pontificate excessively, lack basic reading skills and are constitutionally incapable of constructing an argument. You suffer under the delusion that your laughable savant-like talents actually have something to do with intelligence. You strut around, convinced that you make sense, while what you really should do is pay less attention to those voices in your head.

Thirdly, while I could take some time to construct a rebuttal of your useless ramblings, I’d rather spend some time insulting you in public. That’s what you like to do, so why don’t I just play along for a bit? But only as long as you don’t bore me excessively. When it gets to that point, I’ll have my SPAM filter mark your emails as SPAM and toss them in the trash where they belong. I like a little light amusement once in a while, and you occasionally provide it. It’s cheap, low-brow entertainment. I think [senior professors] should be good for more than cheap entertainment but you have set your sights very low, so I should humor you for a bit before I go back to work. It’s the least I can do for a ‘colleague’.

I used to flame self-deluded folks like you for fun back in the good ol’ Usenet days; if you want to join in and stick a bulls-eye on your forehead, be my guest. I miss the days of flaming Penn State undergrads who ran to post ramblings like yours five minutes after they had received their first BITNET accounts. But those guys could read at least, so flaming them was fun. With you, I’m not sure. Maybe you should go write a grant, schmooze with a grants program officer, or take a journal editor out for lunch. Or perhaps take a history lesson in computer science. One thing you do need is an education. In manners, first and foremost, but once you are done with that, I’ll send you a list of other subjects you need to catch up on. There’s a whole world out there. Try it sometime.

When you can construct a flame, get back to me, bring an asbestos suit, and I’ll get to work. But please, try to entertain me. If I am to be subjected to foolishness, I want to be entertained as well. You’re a bit like Borat without the satire or irony. Or humor. Or entertainment value. In short (stop me if you’ve heard this before), mostly, you just bore me.

Now, I command you: entertain me. Write an email that makes sense. Otherwise, run along. I’ve got serious research to do.

This might seem like fun. But it wasn’t. It was draining and dispiriting. I had been provoked, and I had fallen for it.

Won’t get fooled again.