Professorship and ‘The Perennial Taker of Courses’

In ‘In Greenwich, There Are Many Gravelled Walks,’ Hortense Calisher writes:

Robert was a perennial taker of courses–one of those non-matriculated students of indefinable age and income, some of whom pursued, with monkish zeal and no apparent regard for time, this or that freakishly peripheral research project of their own conception, and others of whom, like Robert, seemed to derive a Ponce de León sustenance from the young.

I have a special fondness for the non-matriculate; I began my academic career as one, taking two graduate classes in philosophy before I began formal doctoral studies. And before I registered for them, when I informed my mother that I planned to quit my day job eventually to seek a full-time academic career, her immediate, and immensely gratifying, reaction was, ‘That’s great! If you become a professor you take classes for the rest of your life at your university!’ I hadn’t thought that the opportunity to be an endless dilettante, browsing through each semester’s course offerings and picking one, would present itself as the most obvious advantage of a professor’s life, but my mother certainly thought that way.

I haven’t managed to do so. But I did try. After I returned to New York from my post-doctoral work in Sydney, I sat in on Spanish 101. Learn a language, travel, cook–you know, the standard aspirations. I attended quite a few classes, but found it difficult to keep up with homework given my teaching and service duties (and of course, my own academic interests). I didn’t make it to the end of the semester; sometime shortly after the mid-term (in which I got a decent, but not excellent, grade), I dropped out.

A year later, I tried again, having convinced myself that the problem the last time around had been the lack of a formal component to my dabbling. With an eye on a graduate seminar on the Frankfurt School offered through the History department at the CUNY Graduate Center, I registered for it, taking advantage of the tuition exemption for employees of the City University.

This time around, things went marginally better. I did most of the readings, attended all the classes, and even wrote a paper on Horkheimer, which was probably quite amateurish, but which was very helpful in making me more familiar with his writings. But again, I found things not entirely to my liking. I was still busy with teaching and service and writing, and the time needed to travel to Manhattan for the seminar and do the readings seemed onerous. (Perhaps I didn’t enjoy the company of graduate students. Too many of them seemed to instantiate dreaded archetypes of that demographic: the hasn’t-done-the-readings-but-will-still-pontificate-on-it and the can’t-shut-up-and-stay-on-point varietals being the most pernicious. I certainly wasn’t deriving any ‘Ponce de León sustenance’ from them.)

So that was my last attempt to replicate the non-matriculate days. I became ever busier with my own writing and confined my dilettantism to unguided, unstructured dabbling on my own. And I had found other outlets for it: teaching new classes, revising syllabi for classes taught previously, and blogging being the most prominent among them. Besides, once you’re a full professor, it’s all pretty much dabbling in any case.

Walking the City: Random Walks Through Manhattan Streets

In Street Life: Becoming Part of the City, Joseph Mitchell wrote:

What I really like to do is wander aimlessly in the city. I like to walk the streets by day and by night. It is more than a liking, a simple liking–it is an aberration. Every so often, for example, around nine in the morning, I climb out of the subway and head toward the office building in midtown Manhattan in which I work, but on the way a change takes place in me–in effect, I lose my sense of respectability–and when I reach the entrance to the building I walk right past it, as if I had never seen it before. I keep on walking, sometimes only for a couple of hours, but sometimes until deep in the afternoon, and I often wind up a considerable distance away from midtown Manhattan–up in the Bronx Terminal Market maybe, or over on some tumbledown old sugar dock on the Brooklyn riverfront, or out in the weediest part of some weedy old cemetery in Queens. It is never very hard for me to think up some excuse that justifies me in behaving this way…

I lived in Manhattan from 1993 to 2000 and often walked ‘aimlessly in the city’; Manhattan’s layout encouraged such roaming. It felt like a gigantic playground, laid out so as to invite exploration. I moved across the Hudson to 95th Street and West End Avenue in 1993, and soon began walking regularly to and from my classes on 42nd Street (between 5th and 6th Avenues). I wanted to vary my walks, so I devised different methods for changing my routes: sometimes crossing straight over to Broadway and then walking uptown, sometimes heading for Central Park West, sometimes letting the lights regulate my path. The feeling of stumbling onto a never-before-explored city block never grew old; I often thought of checking them off a list but felt too lazy to do so, trusting that time and my randomizing algorithms would eventually exhaust the possibilities. When I moved to the Lower East Side (5th Street between Avenues A and B) in 1997, I continued walking to 42nd Street, and was able to conduct my explorations while heading uptown. As always, I found storefronts, buildings, street characters, food, and sundry other urban features and residents I would not have found had I stuck exclusively to taking the subway.
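A digression for the programmers among my readers: the ‘randomizing algorithm’ I leaned on most heavily, letting the lights regulate my path, can be sketched as a toy random walk on the Manhattan grid. The Python below is a playful reconstruction, not anything I wrote down at the time; treating West End Avenue as roughly the line of 11th Avenue is my own loose stand-in.

```python
import random

def lights_walk(start, end):
    """Toy model of 'letting the lights regulate my path': at each corner,
    whichever goal-ward signal notionally turns green first (modeled here
    as a coin flip) picks the next block. Intersections are (avenue, street)
    pairs; a playful sketch, not a faithful map of Manhattan's grid."""
    ave, st = start
    end_ave, end_st = end
    path = [(ave, st)]
    while (ave, st) != end:
        moves = []
        if ave != end_ave:
            moves.append(("ave", 1 if end_ave > ave else -1))
        if st != end_st:
            moves.append(("st", 1 if end_st > st else -1))
        axis, step = random.choice(moves)  # the 'light' that goes green first
        if axis == "ave":
            ave += step
        else:
            st += step
        path.append((ave, st))
    return path

# Roughly 95th St & West End Ave down to 42nd St & 6th Ave.
route = lights_walk(start=(11, 95), end=(6, 42))
print(len(route) - 1, "blocks walked; rerun for a different route each time")
```

Every run yields a different lattice path of the same total length, which is why the supply of never-before-explored blocks took so long to exhaust.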

Manhattan encouraged expansive walking. I dreamed up extravagant routes and sometimes acted on these plans. On one such jaunt, I walked from 5th Street up to 110th (the northern edge of Central Park), moving from 59th to 110th along the eastern edge of the park; turned west, then walked south along Central Park West back down to 59th; turned east to Lexington Avenue and walked south till 28th, where I stopped for some Indian food before heading back home. I planned, too, to walk the entire length of Broadway; I never pulled it off, but I haven’t given up on that dream yet.

Walking on Manhattan streets reminded me, as always, that the best way to experience a city is from street level; the pace is right, its features pop into focus, you can stop and stare and sample. A city is made up of streets; walking on them is still how one best finds out what makes it tick.

Why The Talking Dead Is a Bad Idea

Last night, I declined to watch the Oscars and chose The Walking Dead instead. If you’re going to watch zombies, why not watch a more interesting group of them? Snark aside, I had not seen most of last year’s crop of nominees, other than the mildly diverting Argo, and more to the point, I’ve burned out on the Motion Picture Academy’s annual orgy of self-congratulation. (Last year’s post on the Oscars described the genesis of this gradual turning away, one which started much, much earlier for the Grammys, and is now firmly in place for most awards of a similar kind.)

So, my choices for the evening settled, I turned to AMC. This represents a novelty of sorts for me. My following of television series has been restricted to watching the commercial-free episodes available on Netflix or bittorrent sites. But my hankering for the Grim Grimesmeister’s hijinks had grown too acute, so there I was, bracing myself to sit through the barrage of commercials that would inevitably accompany the latest installment of Zombie Apocalypse Bulletins. (I had begun this brave adventure last week, with the second episode of season three.)

The commercials were painful, but far more bothersome was AMC’s show The Talking Dead, which followed the new episode: an hour-long discussion of it with in-studio guests, a studio audience, and a ‘surprise cast character.’ I had stopped watching after fifteen minutes the previous week, and this time around, my patience ran out after five.

The problem with The Talking Dead, and with any other show like it, which aims to dissect, discuss, and lay threadbare an ongoing television show and wax ‘analytical’ about it, is that it dispels fantasy all too quickly. The point of watching a show like The Walking Dead (or Breaking Bad, or The Wire) is to enter an alternate reality for a while, to be caught up in its story and characters, to come to believe, if only fleetingly, that the trials and tribulations of those on screen are real. A discussion show blows this imperative out of the water. It reminds us relentlessly that the characters are just actors, often uninteresting people in their non-character personas, that directors, writers, and producers are pulling the strings and are often insufferably pompous, that locales are studio lots. It connects the artfully constructed parallel universe to ours far too quickly; it raises the hood and peeks at the innards a little too closely. The Walking Dead in particular is supposed to be a grim show; it has little humor (both in the comic book and the series); the goofiness of The Talking Dead is especially grating.

I realize that I’m taking the on-the-surface silliness of The Talking Dead too seriously, so let me reiterate that the point being made here is a general one: too much inquiry into an ongoing fantasy is a bad idea. The serious fan should stay away from such shows; suspend disbelief, watch the episode, and when you’re done, keep it that way. Till the next episode.

Op-Eds and the Social Context of Science

A few years ago, I taught the third of four special interdisciplinary seminars that students of the CUNY Honors College are required to complete during the course of their degrees. The CHC3 seminar is titled Science and Technology in New York City, a moniker that is open to, and subject to, broad interpretation by any faculty member who teaches it. In my three terms of teaching it, I used it to introduce my students–many of whom were science majors and planned to go on to graduate work in the sciences–to, among other things, the practice of science and the development and deployment of technology in urban spaces. This treatment almost invariably required me to introduce the notion of a social history of science, among whose claims are that science does not operate independent of its social context, that scientists are social and political actors, that scientific laboratories are social and political spaces, not just repositories for scientific equipment, and that scientific theories, ‘advances’ and ‘truths’ bear the mark of historical contingencies and developments. (One of my favorite discussion-inducing examples was to point to the amazing pace of scientific and technological progress in the years from 1939 to 1945 and ask: What could have brought this about?)

If I were teaching that class this semester, I would have brought in Philip M. Boffey’s Op-Ed (‘The Next Frontier Is Inside Your Brain’, New York Times, February 23) for a classroom discussion activity. I would have pointed out to my students that the practice of science requires funding, sometimes from private sources, sometimes from governmental ones. This funding does not happen without contestation; it requires justification, because funds are limited and there are invariably more requests for funding than can be satisfied, and sometimes because there is skepticism about the scientific worth of the work proposed. So the practice of science has a rhetorical edge to it; its practitioners–and those who believe in the value of their work–must convince, persuade, and argue. They must establish the worth of what they do to the society that plays host to them.

Boffey’s Op-Ed, then, would have served as a classic example of this aspect of the practice of science. It aims to build public support for research projects in neuroscience, because, as Boffey notes at the very outset:

The Obama administration is planning a multiyear research effort to produce an “activity map” that would show in unprecedented detail the workings of the human brain, the most complex organ in the body. It is a breathtaking goal at a time when Washington, hobbled by partisan gridlock and deficit worries, seems unable to launch any major new programs.

This effort — if sufficiently financed — could develop new tools and techniques that would lead to a much deeper understanding of how the brain works. [link in original]

And then Boffey is off and running. For Congressmen need to be convinced; perhaps petitions will have to be signed; perhaps other competitors who also hope to be ‘sufficiently financed’ need to be shown to be less urgent. And what better venue in which to present these arguments than the nation’s media outlets, perhaps its most prominent newspaper?

The scientist as polemicist is one of the many roles a scientist may be called on to play in the course of his work. Sometimes that work may be done, in part, by those who have already been persuaded by him. Boffey’s arguments, his language, his framing of the importance of the forthcoming legislation would, I think, all serve to show my imagined students this very important component of the practice of science.

Walking, Head Down, on a Damp and Grey Day: How Virtuous It Is

On days like this, many residents of the US eastern seaboard are apt to question their decision to ever inhabit these spaces. The temperature is in the thirties (that’s just a couple of degrees above freezing point for all the folks living in Celsius-land); a steady, persistent drizzle is falling; and the most familiar color of all here on the East Coast, grey, has been used to paint, yet again, New York’s urban landscapes. Many of us will stay indoors today, but those who venture out will find that the experience brings its own reward, one which I suspect underwrites the tolerance that long-term East Coasters have for this benighted clime. Which is that walking, head down, through near-freezing temperatures while water drips off your hat, beanie, jacket, or whatever–because, you know, many New Yorkers, like Pacific Northwesterners, disdain umbrellas when rain of this intensity is falling–is prone to provoke an acute sense of your own virtuousness.

Why would that be? For one thing, the mere fact of being outdoors puts you on the side of the Spartans. You have disdained comfort, the domestic hearth, and have ventured forth boldly. Not for you the safety of the familiar, the quotidian. No, suffused with the spirit of the intrepid, you have dared to look into your closet, laced and buttoned up, and sallied out. And once outdoors, the physical particulars of the day are conducive to a very distinctive mode of daydreaming.

As you walk, head bowed, grimly determined to make it through and past the damp and cold, you enter a zone similar to that entered by many who persistently engage with the uncomfortable: the once-impossible barriers your task seemed to have raised start to melt away, leaving you with the pleasing possibility that your abilities have the magical effect of making life more tractable. This is gratifying in the extreme.

But even more importantly, walking in bad weather forces on us a mode of concentration that is increasingly hard to find and persist with in our normal, constantly-interrupted, notified, pinged, paged, and remindered existence: for the span of time that the walk persists, it’s just you and the execrable weather. And when things are that intimate, when using the smartphone might not be, you know, all that smart, why not just retreat a little bit into the ever more unfamiliar space of introspection?

I suspect we often find these ventures into that space pleasurable, that we enjoy our retreats into these rare moments of solitude. Thoughts move a little differently; they are not so easily displaced by external stimuli. Because, let’s face it, on an East Coast day like this, who wants to look about and around, and stop and stare? Better to press on.

And that pressing on is really the clincher, I think. Nothing quite makes you imagine yourself the relentless, courageous explorer like a walk in really, really shitty weather.

And yes, I did go out today.

‘If It’s Dead, Kill It’: The Second Compendium of the Walking Dead

Last year, I discovered The Walking Dead (the television series and the comic book). Like most fans of the television series, I’m all caught up now with the second half of the third season. Given the disappointing nature of the first two episodes of that second half, I’m glad that I have something else to take care of my Walking Dead jonesing: the massive second compendium of the comic book (Compendium Two, Image Comics, 2013), which collects issues 49 through 96. (The series is up to issue 108 by now, so it will be a while before the third compendium is released; in terms of tracking the relationship between the comic book and the television series, the third season is right about where the first Compendium ends.)

I’ve written on this blog before about the relationship between the comic book and the television series, so I will not get into that again. Rather, reading the second Compendium has provided me an opportunity to make some educated guesses about where the show might be going and, even more interestingly, to examine the particular vision the creators of the comic book have of the post-zombie-apocalypse world.

Most prominently, it is clear the most interesting conflicts in the zombie world are not with the dead but with the living. While zombies are deadly, and require vigilance, violence, and nous to keep at bay, the human survivors are more insidious and harder to combat. Allusions to Hobbesian states of nature and methods to alleviate them are never too far from the surface in the comic book, especially in the two Woodbury-like developments encountered in the second compendium. People are prickly, selfish, angry, paranoid, greedy, and all of the rest; it turns out that in a world ruled by zombies those qualities are merely enhanced, not ameliorated. For the most part, this is what gives the comic book (and the television series) its edginess: there is almost always perpetual conflict between those who have survived. As in the first compendium, there is grotesque violence directed at humans, even as we note that acts of violence directed against the dead have now become mild amusements. And this is what makes the zombie world just so bothersome: there is no getting away from plain folks. Hell really is other people. (The second compendium also, finally, starts to allude to what really would be the biggest problem of all: an inconsistent and fast-dwindling food supply.)

There is internal conflict too. Rick Grimes continues to be (literally) haunted by his memories, as do other characters in a variety of ways. And there is a great deal of mourning, painful introspection, and plain second-guessing, for the numbers of the dead continue to pile up, each death generating its own profuse regret and bitterness. Indeed, if you’ve survived, you’re traumatized, and will act out that trauma in one way or another. This makes some episodes in the compendium a little tedious, as reading them approximates listening in on a therapy session. Which should remind us: the busiest service providers in a zombie world would be grief counselors and psychotherapists. The Walking Dead are not just the zombies; they are the living too.

The Mad Men Are Serious Downers

I’m only three episodes deep into Mad Men, and I’m already struck by how grim the show is. There’s misogyny, sexism, racial and ethnic prejudice, sexual prudery (of a kind), depressing suburban life, loveless marriages, loveless affairs, rigid gender roles, corporate language, the vapidity of advertising, and smoking indoors. And alcohol, lots of it. Mainly martinis and scotch, consumed at all hours of the day, in offices and homes, and during kids’ birthday parties. (I’m not sure if I’ve missed out on anything; I’m sure fans will correct me if I have.)

In using ‘grim’ as a description for the show–which I intend to keep watching for the time being just because it is morbidly fascinating–I do not mean to look past the stylish dressing, the carefully designed interiors, the loving caresses of the whisky and martini glasses, the nostalgia for a time when boys could be boys, white folk could be white folk, and women knew just how to be women, that apparently captivate so many of the show’s fans. Rather, I find that adjective appropriate because despite the apparent cheeriness and cleverness of the office banter, the endless drinking and dining in fashionable Manhattan restaurants, and the freedom to drink in one’s office, no one in the show seems to have had the most minuscule ration of any kind of happiness doled out to them. This is one serious downer of a show.

This should not be entirely surprising. Advertising consumer products requires the careful manufacture and sale of a fantasy, one underwritten by a corporate imperative. What Mad Men does quite well, whether deliberately or not, is to depict participation in that fantasy-mongering as an ultimately soulless, dispiriting enterprise. After all, if you’re shoveling it all day and all night, wouldn’t you find your life a serious drag? Once this is realized, the near-constant drinking suddenly becomes much more understandable; who wouldn’t need a few stiff ones to navigate through the lives these folks lead? Pour me a large one, please.

The dispiriting effect of Madison Avenue is not restricted to the office and the boardroom; it spreads out into homes and suburbs too.  As an advertising account executive, if you spend one-third of your life talking in platitudes, and spinning yards and yards of not particularly clever mumbo-jumbo, there is a good chance you’ll bring home that contagious emptiness with you and let it infect everyone and anyone around you. Resuming drinking at home seems like a good way to deal with these domestic blues.

The show’s writing is clever in parts, and the pretty displays of archaic behaviors and attitudes are certainly generative of the morbid fascination I mentioned above. For the time being, I will plough on, hoping that the Mad Folk don’t harsh my mellow too severely in the weeks to come.

Note: I read Daniel Mendelsohn’s memorable review of Mad Men a while ago, long before I had seen a single episode of the show. I intend to reread it once I’m a couple of seasons deep.

O. Henry on the South (Mainly Nashville)

I’ve only read a couple of short stories by O. Henry but have long owned an omnibus collection of them (presented to me on my twenty-eighth birthday). I’ve finally taken a gander at it, and stumbled on his classic ‘A Municipal Report.’ Henry was a Southerner transplanted to the East Coast, so I find the narrator’s voice–a supposed ‘outsider’ speaking of the South–of particular interest. This developing ‘attitude’ towards Nashville (and its people) leads to several memorable, witty descriptions. Here are a few of my favorites.

On Southern weather:

Take a London fog 30 parts; malaria 10 parts; gas leaks 20 parts; dewdrops gathered in a brick yard at sunrise, 25 parts; odor of honeysuckle 15 parts. Mix.

The mixture will give you an approximate conception of a Nashville drizzle. It is not so fragrant as a moth-ball nor as thick as pea-soup; but ’tis enough – ’twill serve.

On Southern hotels, hospitality, and history (race and the Civil War too!):

I went to a hotel in a tumbril. It required strong self-suppression for me to keep from climbing to the top of it and giving an imitation of Sidney Carton. The vehicle was drawn by beasts of a bygone era and driven by something dark and emancipated.

I was sleepy and tired, so when I got to the hotel I hurriedly paid it the fifty cents it demanded (with approximate lagniappe, I assure you). I knew its habits; and I did not want to hear it prate about its old “marster” or anything that happened “befo’ de wah.”

The hotel was one of the kind described as ‘renovated.’ That means $20,000 worth of new marble pillars, tiling, electric lights and brass cuspidors in the lobby, and a new L. & N. time table and a lithograph of Lookout Mountain in each one of the great rooms above. The management was without reproach, the attention full of exquisite Southern courtesy, the service as slow as the progress of a snail and as good-humored as Rip Van Winkle.

Tobacco chewing:

All my life I have heard of, admired, and witnessed the fine marksmanship of the South in its peaceful conflicts in the tobacco-chewing regions. But in my hotel a surprise awaited me. There were twelve bright, new, imposing, capacious brass cuspidors in the great lobby, tall enough to be called urns and so wide-mouthed that the crack pitcher of a lady baseball team should have been able to throw a ball into one of them at five paces distant. But, although a terrible battle had raged and was still raging, the enemy had not suffered. Bright, new, imposing, capacious, untouched, they stood. But, shades of Jefferson Brick! the tile floor – the beautiful tile floor! I could not avoid thinking of the battle of Nashville, and trying to draw, as is my foolish habit, some deductions about hereditary marksmanship. [links added]

The Southern gentleman, Major Wentworth Caswell:

I happened to be standing within five feet of a cuspidor when Major Caswell opened fire upon it. I had been observant enough to perceive that the attacking force was using Gatlings instead of squirrel rifles; so I side-stepped so promptly that the major seized the opportunity to apologize to a noncombatant. He had the blabbing lip. In four minutes he had become my friend and had dragged me to the bar.

I desire to interpolate here that I am a Southerner. But I am not one by profession or trade. I eschew the string tie, the slouch hat, the Prince Albert, the number of bales of cotton destroyed by Sherman, and plug chewing. When the orchestra plays Dixie I do not cheer….Major Caswell banged the bar with his fist, and the first gun at Fort Sumter re-echoed. When he fired the last one at Appomattox I began to hope.

The Nightmare of the Lost Semester

It has just come to my notice that the New York Review of Books has been running a series on dreams. Thus far, entries include Georges Perec’s ‘Fifty Kilos of Quality Meat,’ Charles Simic’s ‘Dreams I’ve Had (and Some I Haven’t),’ Michael Chabon’s ‘Why I Hate Dreams,’ and Nicholson Baker’s ‘On the Stovetop of Sleep.’ Inspired by this, and remembering my recounting of an anxious dream related to copy-editing in the face of a publisher’s deadline, I thought I’d put down a brief note about my dreams.

Like most people who dream, I have my repeats, variations on whose themes recur in my sleeping hours. These in turn are made up of some familiar types, anxiety-laden nightmares about heights and wild rivers that threaten to drown me being especially prominent ones. Some of these have their own escape hatches built into them. For instance, when stuck on a threatening height, a surefire tactic for ending the dream is to, wait for it, jump. But some dreams are more stubborn than others; they offer no easy way out.

A classic entry in this list is the Dream of the Lost Semester. In this dream, I find myself late in a semester, staring at my teaching schedule, horror-struck by the realization that I have failed to attend even a single meeting of a class assigned to me. I have not given my students their syllabus; no readings have been assigned. As I realize this, I panic. If I start attending classes now, my students will heap scorn on me: where have I been all this while? They will jeer me, mock me, as I walk in. I would have to stand there, the target of their derision, the man who had kept them waiting in such utter futility for so many weeks now. The shame of such a public humiliation would be too much to bear. In the construction of the dream, no complaints have been tendered to my department, students have not dropped the class, or anything like that. Instead, somehow, I believe that attendance has taken place as usual, the students patiently waiting for their Godot-like professor to show up, sometime. The absurdity of this, somehow, bubbles through, and slowly I convince myself the college has found a substitute for me, even as I continue to teach my other classes. The relief from this ‘realization’ does not last; wouldn’t someone have contacted me about such a replacement by now? Perhaps my class is just as orphaned as I imagine it to be. So, again, I consider starting up the semester, even if a few weeks late. This courage lasts only a few seconds; I return to seeking refuge in the hope that the university has found out about the abandoned class and arranged for a substitute. And so it goes.

The Dream of the Lost Semester finds its roots, quite obviously I think, in the anxiety that precedes the start of every semester, as I finalize reading lists and syllabi, order books, check bookstore inventories and so on. And no matter how long I teach, I still suffer from stage-fright, those little jitters that afflict me just before I step into a new classroom for the first time each semester. Add those two up and you get this creepy little insidious entry into my subconscious, one that bubbles up every now and then to remind me of the centrality of teaching to my life.

Ethnocentricity, Moral Beliefs and Moral Truth

Adam Etinson writes in The Stone on ethnocentrism (defined as ‘our culture’s tendency to twist our judgment in favor of homegrown beliefs and practices and against foreign alternatives’), on skepticism about universal morality and the existence of moral facts as a response to it, and finally, on whether such skepticism is warranted. To wit, concern about ethnocentrism in the domain of morality finds its grounding in universally acknowledged data: that moral disagreements are extensive and intractable (and disagreeable), and that ‘culture and upbringing’ play a significant role in such clashes. Is moral relativism, or skepticism about the existence of objective moral facts, an appropriate response?

Etinson thinks not:

For one, however obvious it may be that culture plays an important role in our moral education, it is nevertheless very hard to prove that our moral beliefs are entirely determined by our culture, or to rule out the possibility that cultures themselves take some direction from objective moral facts….Second, moral relativism, for its part, seems like an odd and unwarranted response to ethnocentrism. For it’s not at all clear why the influence of culture on our moral beliefs should be taken as evidence that cultures influence the moral truth itself — so that, for instance, child sacrifice would be morally permissible in any community with enough members that believe it to be so. Not only does that conclusion seem unmotivated by the phenomenon under discussion, it would also paradoxically convert ethnocentrism into a kind of virtue (since assimilating the views of one’s culture would be a way of tapping into the moral truth), which is at odds with the generally pejorative understanding of the term.

These are curious responses to make.

The first is made in the face of the acknowledged data (about disagreement over moral beliefs and the existence of cultural variance in moral practices). If ‘cultures themselves take some direction from objective moral facts,’ then surely there should be greater agreement over our moral beliefs? Perhaps Etinson takes our existing moral agreements to be evidence of such influence, no matter how attenuated?

The second response, contra moral relativism, assumes that there is a ‘moral truth’ out there, one influenced by cultures. But the skepticism about moral facts that goes by the name of ‘moral relativism’ is not committed to any such truth; it takes all its cues from its claim that the empirical particulars of cultures generate moral beliefs, which vary by time and place. That kind of relativism does not think that a ‘moral truth’ is the product of a culture’s influences; rather, the culture merely generates a set of permissible actions. There is no commitment here to the notion of a moral truth that would be made accessible by ‘assimilating the views of one’s culture’; rather, one brings oneself into line with one’s culture and what it deems permissible by assimilating its views. (Note that Etinson himself, in writing of ‘moral truth’ in connection with moral relativism, adds the caveat ‘for any given people.’) This would ensure that ethnocentrism retains its non-virtuous standing, a concern important to Etinson, for presumably it leaves open the possibility that these sets of permissible actions could remain the subject of moral critique. But having made this concession, a further question is almost immediately prompted: isn’t the assumption of objective moral truths and facts our primary, if not sole, reason for imagining ethnocentrism to be non-virtuous in the domain of morality? If so, is Etinson’s skepticism about moral relativism warranted?