Unsung Heroines and Premature Glory

New Scientist is currently featuring a story titled "Unsung Heroines: Five Women Denied Scientific Glory." The women scientists featured are: Hertha Ayrton, Jocelyn Bell Burnell, Gerty Cori (an odd choice, given that she won the Nobel Prize in Physiology or Medicine in 1947), Rosalind Franklin, and Lise Meitner.

For my money, of the stories told here, those of Burnell, Franklin, and Meitner are especially poignant. The little bio provided for Burnell includes a pair of interesting remarks made by her:

In 1967, as a postdoctoral physicist at the University of Cambridge, she discovered the first pulsar using a radio telescope she had built with her supervisor Antony Hewish, astronomer Martin Ryle and others….Bell Burnell was the second author named on the paper that announced the discovery (Nature, DOI: 10.1038/217709a0), but it was Hewish and Ryle who received a Nobel prize for it in 1974. She has made light of this, saying that “students don’t win Nobel prizes” and “an award to me would have debased the prize”.

I take Burnell to be making the point that Nobel Prizes–descriptively or normatively–recognize not isolated achievements, but a sustained record of scientific excellence. Of course, Burnell was not 'only' a "student" – she was a post-doctoral fellow, and thus already a practicing academic. Her concern about the prize being "debased" seems misplaced in two respects: 1) she had not made an accidental, fluky discovery; and 2) the Nobel Prize is awarded both for lifetime achievement and for singular inventions or discoveries. I suspect that besides her suggestion that the Nobel should recognize only a career's worth of scientific work, Burnell had also internalized some cultural prejudices about excessively early recognition serving as a disincentive for future effort. It is, if I may say so, an old-fashioned attitude.

Burnell’s remarks remind me of an incident in my own career. Shortly after I finished my doctorate and began work as a post-doctoral fellow, I was asked by a colleague, then working on a highly technical book on computational learning theory, whether I’d be interested in co-authoring a chapter that would explain the philosophical significance of the new formalisms being developed.  I would not be a co-author of the book, but would be listed as a co-author for that chapter alone. I agreed; my friend’s work was fascinating, and I looked forward to fleshing out its conceptual foundations. And the co-authorship line on the CV wouldn’t hurt one bit.

There was one small problem, though: my colleague was working with two other logicians on his book, and they needed to approve my writing that chapter. One of them, a senior academic, refused. His stated reason was straightforward: I would be spoiled by such 'early success'; I should not expect co-authored chapters in books to come my way so easily; I needed to build a 'track record' before I could earn such distinction.

As a reminder: I was a Ph.D., not a fledgling graduate student; I was not going to be made co-author of the book, but only of one chapter.

I’ve had many head-shaking moments in my academic career; this was one of them.

The Mad Men Can’t Quite Get Hold Of Me

A year or so ago, I wrote my first brief response to AMC's Mad Men. Three episodes in, I described it as 'grim' and a 'serious downer'. Now, five seasons in, I'm still inclined to that description. (The fact that it has taken me this long to come close to exhausting Netflix's online repository of its episodes should indicate I haven't indulged in any kind of binge viewing and have been happy enough to suspend watching the show for a variety of reasons–like watching other television series and movies.)

I do not mean to be reductive in my take on Mad Men. I find its writing enjoyable and, like many other viewers, find Roger Sterling's lines particularly memorable (indeed, I often find myself wishing he were given more screen time); I appreciate its careful attention to its 'look and feel' – its sumptuous interiors and clothes most notably; I am cognizant that the show attempts to highlight the misogyny, gender discrimination and racism of days gone by. This is a very slick and smart show in many ways.

But for all that, it simply isn't compelling enough. I do not know if there is a story in there somewhere or whether I am merely bearing witness to an episodic dysfunction of family, society and business. Perhaps I have made matters worse by watching it in the distracted fashion I have employed, but this consideration runs into a rather insuperable chicken-and-egg question: Was I distracted because Mad Men didn't grab me, or was I not grabbed because I was distracted?

Perhaps it’s because I find Don Draper utterly vapid and uninteresting. I do not know if Draper is supposed to cut a tragic figure or whether my reaction is the appropriate one to have to a man of Madison Avenue. Perhaps the writers of the show have succeeded in making me realize the shallowness of the advertising executive.

Perhaps the show’s attempts to serve as a chronicle of the times don’t always work; I’m not sure why, but its references to, and attempts to integrate, ‘the world outside’ –as in its incorporation of the JFK assassination, the civil rights struggle, the death of Marilyn Monroe–sometimes feel forced.

But in the end, I think the reason I don't find Mad Men as compelling as many others do remains the same as I articulated in my original post: I find advertising–its business and its supposed creativity–not very interesting at all. (It doesn't help that I consider mass advertising to have had a ruinous effect on political discourse in the US.) I am not intrigued by the processes that bring ad copy and art to life; I do not imagine those who work in advertising's creative departments to be inspirational geniuses (though I am intrigued to hear so many of the show's fans say they find Draper's pitches 'clever'); I find talk of 'account servicing' tedious. These prejudices, I suspect, get in the way of my being able to enjoy the show fully.

Still, the show exerts a peculiar fascination on me; I intend to watch it in its entirety and will write on it again. This post, and my first one, have been rather superficial takes; perhaps my summation will be more synoptic and thoughtful.

Jehane Noujaim’s ‘The Square’: Enthralling and Frustrating

Jehane Noujaim's The Square is an enthralling and frustrating documentary record of the 2011 Egyptian Revolution. It tells its story by holding a steady narrative focus on a small cast of central characters and tracking the revolution's rise and fall–so to speak–from the glory of Hosni Mubarak's resignation to its co-optation by a variety of counterrevolutionary forces (the military, the Muslim Brotherhood). As The Square ends after Mohamed Morsi's downfall in July 2013, we see Tahrir Square–the emotional epicenter of the revolution–once again serving as a focal point for the possible mustering of those forces that had brought about the first showdown with Mubarak's regime. The revolution, it is clear, is unfinished business.

The Square is emotionally affecting. We witness the fervor and sentiment and passion of the hundreds of thousands who gathered in Tahrir Square to bring down first Mubarak, and then Mohamed Morsi; we are horrified and appalled at the violence visited on them by the police, the army and a motley crew of hired thugs; we listen in on articulate and angry debate among those who come together in the revolution even as they are riven by ideology.

The Square is also a curiously decontextualized record of what has been happening in Egypt over the last three years. The close attention paid to Khalid Abdalla, Ramy Essam, Magdy, Ahmed Saleh and The Square's other central characters prevents both a panning out and a further zooming in. We are told absolutely nothing about the mechanics of the revolution: the grass-roots organizing that enabled a powerful totalitarian regime to be toppled surely deserved a closer look.

Many questions thus remain unanswered. We do not know why Mubarak was brought down; we fail to understand the significance of the Muslim Brotherhood in Egyptian politics. Why is Magdy, a member of the Muslim Brotherhood who is so clearly conflicted about its role in the revolution, so worried about his fate if the 'liberals' were to come to power? And why is he the only voice of such a significant counterrevolutionary force? For that matter, who are the Muslim Brotherhood? Have they ever been in power before? How was a gathering in one urban zone able to bring down a despot? Surely large gatherings alone do not have such revolutionary effects? Did any revolutionary activity occur outside Cairo? Egyptian history and politics are complicated; their casts of characters are large and driven by a variety of ideological and intellectual motivations; these all deserved just a little more attention.

Perhaps my complaints are misplaced; Noujaim's worst sin might only have been to presume her viewers had done their homework. But can those who make movies about the Middle East afford to be so complacent in their presumptions? Especially when the relationship between religion and state is as complicated as it is in the Islamic world? In these times, every documentary about that region of the world is in the enviable position of being able to seize upon the many teachable moments its current events provide.

Noujaim shouldn’t have hesitated to edify us.

The Campus Bell, From School to College

When I began school, I passed into another zone of discipline. The most prominent marker of that regime of control was not the corporeal form of the stern schoolmaster but rather, the sounds of a bell ringing. It was the aural code that dominated the next twelve years of my life, carving the school day into distinct slices of time, neatly parceling out portions for study and sometimes even play. The bell would ring, and we would file obediently into class; it would ring again, and the study of another subject would begin; and so on, till the final, merciful heralding of the end of the school day rang out, ushering us toward the school buses that would take us back home.

But the boarding school bell regulated even more of my day. It rang first at 5:30 AM, rudely hauling me out of bed and sending me scampering to get changed and ready for morning tea; from there on, it signaled again and again for physical drill, breakfast, chapel service, the start of classes, then the class periods, a short recess, classes again, lunch, sports periods, evening tea, evening prep, and then finally, lights out.

The boarding school bell rang the longest for chapel service; I never timed it, but I'm sure it was rung for at least a few minutes, its insistent notes traveling over the campus, summoning schoolboys no matter where they were, sending them to kneel, sing hymns and pray. One morning, I was given the task of ringing the bell for chapel service; I took to the task with some gusto, pulling the rope hard and striving to add some dramatic flair to that weekday sound.

After school, the era of the bell seemed to come to an end. There were no bells in university, no bells in graduate school. Classes ended because professors and students checked watches and clocks; we were supposedly self-regulated and disciplined. But the bell made a return when I began teaching at Brooklyn College, whose campus features a landmark clock tower.

Now, again, on my teaching days, I hear, as I have for the past twelve years, the sounds of a bell marking time, ushering students and professors alike into classrooms. I am supposedly the one doing the disciplining when it comes to the students, but the sounds of the clock tower discipline me too, regulating my teaching hours and urging me not to be late.

The sounds of the clock tower do not just regulate; they can calm as well. My campus is an urban one, but it strives to construct an air akin to that of campuses situated in considerably more bucolic surroundings. The clock tower's sounds, for whatever reason, aid it in this endeavor; they sometimes lend the campus a pastoral air it would not otherwise possess. Perhaps unsurprisingly, I find the ringing of the bell in the evenings the most calming of all: the day is done, and my walk home to my family can begin.

The sounds of a bell ringing are now, finally, considerably more benign than I had ever experienced them to be.

Acts of Kindness: Writing to Writers, Especially Academic Ones

A couple of years ago, after reading Neil Gross's excellent biography of Richard Rorty, I sent him a short note of appreciation, telling him how much I enjoyed his book. Gross wrote back; he was clearly pleasantly surprised to have received my email.

I mention this correspondence because it is an instance of an act that I ought to indulge in far more often but almost never do: writing to let an author–especially an academic one!–know you enjoyed his or her work.

Most academic writing is read by only a few readers: some co-workers in a related field of research, some diligent graduate students, perhaps the odd deluded, excessively indulgent family member. (I am not counting those unfortunate spouses, like mine, who have been pressed into extensive editorial service for unfinished work. These worthies deserve our unstinting praise and are rightly, and generously, acknowledged in our works.) Many, many academic trees fall in the forest with no one to hear them.

This state of affairs holds for many other kinds of writers, of course. Online, even if we know someone is reading our writing, we might not know whether they thought it was any good; we might note the number of hits on our blogs but remain unaware of whether our words resonated with any of our readers. The unfortunate converse is true: comment spaces tell us, loudly and rudely, just how poor our arguments are, how pointless our analysis, how ineffective our polemicizing. There is no shortage of critique, not at all.

It is a commonplace point to direct at academic writers that their work needs to be made relevant and accessible. Fair enough. I think, though, that our tribe would greatly benefit from some positive reader feedback when these standards–besides the usual scholarly ones–are met. Academics often write to one another, indicating their interest in a common field of study, the value of their correspondent's writing, and sometimes asking for copies of papers. To these existing epistolary relationships I suggest we add the merely appreciative note: I enjoyed your writing, and here is why.

These notes are not mere acts of kindness, a dispensing of charity as it were. They encourage and sustain a useful species of human activity. They create an atmosphere, I think, conducive to scholarship and to further striving toward excellence. They make a writer want more of the same.

I know we’re all busy, but the next time you read something you like, see if you can send the writer a little thank-you note. You don’t have to do it all the time, but sometimes wouldn’t hurt.

Go ahead: reach out and touch someone.

Note: I was prompted to write this post by receiving an email from a doctoral student at Cambridge who had just read my A Legal Theory of Autonomous Artificial Agents and found it useful in his work on legal personality.  The almost absurd pleasure I received on reading his email was a wistful reminder of just how much we crave this sort of contact.

Margaret Cavendish, Epicureanism, and Philosophy as Confession

In her erudite and enjoyable Epicureanism at the Origins of Modernity, Catherine Wilson makes note of Margaret Cavendish's participation in the so-called "Cavendish Salon" in Paris, which served as "the center of a revival of Epicureanism led by Hobbes and Gassendi." Cavendish, who might have obtained her knowledge of that school of thought either through her own translations of the originals or from Hobbes, went on to write Philosophicall Fancies, which would serve as one of the "earliest print references to the reviving doctrine."

Interestingly, Wilson suggests Cavendish’s philosophical inclinations were grounded in her biography:

Echoing Lucretius's unforgettable opening passage on the murder of Iphigenia by Agamemnon, Cavendish went on to say in The World's Olio of 1655 that it was better to be an atheist than superstitious; atheism fostered humanity and civility, whereas superstition only bred cruelty. Unlike More and Descartes, Cavendish recognized no spirits or incorporeal substances in her metaphysical system. Consciousness depended in her view on a material substrate: Nature makes a brain out of matter so that there can be perception and appreciation of the material world.

Cavendish’s religious skepticism and her initial attraction to the atomic philosophy reflected the somewhat rebellious and resentful attitudes of one excluded from participation in the learned world and essentially powerless. Accustomed to being ruled and ordered about by fathers, husbands, and even sons, early modern women might have been drawn to a philosophy in which nature was depicted as accomplishing everything by herself [note Wilson’s use of the feminine pronoun here] without taking direction from an autocratic and psychologically impenetrable divinity. Lucretius insisted that ‘nature is her own mistress and is exempt from the oppression of arrogant despots, accomplishing everything by herself spontaneously and independently and free from the jurisdiction of the gods’, and Cavendish proposed that:

Small Atomes of themselves a World may make,
For being subtile, every shape they take;
And as they dance about, they places find,
Of Forms, that best agree, make every Kind.
[Margaret Cavendish, Poems and Fancies (London: 1664), 6]


As this marvelous collection of quotations–put together by Peter Suber–shows, the idea that philosophy works as a kind of confession has a long and storied history. Among the most famous proponents of this metaphilosophical thesis was, of course, Nietzsche.

First, in Human, All Too Human, trans. Marion Faber, with Stephen Lehmann, University of Nebraska Press, 1984 (original 1878):

[§513] However far man may extend himself with his knowledge, however objective he may appear to himself, ultimately he reaps nothing but his own biography.

And then most memorably, in Beyond Good and Evil, trans. Walter Kaufmann, Vintage, 1966 (original 1886):

[§6] Gradually it has become clear to me what every great philosophy so far has been: namely, the personal confession of its author and a kind of involuntary and unconscious memoir; also that the moral (or immoral) intentions in every philosophy constituted the real germ of life from which the whole plant had grown. Indeed, if one would explain how the abstrusest metaphysical claims of a philosopher really came about, it is always well (and wise) to ask first: at what morality does all this (does he) aim? Accordingly, I do not believe that a "drive to knowledge" is the father of philosophy; but rather that another drive has, here as elsewhere, employed understanding (and misunderstanding) as a mere instrument….

Links added throughout; Nietzsche quotations from Suber's page.

Twenty-One Car-Free Years

Over the weekend, thanks to traveling up to Albany to meet an old friend, I was unable to make note of an especially important anniversary: March 30th marked twenty-one years of blessed freedom from car ownership.

On March 30th, 1993, I sold my Toyota pickup truck, purchased a mere eighteen months previously, at a drastically marked-down price. My reasons were simple and numerous: I was headed out of the US for an indefinite period; if I returned, it would be to New York City, where I did not expect to own a car; and lastly, most significantly, my insurance premium–in New Jersey–had climbed to an astronomical four thousand dollars a year.

Cars had always been an expensive headache for me. My first car, a Toyota Corolla with over a hundred thousand miles on it, had flamed out spectacularly on a New Jersey highway; it had minimal resale value and I was only too happy to dispose of it in a junkyard. My second, a Volkswagen Jetta, had niggling problems with its fuel pump, and spent too much time in the repair shop. And while I owned it, my insurance climbed into the stratosphere.

My troubles began a few minutes after I had picked up the Jetta from the used-car dealer. As I drove down the Garden State Parkway, already late for work, I failed to notice I was speeding. A state trooper pulled me over, informed me I was driving at 78mph in a 55mph zone and gave me a ticket. That meant four points on my license and a thousand-dollar increase in my annual premium. A few months later, after I had skidded on a wet road and hit the kerb, I filed a damage claim, which the insurance company honored. But in exchange for this thousand-dollar payment, they raised my premium by a thousand dollars a year. And then, finally, thanks to another wet road, I rear-ended a truck, filed a damage claim again and was treated to the same sequence of claim-payment-followed-by-premium-increase.

By late 1992, I could not afford to drive a car any more. But I still had to commute to work. So I persisted for a few months, all the while actively plotting my escape from New Jersey to New York City. When my move looked imminently possible, I put up my truck–purchased after my Jetta’s fuel-pump troubles had become intolerable–for sale.

Twenty-one years on, I remain relieved to be free of the hassles of parking, gas prices, speeding tickets, towing, worries about blood alcohol content, traffic, and all of the rest.  Public transportation, with all its frustrations, works well enough for me. New York City’s magnificently flawed subway system takes me where I need to go; on rare occasions, I rent or borrow a car. That limited and circumscribed ownership is all I can handle.

I do not know if I will ever move out of New York City. If I do, I know one of my most profound regrets will be the leaving behind of this blessedly car-free life.

The Visually Sophisticated Society and “Seeing is Believing”

In 1980, Stephen Jay Gould and Steven Selden sent their copy of H. H. Goddard's The Kallikak Family: A Study in the Heredity of Feeble-Mindedness to James H. Wallace, director of Photographic Services at the Smithsonian Institution. The photographs in Goddard's book of the supposedly "feeble-minded" family had appeared to confirm their mental infirmity:

All have a depraved look about them. Their mouths are sinister in appearance; their eyes are darkened slits.

[Image: retouched photographs of members of the Kallikak family, from Goddard's book]

But as the photograph above indicates, and as Wallace noted:

There can be no doubt that the photographs of the Kallikak family members have been retouched. Further, it appears that this retouching was limited to the facial features of the individuals involved–specifically eyes, eyebrows, mouths, nose and hair. By contemporary standards, this retouching is extremely crude and obvious.

The intellectual dishonesty on display in Goddard's work is but a small sample of the many instances noted in Gould's critique of biological determinism, The Mismeasure of Man (W. W. Norton, New York, 1981).

Of interest, too, is what Wallace went on to say in his response to Gould and Selden:

 It should be remembered, however, that at the time of the original publication of the book, our society was far less visually sophisticated. The widespread use of photographs was limited, and casual viewers of the time would not have nearly the comparative ability possessed by even pre-teenage children today….

The "visual sophistication" that Wallace invokes is, of course, a function of the greater prominence of the visual in modern society. Photographs, digital and analog, and moving images, whether those of the movies or television, are our commonplace companions; we record our lives, their humble and exalted moments, through a bewildering array of technologies and methods. Our blogs and other forms of social media are awash in these images. If the cultures that preceded ours were verbal, we are increasingly visual. Our future masterpieces might well be drawn from this domain.

This swamping of our senses and sensibilities produces a greater refinement of our visual concepts and judgments. Modern cinephiles speak knowledgeably and effortlessly of cinematic palettes and visual grammars; admirers of photographers’ works offer esoteric evaluations of their correspondingly complex productions. We consider such discourses exceedingly commonplace; we are, after all, creatures whose dominant sensory modality is sight, able to examine their world from the microscopic to the macroscopic scale, from the beginning of their lives to their ends, through images.

Wallace’s invocation of our increased “visual sophistication” appropriately enough arises in the context of retouching. We are used to the altered digital image, the restored old photograph, the enhanced and corrected draft photograph; we return from vacations with a camera full of digital photos; we understand their final displayed product will be a modified one, lights and darks and colors and shades all expertly changed by our photo processing software, our clumsiness and inexpertness cleverly altered and polished out.

We are, by now, accustomed to the notion that seeing is not believing but rather, the opening salvo in a series of investigations.

Police or Wanna-Be Commandos?

You might have noticed your local police force starting to look increasingly militarized, wearing riot gear like the type Glenn sports in The Walking Dead, and armed not just with weaponry like Rick Grimes' but with an attitude as bad as Merle's. Don't worry, it's part of a nationwide trend of SWATting local police:

Peter Kraska, a professor at Eastern Kentucky University’s School of Justice Studies, estimates that SWAT teams were deployed about 3,000 times in 1980 but are now used around 50,000 times a year. Some cities use them for routine patrols in high-crime areas. Baltimore and Dallas have used them to break up poker games. In 2010 New Haven, Connecticut sent a SWAT team to a bar suspected of serving under-age drinkers. That same year heavily-armed police raided barber shops around Orlando, Florida; they said they were hunting for guns and drugs but ended up arresting 34 people for “barbering without a licence”. Maricopa County, Arizona sent a SWAT team into the living room of Jesus Llovera, who was suspected of organising cockfights. Police rolled a tank into Mr Llovera’s yard and killed more than 100 of his birds, as well as his dog. According to Mr Kraska, most SWAT deployments are not in response to violent, life-threatening crimes, but to serve drug-related warrants in private homes.

He estimates that 89% of police departments serving American cities with more than 50,000 people had SWAT teams in the late 1990s—almost double the level in the mid-1980s. By 2007 more than 80% of police departments in cities with between 25,000 and 50,000 people had them, up from 20% in the mid-1980s.

Many young men in the US with bullying issues resolve them through alcohol binges, picking street fights, playing video games with impressive body counts, or raping women. Yet others, savvy enough to realize that modern policing offers you a real-life video game with real ninety-seven pounders to be kicked around, sign up for a tour of duty of America's war zones (its cities), where hostiles (their colored residents) roam (walk), skulk (hang out on corners) and hide (stay indoors).

It’s just like that game Urban SWAT Force: you pick up a signal that crackles over your gleaming black radio, you answer snappily, employing those mnemonics so beloved of military commanders calling in artillery strikes, “Echo Romeo Oscar Yankee! Heading East on Sixteenth!”, you gun the engine, feeling that horsepower spring you forward, even as it pins you in your car seat, propelling you down that blacktop toward the ‘target’ cunningly disguised as a home. Then, time for the crouching attack, the battering ram on the door, the rush inside as stun-grenades go off, deafening everyone but you. Then, finally, the moment you were waiting for: as wailing women and children cower and beg, you open fire, emptying magazines into anything, and I mean anything, that moves: curtains, pets, goldfish, they’re all fair game.

Sometimes grannies bite the dust:

In 2006 Kathryn Johnston, a 92-year-old woman in Atlanta, mistook the police for robbers and fired a shot from an old pistol. Police shot her five times, killing her. After the shooting they planted marijuana in her home. It later emerged that they had falsified the information used to obtain their no-knock warrant.

It’s a jungle out there. Only the thin blue line protects us.

Ending the NCAA’s Plantation Racket

In Kevin Smith's Chasing Amy, Banky tries to talk Holden out of his crush on Amy:

Banky Edwards: Alright, now see this? This is a four-way road, okay? And dead in the center is a crisp, new, hundred dollar bill. Now, at the end of each of these streets are four people, okay? You following?

Holden: Yeah.

Banky Edwards: Good. Over here, we have a male-affectionate, easy to get along with, non-political agenda lesbian. Down here, we have a man-hating, angry as fuck, agenda of rage, bitter dyke. Over here, we got Santa Claus, and up here the Easter Bunny. Which one is going to get to the hundred dollar bill first?

Holden: What is this supposed to prove?

Banky Edwards: No, I’m serious. This is a serious exercise. It’s like an SAT question. Which one is going to get to the hundred dollar bill first? The male-friendly lesbian, the man-hating dyke, Santa Claus, or the Easter bunny?

Holden: The man-hating dyke.

Banky Edwards: Good. Why?

Holden: I don’t know.

Banky Edwards: [shouting] Because the other three are figments of your fucking imagination!

As I read news of the National Labor Relations Board's decision that college players have the right to unionize, and allow myself a brief celebration of this victory for common sense, I also prepare myself for the inevitable defenses of the NCAA and its racket–college sports–from folks whom, in my kindest moments, I can only describe as deluded. It is for their sake that I have excerpted Banky's rant above, for it could easily be rewritten with its three mythical creatures replaced by: the principled NCAA executive, the truthful NCAA lawyer and the honest college sports administrator. And standing over it all, the hallucination of the amateur student-athlete, who plays for passion and pride. Not money. No sir, not that filthy stuff, so visible in the prices of season tickets, the salaries of coaches and administrators, the value of television rights deals, and the spanking-new sports facilities, gyms and stadiums.

Read the NLRB’s ruling and read the descriptions of college football players’ training and game routines, and ask yourself whether those descriptions accord with your sense of a college student playing sports on the side while he pursues a degree as his main vocation. Or do they better describe professional athletes who study a bit on the side? A choice sample:

During this time [football season], the players devote 40 to 50 hours per week to football-related activities, including travel to and from their scheduled games.

College sports is a plantation racket, from start to finish. Hold out promises of unimaginable riches to a community desperate for economic upliftment, pay them peanuts, shackle them to draconian codes of conduct enforced by hypocrites, all the while enriching yourself. That's how it works. The student-athlete, the scholarship, the education in exchange for a few games; they sure do sustain a great deal of fantasy, don't they?

Thank you for tearing down the non-unionized wall, Mr. Ohr. Now, hopefully, later this year, the judges in O'Bannon v. NCAA will take the necessary next steps.