Blade Runner 2049: Our Slaves Will Set Us Free

Blade Runner 2049 is a provocative visual and aural treat. It sparked many thoughts, two of which I make note of here; the relationship between the two should be apparent.

  1. What is the research project called ‘artificial intelligence’ trying to do? Is it trying to make machines that can do the things which, if done by humans, would be said to require intelligence–regardless of the particular implementation? Is it trying to accomplish those tasks in the way that human beings do them? Or is it trying to find a non-biological method of reproducing human beings? These are three very different tasks. The first is a purely engineering task; the machine must accomplish the task regardless of the method–any route to the solution will do, so long as it is tractable and efficient. The second is cognitive science, inspired by Giambattista Vico: “the true and the made are convertible” (Verum et factum convertuntur), or “the true is precisely what is made” (Verum esse ipsum factum); we will only understand the mind, and possess a ‘true’ model of it, when we make it. The third is more curious (and related to the second)–it immediately implicates us in the task of making artificial persons. Perhaps by figuring out how the brain works, we can mimic human cognition, but this capacity might be placed in a non-human form made of silicon or plastic or some metal; the artificial persons project insists on a human form–the android or humanoid robot–and on replicating uniquely human capacities, including the moral and aesthetic ones. This would require the original cognitive science project to be extended to an all-encompassing project of understanding human physiology, so that its bodily functions can be replicated. Which immediately raises the question: why make artificial persons? We have a perfectly good way of making human replicants already; indeed, many people actually enjoy engaging in the process. So why make artificial persons this way? If the answer is to increase our knowledge of human beings’ workings, then we might well ask: To what end? To cure incurable diseases? To make us happier?
To release us from biological prisons so that we may, in some singularity-inspired fantasy, migrate our souls to these more durable containers? Or do we need them to be in human form, so that they can realistically–in all the right ways–fulfill all the functions we will require them to perform? For instance, as in Westworld, they could be our sex slaves, or, as in Blade Runner, they could perform dangerous and onerous tasks that human beings are unwilling or unable to do. And, of course, prop up ecologically unstable civilizations like ours.
  2. It is a philosophical commonplace–well, at least to Goethe and Nietzsche, among others–that constraint is necessary for freedom; we cannot be free unless we are restrained, somehow, by law and rule and regulation and artifice. But is it necessary that we ourselves be restrained in order to be free? The Greeks figured out that the slave could be enslaved, lose his freedom, and through this loss, his owner, his master, could be free; as Hannah Arendt puts it in The Human Condition, the work of the slaves–barbarians and women–does ‘labor’ for the owner, keeping the owner alive, taking care of his biological necessity, and freeing him up to go to the polis and do politics in a state of freedom, in the company of other property-owning householders like him. So: the slave is necessary for freedom; either we enslave ourselves, suppressing our appetites and desires and drives, sublimating and channeling them into the ‘right’ outlets, or we enslave someone else. (Freud noted glumly in Civilization and its Discontents that civilization enslaves our desires.) If we cannot enslave humans, with all their capricious desires to be free, then we can enslave other creatures, perhaps animals, domesticating them to turn them into companions and food. And if we ever become technologically adept at reproducing those processes that produce humans or persons, we can make copies–replicants–of ourselves, artificial persons, that mimic us in all the right ways, and keep us free. These slaves, by being slaves, make us free.

Much more on Blade Runner 2049 anon.

Old Battles, Still Waged: Accepting ‘Defeat’ In Self-Improvement

Over the past couple of days, I have engaged in a time-honored academic ritual: the cleaning of one’s office. Old books, journal articles, student papers and blue books, random handouts from academic talks, conference badges–all fodder for the recycling bin. But I went further, looking for especially archaic material; and I found it in my graduate school notebooks. Scribbled notes from graduate seminars filled their pages; but much else too. In their pockets I found syllabi and handouts; and on their back pages, many, many notes written to myself during the seminar class period.

Some of these notes are simple reminders to myself: submit forms, pick up checks, finish reading, etc. Yet others are financial calculations; in graduate school, I always lived on the edge, and frequent checks of my financial health were necessary. These, as can be seen, often distracted me even as I thought about metaphysics and ethics. And then, perhaps most poignantly, I find little injunctions and plans for self-improvement: eat more of this, eat less of that, run more, work out regularly, reading and writing schedules, smoke less or quit; and on and on. Sometimes I offer exhortations or admonitions to myself. These blueprints for a new me occur with some regularity; they represent a recurring concern of mine.

Those concerns and the ways in which I negotiate with them persist.

I still make lists of plans, I still draw up schedules of work and abstinence; I’m still struggling. Now, you can find the blueprints I speak of on my hard drive, tucked away into files; I don’t scribble them anymore. But I continue to obsess over how I can get over this weakness, this flaw, this thing that is ‘holding me back’; I continue to obsess over how I can ‘change’ and ‘improve’ and be ‘better.’ When I see my notebooks, I see that I’m fighting many of the same battles that I fought back then: against distraction, anxiety, lack of discipline in my personal habits, in my ‘work ethic.’ I used to dream of transcending these, of moving on; it seems I still do. Perhaps battles that have been waged this long are indicators of persistent failure on my part, a depressing thought at the best of times.

I’ve often written on this blog about the difficulties and myths of ‘self-improvement’; perhaps talk of ‘self-improvement’ is a sham, a distracting disturbance that does not allow us to become truly comfortable with, and accepting of, ourselves; perhaps we have not reconciled ourselves to who we are. But perhaps that’s who I am: the kind of person who will always be obsessed with making these kinds of changes and ‘improvements,’ who will never make them, or never in the way that I want, but who will never accept ‘defeat’ or ‘get the hint.’ In that case, perhaps the best way for me to accept who I am, to ‘become who you are!’, is to not disdain this activity of constantly plotting and scheming to escape myself. To engage in it is to be me.

A Theological Lesson Via Military History

In Hell in a Very Small Place: The Siege of Dien Bien Phu (J. B. Lippincott, New York, 1966, p. 85), Bernard B. Fall describes the build-up which foretold the grim military disaster to unfold at Dien Bien Phu–the lack of adequate defenses and ammunition, the poor tactical location, etc.–making note, along the way, of that curious mixture of arrogance, complacency, and overconfidence that infected the French military leadership. There were ample notes of worry too, of course, and finally, even of the grim resignation that is often the military man’s lot. Lt. Col. Denef, the deputy chief of staff of the French commander General Cogny, had written in an assessment to his commander that “It is too late to throw the machine into reverse gear….That battle will have to be fought on the scale of the whole Indochina peninsula or it will become a hopeless retreat.” As Fall notes:

In transmitting this report…Col. Bastiani, the chief of staff added a note of his which was deeply significant:

I fully agree…in either case, it will have to be the battle of the Commander-in-Chief. I think he must have foreseen the necessary requirements before letting himself into that kind of hornet’s nest.

This was the ultimate excuse of a staff officer: the situation was hopeless, the action made no sense, but there might after all be higher reason for all of this. “The Führer must know what he is doing.” This phrase had been repeated a hundred times over by the German defenders of Stalingrad as they senselessly fought on toward catastrophe.

The analogy that may be drawn with theological responses to the problem of evil is inescapable and irresistible. There is, all around us, misery and suffering and disease and pestilence afoot, all apparently for no good reason. How is this reconcilable with an all-powerful, all-knowing, all-good God? One answer: evil is a ‘local’ disaster, the ‘badness’ of which vanishes when viewed from a broader, all-inclusive, synoptic perspective–the one God has. From our epistemically limited perspective, we might be surrounded by catastrophes that suggest disorder and untrammeled badness, but zooming back reveals a larger plan within which these seeming disasters fall into place, directed onward and upward by a grand teleological scheme of greater order and good. (The chemotherapy kills healthy cells and cancer cells alike, but it heals the body. Trust the doctor; he knows best; he will make sense of your nausea, your hair loss, your weakened body. Or something like that.)

So if we are to ‘endure’ these disasters, we must reassure ourselves that someone, somewhere, knows what time it is, what the score, the deal, is. Much like the determined soldier marching into battle, ours is not to ask why, but to do or die. Our lot, of course, would be considerably improved if we knew why this was all necessary; after all, as Nietzsche pointed out, “He who has a why to live for can bear almost any how.” For the theologically inclined and the militarily obedient, the ‘why’ is supplied by faith in the benevolence of the Supreme Commander. The rest of us are left to weakly reassure ourselves that this too shall pass. Or not.

Melville On ‘The Most Dangerous Sort’: The Outwardly Rational Madman

In Billy Budd, Sailor (Barnes and Noble Classic Edition, New York, p. 40) Herman Melville writes:

[T]he thing which in eminent instances signalizes so exceptional a nature is this: though the man’s even temper and discreet bearing would seem to intimate a mind peculiarly subject to the law of reason, not the less in his heart he would seem to riot in complete exemption from that law, having apparently little to do with reason further than to employ it as an ambidexter implement for effecting the irrational. That is to say: Toward the accomplishment of an aim which in wantonness of malignity would seem to partake of the insane, he will direct a cool judgement sagacious and sound. These men are true madmen, and of the most dangerous sort, for their lunacy is not continuous but occasional, evoked by some special object; it is probably secretive, which is as much to say it is self-contained, so that when moreover, most active, it is to the average mind not distinguishable from sanity, and for the reason above suggested that whatever its aims may be–and the aim is never declared–the method and the outward proceeding are always perfectly rational.

This is an acute observation by Melville, for the personality type he describes here is indeed ‘the most dangerous sort.’ Its tokens conform outwardly to social and moral expectations at all times, even as they reserve their malignancy for occasional and pointed demonstrations, which continue to don the cover of ostensibly reasonable behavior. (Indeed, their general conformance to normative standards earns them the indulgence of others, who are then ready to forgive what may come to seem like only an occasional aberration; the pattern in these aberrations may not be visible until it is too late.) These agents know how to commit unpardonable acts under the cover of legality; they are adept at picking and choosing among the offerings of the reasonable and civilized, looking for those rhetorical and argumentative maneuvers that will give their actions the best veneer of respectability. (Perhaps they should remind us–via an inexact analogy–of Nietzsche’s ‘educated philistines’: outwardly sophisticated but lacking in inner culture.)

Unfortunately for this world, Melville’s ‘most dangerous sort’ is a little too common. Its most devastating and dangerous exemplars are found in the political sphere–like those who commit war crimes while proceeding according to some impeccable logic of statecraft–but the skepticism of their opponents may ensure that their cover is easily blown. Matters are far harder in the domain of personal relationships, especially abusive ones. There, in the private sphere, away from prying eyes, the abuser can concentrate on his ‘special object,’ the abused. The abuser’s ‘sanity’ may bring the abused to the edge of insanity; his weapon of choice is very often the questioning of the mental competence of his partner. A long and intimate relationship with his target has granted him access to weaknesses, secrets, chinks in the armor; these are now mercilessly and ruthlessly exploited by language and action artfully cloaked in reason and respectability.

Beware the superficial moral and intellectual education; for its most dangerous effect is to produce precisely the type Melville warns us against.

Richard Holmes On Biography’s ‘Physical Pursuit’ Of Its Subjects

In an essay describing his biographical work on Samuel Taylor Coleridge, Richard Holmes writes:

[A] biography is…a handshake….across time, but also across cultures, across beliefs, across disciplines, across genders, and across ways of life. It is an act of friendship.

It is a way of keeping the biographer’s notebook open, on both sides of that endlessly mysterious question: What was this human life really like, and what does it mean to us now? In this sense, biography is not merely a mode of historical inquiry. It is an act of imaginative faith.

Holmes bases this view of the work of the biographer on two claims about the art, the first of which is that:

[T]he serious biographer must physically pursue his subject through the past. Mere archives were not enough. He must go to all the places where the subject had ever lived or worked, or traveled or dreamed.

Biography is a famously reviled literary genre–sometimes described as fantasy, sometimes as intrusive voyeurism, sometimes as an ideologically motivated hatchet job. Holmes is right to describe it as being animated by an ‘endlessly mysterious question.’ (He is also perspicuous in describing it as a ‘handshake’ and an ‘act of friendship’ of sorts.) That question’s mystery–which becomes ever more prominent when we think about its unanswerability with respect to ourselves–does not make the attempt to answer it necessarily ignoble or ill-motivated. But it does bid us be circumspect in assessing how much of the biographer’s task is ever ‘complete.’

To acknowledge that difficulty, note that Holmes adds a variety of physical emulation to the task of the biographer: we must be where our subject has been in order to assess what his experiences there might have been like, and thus evaluate what their contribution to his life’s work was. Thus the Nietzsche biographer must make the hike to Sils Maria and ascend the heights that surround it. There, perhaps, one might investigate what Nietzsche had in mind in his constant invocations of the ‘clean air’ he experienced there, and wonder about the sordid life he might have left behind. Because we are not disembodied intelligences, but rather embodied beings in constant interaction with our environments–physical, mental, and emotional–Holmes’ injunction is a wise one. The biographer who writes of Jack Kerouac without undertaking a long road trip on American highways, without wondering what effect the sights seen along the way–big skies, the black asphalt stretching to the horizon, the lonely houses and farms, the lives of fellow travelers–could have had on an endlessly restless and fertile imagination, is crippled, fatally, in his task.

But even as we set to work in this dimension, we realize how much is still hidden away from us, how much remains inaccessible. We are still left to play, unavoidably, with our speculations, distant third-person reports, and autobiographical confessions of dubious fidelity. Perhaps this is why Holmes concludes by describing biography as an ‘act of imaginative faith.’

Notes: This essay begins with what must be a distinctive entry to the ‘not-so-humblebrag’ genre:

By the time I had finished my eight-hundred-page biography of Percy Bysshe Shelley in 1974, I was nearly thirty.

CS Lewis’ Mere Christianity: Masterfully Flawed Apologetics

CS Lewis’ Mere Christianity is rightly acknowledged as a masterpiece of Christian apologetics; it is entertaining, witty, well-written, clearly composed by a man of immense learning and erudition (who, as befitting the author of the masterful Studies in Words, cannot restrain his delightful habit of providing impromptu lessons in etymology.) Lewis is said to have induced conversions in “Francis Collins, Jonathan Aitken, Josh Caterer and the philosopher C. E. M. Joad” as a result of their reading Mere Christianity, and it is not hard to see why. The encounter of a certain kind of receptive mind with the explication of Christian doctrine that Lewis provides–laden with provocative analogies and metaphors–is quite likely to lead to the kind of experience conversion provides: an appeal to an emotional core harboring deeply experienced and felt needs and desires, which engenders a radical shift in perspective and self-conception. Christianity offers a means for conceptualizing one’s existential and psychological crises–seeing them as manifestations of a kind of possession, by sin, by the Devil–and holds out the promise of radical self-improvement: the movement toward man–all men–becoming Christ, assuming a moral and spiritual perfection along the way. All the sludge will fall away; man will rise and be welcomed into the bosom of God, if only he takes on faith in Christ and his teachings. This is powerful, heady stuff, and its intoxicating powers are underestimated only by those overly arrogant about the power and capacities of reason and ratiocination to address emotional longings and wants.

It is clear too, from reading Lewis, why Christianity provoked the ire of a philosopher like Nietzsche. For they are all here: the infantilization of man in the face of an all-powerful, all-seeing, all-knowing, all-good God; the terrible Godly wrath visible in notions such as damnation; the disdain for this life, this earth, this abode, its affairs and matters, in favor of another one; the notion of a ‘fallen man’ and a ‘fall from grace’ implying this world is corrupt, indeed, under ‘occupation’ by an ‘enemy force.’ There is considerable self-abnegation here; considerable opportunity for self-flagellation and diminishment. No wonder the Existential Stylist was driven to apoplectic fury.

Lewis takes Biblical doctrine seriously and literally; but like any good evangelical he is not above relying on metaphorical interpretation when it suits him. (This is evident throughout Mere Christianity but becomes especially prominent in the closing, more avowedly theological chapters.) Unsurprisingly for a man of his times (who supports the death penalty and thinks homosexuals are perverts), the seemingly retrograde demand that wives unquestioningly obey their husbands, which might have sparked alarms in a more suspicious mind about the sociological origins of such a hierarchy-preserving notion, is stubbornly, if ever so slightly apologetically, defended.

Lewis’ arguments are, despite the apparent effort he takes to refute views contrary to Christian doctrine, just a little too quick. His infamous trilemma arguing for the divinity of Jesus, and his dismissal of the suggestion that his supposed Natural or Universal Law of Morality can be traced to a social instinct, are notoriously weak (the former’s weaknesses are amply referenced in the link above, while the latter simply pays no attention to history, class, and culture.)

But Mere Christianity, even if deeply flawed, is still worth a read: you witness an agile mind at work; you encounter a masterful writer; you find yourself challenged to provide refutations and counter-arguments; you even feel an emotional tug or two, letting you empathize with those who do not think like you do. That’s a pretty good catch for one book.

Chatwin And Nietzsche On Metaphors, Words, And Concepts

Writing of the Yaghan people and Thomas Bridges’ Yaghan Dictionary, Bruce Chatwin writes:

Finding in primitive languages a dearth of words for moral ideas, many people assumed these ideas did not exist, but the concepts of ‘good’ or ‘beautiful’ so essential to Western thought are meaningless unless they are rooted to things. The first speakers of language took the raw material of their surroundings and pressed it into metaphor to suggest abstract ideas. The Yaghan tongue–and by inference all language–proceeds as a system of navigation. Named things are fixed points, aligned or compared, which allow the speaker to plot the next move. [In Patagonia, Penguin, New York, 1977, p. 136]

Chatwin then goes on to describe some of the extraordinarily rich range of metaphorical allusion found in the Yaghan language. His analysis finds resonance in Nietzsche’s thoughts on language in ‘Truth and Lies in a Nonmoral Sense’:

What is a word? The image of a nerve stimulus in sounds….One designates only the relations of things to man, and to express them one calls on the boldest metaphors. A nerve stimulus, first transposed into an image—first metaphor. The image, in turn, imitated by a sound—second metaphor. And each time there is a complete overleaping of one sphere, right into the middle of an entirely new and different one….It is this way with all of us concerning language; we believe that we know something about the things themselves when we speak of trees, colors, snow, and flowers; and yet we possess nothing but metaphors for things—metaphors which correspond in no way to the original entities….Every word immediately becomes a concept, inasmuch as it is not intended to serve as a reminder of the unique and wholly individualized original experience to which it owes its birth, but must at the same time fit innumerable, more or less similar cases…Every concept originates through our equating what is unequal. No leaf ever wholly equals another, and the concept “leaf” is formed through an arbitrary abstraction from these individual differences, through forgetting the distinctions; and now it gives rise to the idea that in nature there might be something besides the leaves which would be “leaf.”

The Yaghan language helps its speakers and users plot and live a particular form of life. If it is at all infected by a ‘dearth of moral ideas,’ it is not because the moral–or aesthetic–concepts in question are lacking. Rather, the concepts are manifest in altogether another fashion: the ‘good’ and the ‘beautiful’ are visible and operative in their concrete instances, as examples of what to do and what not to do, in what worked and what did not, in that which helped and that which was unhelpful, in that which was praiseworthy or not. A nominalistic language, then, is not inferior to one that traffics more extravagantly with universals; it is merely more nominalistic; it has evolved to suit, and conform to, another way of life, of doing things, of relating to a very particular environment in a particular time and place. The language of universals, as Nietzsche notes, has not brought us closer to reality’s ‘ultimate forms’–whatever that may mean.