‘Reciprocity’ As Organizing Principle For The Moral Instruction Of Young Women

I’ve often wondered how best to provide moral instruction to my daughter as she grows up, what principles and duties to keep front and center in my conversations with her as she enters an age where her interactions with other human beings become more complex. Over the past year or so, I’ve strived to organize this ‘instruction’ around the concept of ‘reciprocity,’ around a variation of the Golden Rule and the altruism it implies: do good unto others, but only continue with the good if it is reciprocated; do not feel obligated to respond to unkindness with kindness; indeed, you should not respond to unkindness with kindness; if good is done to you, then you must reciprocate with good. There is one conditional duty in here: that of doing good to others, which continues to bind only if your acts are met with good done to you in turn. There is no duty to do good in response to bad done unto you; and there is an absolute duty to do good to others when they do good unto you.

I’ve tried to provide this instruction by way of simple examples: we should not litter, because in doing so we would make our neighborhoods dirty for ourselves and our neighbors, and they should do the same for us; if some kid in school is nice to you, you should be nice back to them; if someone in school is not nice to you when you have been nice to them, then don’t feel the need to continue being nice to them; acknowledge people’s generosity and kindness in some fashion, even if with a simple ‘thanks’; and so on. I’ve tried to make the claim that society ‘hangs together,’ so to speak, because of reciprocity. Without it, our social arrangements would fall apart.

Reciprocity is not as generous and self-sacrificing as pure altruism. I chose reciprocity as an organizing principle because I believe a commitment to altruism can hurt people; moreover, in our society and culture, altruism has proved to be largely harmful to women. I was, and am, especially worried about a girl growing up–as too many in the past have–to believe that her primary duty is to make others happy, to do good to others even if good is not being done to her in turn. I believed that stressing reciprocity as an organizing moral principle would point in the direction of some positive obligations to make others happy, but would also place some limitations on those obligations. Aristotle wrote of the need to maintain a mean of sorts as we ‘practice’ the virtue of generosity–a mean between wastefulness and stinginess; the altruist, in this reckoning, gives too much. A moral agent guided by the principle of reciprocity aims to find a mean in the generosity of their benevolent or good actions: by all means be generous, but pick the targets of your generosity wisely.

I realize that the injunction to only do good if it is reciprocated in some way sounds vaguely unforgiving or unkind and perhaps self-defensive; but again, as I noted above, some such measure of protection is necessary for women, who for too long have been crushed by the burden of unfair or unrealistic expectations of their conduct, to the detriment of their well-being. I want my daughter to do good unto others, but I also want good to be done to her.

My daughter, to her credit, seems to have listened; she can now use the word ‘reciprocity’ in conversation, and sometimes to describe a plan of action; I wait to see how well she will internalize the ‘lessons’ it forms the core of. (She likes that it rhymes with ‘gravity’; as I say to her, gravity makes the world of things work, reciprocity makes the world of people work!)

Note: ‘reciprocity’ enjoys two entries in Wikipedia: one drawn from social psychology, the other from social and political philosophy.

Blade Runner 2049: Our Slaves Will Set Us Free

Blade Runner 2049 is a provocative visual and aural treat. It sparked many thoughts, two of which I make note of here; the relationship between the two should be apparent.

  1. What is the research project called ‘artificial intelligence’ trying to do? Is it trying to make machines that can do the things which, if done by humans, would be said to require intelligence–regardless of the particular implementation? Is it trying to accomplish those tasks in the way that human beings do them? Or is it trying to find a non-biological method of reproducing human beings? These are three very different tasks. The first is a purely engineering task; the machine must accomplish the task regardless of the method–any route to the solution will do, so long as it is tractable and efficient. The second is cognitive science, inspired by Giambattista Vico: “the true and the made are convertible” (Verum et factum convertuntur), or “the true is precisely what is made” (Verum esse ipsum factum); we will only understand the mind, and possess a ‘true’ model of it, when we make it. The third is more curious (and related to the second)–it immediately implicates us in the task of making artificial persons. Perhaps by figuring out how the brain works, we can mimic human cognition, but this capacity might be placed in a non-human form made of silicon or plastic or some metal; the artificial persons project insists on a human form–the android or humanoid robot–and on replicating uniquely human capacities, including the moral and aesthetic ones. This would require the original cognitive science project to be extended to an all-encompassing project of understanding human physiology, so that its bodily functions can be replicated. Which immediately raises the question: why make artificial persons? We have a perfectly good way of making human replicants, and many people actually enjoy engaging in the process. So why make artificial persons this way? If the answer is to increase our knowledge of human beings’ workings, then we might well ask: To what end? To cure incurable diseases? To make us happier? To release us from biological prisons so that we may, in some singularity-inspired fantasy, migrate our souls to these more durable containers? Or do we need them to be in human form so that they can realistically–in all the right ways–fulfill all the functions we will require them to perform? For instance, as in Westworld, they could be our sex slaves; or, as in Blade Runner, they could perform dangerous and onerous tasks that human beings are unwilling or unable to do. And, of course, prop up ecologically unstable civilizations like ours.
  2. It is a philosophical commonplace–well, at least to Goethe and Nietzsche, among others–that constraint is necessary for freedom; we cannot be free unless we are restrained, somehow, by law and rule and regulation and artifice. But is it necessary that we ourselves be restrained in order to be free? The Greeks figured out that the slave could be enslaved, lose his freedom, and through this loss, his owner, his master, could be free; as Hannah Arendt puts it in The Human Condition, the work of the slaves–barbarians and women–does ‘labor’ for the owner, keeping him alive, taking care of his biological necessity, and freeing him up to go to the polis and do politics in a state of freedom, in the company of other property-owning householders like him. So: the slave is necessary for freedom; either we enslave ourselves, suppressing our appetites and desires and drives, sublimating and channeling them into the ‘right’ outlets, or we enslave someone else. (Freud noted glumly in Civilization and its Discontents that civilization enslaves our desires.) If we cannot enslave humans, with all their capricious desires to be free, then we can enslave other creatures–animals, perhaps–domesticating them to turn them into companions and food. And if we ever become technologically adept at reproducing those processes that produce humans or persons, we can make copies–replicants–of ourselves, artificial persons that mimic us in all the right ways, and keep us free. These slaves, by being slaves, make us free.

Much more on Blade Runner 2049 anon.

No, Aristotle Did Not ‘Create’ The Computer

For the past few days, an essay titled “How Aristotle Created The Computer” (The Atlantic, March 20, 2017, by Chris Dixon) has been making the rounds. It begins with the following claim:

The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.

Dixon then goes on to trace this ‘history of ideas,’ showing how the development–and increasing formalization and rigor–of logic contributed to the development of computer science and the first computing devices. Along the way, Dixon makes note of the contributions–direct and indirect–of: Claude Shannon, Alan Turing, George Boole, Euclid, René Descartes, Gottlob Frege, David Hilbert, Gottfried Leibniz, Bertrand Russell, Alfred Whitehead, Alonzo Church, and John von Neumann. This potted history is exceedingly familiar to students of the foundations of computer science–a demographic that includes computer scientists, philosophers, and mathematical logicians–but presumably that is not the audience Dixon is writing for; those students might wonder why Augustus De Morgan and Charles Peirce do not feature in it. Given this temporally extended history, with its many contributors and their diverse contributions, why does the article carry the headline “How Aristotle Created the Computer”? Aristotle did not create the computer or anything like it; he made important contributions to a fledgling field, which took several more centuries to develop into maturity. (The contributions to this field by logicians and systems of logic of alternative philosophical traditions, like the Indian one, are, as per usual, studiously ignored in Dixon’s history.) And as a philosopher, I cannot resist asking: “What do you mean by ‘created’?” What counts as ‘creating’?

The easy answer is that it is clickbait. Fair enough. We are by now used to the idiocy of the misleading clickbait headline, one designed to ‘attract’ more readers by making the piece seem more ‘interesting’; authors very often have little choice in the matter, and have to watch helplessly as hit-hungry editors mangle the impact of the actual content of their work. (As in this case?) But it is worth noting this headline’s contribution to the pernicious notion of the ‘creation’ of the computer, and to the idea that it is possible to isolate a singular figure as its creator–a clear hangover of a religious sentiment that things that exist must have creation points, ‘beginnings,’ and creators. It is yet another contribution to the continued mistaken recounting of the history of science as a story of ‘towering figures.’ (Incidentally, I do not agree with Dixon that the history of computers is “better understood as a history of ideas”; that history is, instead, an integral component of the history of computing in general, which also includes a social history and an economic one; telling a history of computing as a history of objects is a perfectly reasonable thing to do when we remember that actual, functioning computers are physical instantiations of abstract notions of computation.)

To end on a positive note, here are some alternative headlines: “Philosophy and Mathematics’ Contributions To The Development of Computing”; “How Philosophers and Mathematicians Helped Bring Us Computers”; or “How Philosophical Thinking Makes The Computer Possible.” None of these are as ‘sexy’ as the original headline, but they are far more informative and accurate.

Note: What do you think of my clickbaity headline for this post?

The Convenient Construction Of The Public-Private Distinction

Revolutions are public affairs; revolutionaries bring them about. They fight in the streets, they ‘man’ the barricades, they push back the forces of reaction. And then, they go home for the night, to a meal and a warm bed. There, they rest and recuperate, recharging the batteries of uprising, ready to battle again the next day. Revolutionaries are men, doing the real work, out in the public sphere; their home fronts are staffed by women, whose job is to sustain the revolution’s domestic aspects.

In The Revolutionary Career of Maximilien Robespierre (University of Chicago Press, Chicago, 1985, pp. 57-58), David P. Jordan writes:

Although Robespierre was most at his ease in the midst of bourgeois domesticity, he depended upon others to create such an environment for him. Left to himself, he would have perpetuated his solitude in bleak rented rooms. It is worth noting that he fought the Revolution from the comfort of a bourgeois home. His passivity, his willingness to have others look after him, bespeaks an indifference to the mundane. He knew nothing of the marketplace; in Paris, as it had been in Arras, food awaited him at table, including the fruits he adored. Similarly, he knew nothing of the conditions of the desperately poor, with whom he never fraternized extensively. And there is no record that he ever went next door at the Duplays’ to talk to the carpenters in the shop. [citation added]

“An indifference to the mundane.” The home is the site of the mundane, the ordinary, the dull and dreary. Outside, the public sphere, where the non-domestic happens, is where the extraordinary takes place. That is the zone of men, the revolutionaries; the home is where women (and perhaps some servants), like a pit-stop crew, get the smooth machine of revolution up and running again with an oil and tire change for the body and mind. The revolutionary, from his lofty perch, can look down on and disdain these mundane offerings, whose underlying labor is not deemed worthy of recognition in manifestos intended to stir the masses to action.

The excerpt above is drawn from a book published in 1985, two years after Carole Pateman’s classic feminist critique of the public-private dichotomy appeared in print.¹ It shows, in paradigmatic form, the standard (male and patriarchal) construction of the public-private distinction in political theory. The ancient Aristotelian understanding of the polis as the sphere for politics and civic life, and the home as the venue for a much lower form of life, persists here. Jordan does not make note of Robespierre’s detachment from the domestic with approval, but he does not find anything problematic in it either; instead, it appears as the sort of bemused indifference that we associate, quite romantically, with artists, writers, poets, and others too intent on cultivating their creativity to be bothered with the ‘mundane’ particulars of life. In this history, the public fray rises above domestic scurrying; the men hover above the women below.

Note 1: Carole Pateman, ‘Feminist Critiques of the Public/Private Dichotomy,’ in Public and Private in Social Life 281, 281 (S. I. Benn & G. F. Gaus eds., 1983).

Notes On Meditation Practice – II

Meditation induces two interesting forms of self-consciousness that do not arise during the actual sitting itself. They are, rather, ways of regarding the practice of meditation as it meshes with the rest of the meditator’s life.

First, the meditator is self-consciously aware of the fact that he is one. The normal, ongoing processes of identity formation and maintenance now include the attribute, ascribed to oneself, “engages in a meditation practice.” This is not innocent; for better or worse, ‘meditation’ carries certain connotations with it. These include, at the least, dimly perceived and understood stereotypes about the kind of person who meditates, and why they might do so; when you become a meditator, some of those stereotypes become ways of regarding yourself. For instance, shortly after I began my practice, I found myself kicking off what looked like it would turn into a heated argument. As I did so, I felt curiously abashed and undignified, and a thought, unbidden, came to me: this was not how those who engage in meditation practice are ‘supposed to behave.’ I was supposed to be engaged in a practice that induced calm and dignity, but here I was, squabbling like a child. Overcome by a sudden awkwardness, I retreated from my previously grimly defended position and began winding down the argument; I wanted to retreat from this zone of my loss of composure. This has not always been the case; on many occasions, I have blundered straight into the heart of a meltdown and emerged with very little of my former grace intact. But that new perspective on myself has not gone away. It remains, lurking on the edges of my consciousness of myself, reminding me that I now engage in an activity that is supposed to be changing me, making me into a new person.

Second, meditation is self-indulgent, and the meditator knows it. Forty minutes a day is ‘too much’; none of us, especially here in this city, have that kind of time to spare. As such, the very act of sitting down and shutting out the world’s demands feels like a supremely, virtuously self-centered action. You deny the world its claims on you–even as you carry thoughts about it into your mind–and yet, for those twenty minutes, remove yourself from its embrace. The awareness of the sheer subversiveness of this act–in a world in which there is an unceasing demand for our time and attention–is a liberation. It brings with it a curious sensation of power; to step away from this world feels like an empowering act, an assumption of agency in a situation where we are used, all too often, to bemoaning the loss of ours. This awareness, too, becomes part of our identity; it becomes an attribute we ascribe to ourselves; it changes who we think we are.

Aristotle said that we are what we repeatedly do. Sitting in meditation, with a regular practice, makes you a meditator; that change, by itself, without any other extravagant claims, is a significant one.

Note: The first post in this series is here.

The Greek Alphabet: Making The Strange Familiar

In his review of Patrick Leigh Fermor’s The Broken Road: From The Iron Gates to Mount Athos (eds. Colin Thubron and Artemis Cooper, New York Review Books, 2014), Daniel Mendelsohn writes:

His deep affection and admiration for the Greeks are reflected in particularly colorful and suggestive writing. There is a passage in Mani in which the letters of the Greek alphabet become characters in a little drama meant to suggest the intensity of that people’s passion for disputation:

I often have the impression, listening to a Greek argument, that I can actually see the words spin from their mouths like the long balloons in comic strips…: the perverse triple loop of Xi, the twin concavity of Omega,…Phi like a circle transfixed by a spear…. At its climax it is as though these complex shapes were flying from the speaker’s mouth like flung furniture and household goods, from the upper window of a house on fire.

I first encountered Greek letters, like most schoolchildren, in my mathematics and physics and chemistry classes. There was π, the ratio of the circumference of a circle to its diameter; ω, the frequency of a harmonic oscillator and, later, infinity in set theory; λ, the wavelength of light; θ, ubiquitous in trigonometry; Ψ, the wave function of quantum mechanics; Σ, the summation of arithmetic and geometric series; a whole zoo used to house the esoteric menagerie of subatomic particles; and many, many more. The Greek alphabet was the lens through which the worlds of science and mathematics became visible to me; it provided symbols for the abstract and the concrete, for the infinitely small and the infinitely large.

I never learned to read Greek, but the Greek alphabet feels intimately familiar to me–perhaps the most familiar after the English one.

I first saw Greek texts in the best possible way: Greek versions of Aristotle and Plato in my graduate school library, intended for use by those who specialized in ancient philosophy. (These texts were in classical Greek.) I took down the small volumes from the shelf and opened their pages and looked at the text. It was incomprehensible and yet, recognizable. I could see all the letters, those old friends of mine: the α and the β used to denote the atoms of a language for propositional logic, the Γ of the generalized factorial function, the Δ of differences; they were all there. But now they were pressed into different duties.

Now, they spoke of ethics and metaphysics and politics, of generation and corruption; their forms spoke of the Forms. Now they were used to construct elaborate philosophical systems and arguments. But even as they did so, I could not help feeling, as I looked at the pages and pages of words constructed out of those particles, that I was looking at the most abstruse and elaborate mathematical text of all: all unknown quantities, an endless series of fantastically complex mathematical expressions, one following the other, carrying on without end. Yes, it was all Greek to me. And yet, I still felt at home.

The Curious Irony of Procrastination

Do writers procrastinate more than other people? I wouldn’t know for sure, because I have no idea how much procrastination counts as the norm, or what depths practitioners of other trades sink to. But I procrastinate a great deal. (Thank you for indulging me in my description of myself as a ‘writer’; if you prefer, I could just use ‘blogger.’) At any given moment, there are many, many tasks I can think of–not all of them writerly–that I intend to get around to any hour, day, week, month, year, or life now. (I procrastinate on this blog too; I’ve promised to write follow-ups to many posts and almost never get around to doing so.) This endless postponement is a source of much anxiety and dread. Which, of course, is procrastination’s central–and justifiably famous–irony.

You procrastinate because you seek relief from anxiety, because you dread encounters with the uncertainty, frustration, and intractability you sense in the tasks that remain undone. But the deferment you seek relief in becomes a source of those very sensations you sought to avoid. The affliction feared and the putative relief provider are one and the same. It is a miserable existence to suffer so.

One of my longest-running procrastinations is close to the two-year mark now; this period has been particularly memorable–in all the wrong ways–because it has been marked by a daily ritual that consists of me saying ‘Tomorrow, I’ll start.’ (I normally go through this in the evening or late at night.) And on the day after, I wake up, decide to procrastinate again, and reassure myself that tomorrow is the day it will happen. As has been noted in the context of quitting vices, one of the reasons we persist in our habits is that we are able to convince ourselves that quitting, getting rid of the old habit, is easy. So we persist, indulging ourselves once more and reassuring ourselves of our imagined success in breaking out of the habit whenever we finally decide we are ready to do so. (But habits are habits for a reason: because they are deeply ingrained, because we practice them so, because we have made them near-instinctual parts of ourselves. And that is why, of course, new habits are hard to form, and old habits are hard to break.)

Similarly for procrastination: we continue to put off for the morrow because we imagine that when the morrow rolls around, we will find it easy not to put off, to get down to the business at hand. All that lets us do, of course, is continue to procrastinate today. The only thing put off till the morrow is the repetition of the same decision made today–the decision to defer yet again.

Now, if, as Aristotle said, we are what we repeatedly do, then I’m a procrastinator; I’m an irrational wallower in anxiety, condemning myself to long-term suffering for fear of a short-lived affliction. That is not a flattering description to entertain of oneself, but it is an apt one, given my history and my actions.