Neil deGrasse Tyson And The Perils Of Facile Reductionism

You know the shtick by now–or at least, twitterers and tweeters do. Every few weeks, Neil deGrasse Tyson, one of America’s most popular public ‘scientific’ intellectuals, decides that it is time to describe some social construct in scientific language to show how ‘arbitrary’ and ‘made-up’ it all is–compared to the sheer factitude, the amazing reality-grounded non-arbitrariness, of scientific knowledge. Consider, for instance, this latest gem, now provoking ridicule from those who found its issuance predictable and tired:

Not that anybody’s asked, but New Years Day on the Gregorian Calendar is a cosmically arbitrary event, carrying no Astronomical significance at all.

A week earlier, Tyson had tweeted:

Merry Christmas to the world’s 2.5 billion Christians. And to the remaining 5 billion people, including Muslims Atheists Hindus Buddhists Animists & Jews, Happy Monday.

Tyson, I think, imagines that he is bringing science to the masses; that he is dispelling the ignorance cast by the veil of imprecise, arbitrary, subjective language that ‘ordinary folk’ use by directing their attention to scientific language, which, when used, shows how ridiculous those ‘ordinary folk’ affectations are. Your birthday? Just a date. That date? A ‘cosmically arbitrary event.’ Your child’s laughter? Just sound waves colliding with your eardrum. That friendly smile beamed at you by your schoolmate? Just facial muscles being stretched. And so on. It’s really easy; almost mechanical. I could, if I wanted, set up a bot-run Neil deGrasse Tyson Parody account on Twitter and have it issue these every once in a while. Easy pickings.

Does Tyson imagine that he is engaging in some form of ‘scientific communication’ here, bringing science to the masses? Does he imagine he is introducing greater precision and fidelity to truth into our everyday conversation and discourse, cleaning up the degraded Augean stables of internet chatter? He might think so, but what Tyson is actually engaged in is displaying the perils of facile reductionism and the scientism it invariably accompanies and embellishes: anything can be redescribed in scientific language, but that does not mean such redescription is necessary or desirable or even moderately useful. All too often such redescription means we are no longer talking about the ‘same thing’ at all. (All that great literature? Just ink on paper! You know, a chemical pigment on a piece of treated wood pulp.)

There are many ways of talking about the world; science is one of them. Science lets us do many things; other ways of talking about the world let us do other things. Scientific language is a tool; it lets us solve some problems really well; other languages–like those of poetry, psychology, literature, legal theory–help us solve others. The views of this world they introduce show us many things; different objects appear in different views depending on the language adopted. As a result, we are ‘multi-scopic’ creatures; at any time, we entertain multiple perspectives on this world and work with them, shifting between them as our wants and needs require. To figure out what clothes to wear today, I consulted the resources of meteorology; to get a fellow human being to come to my aid, I used elementary folk psychology, not neuroscience; to crack a joke and break the ice with co-workers, I relied on humor that deployed imaginary entities. Different tasks; different languages; different tools: this is the basis of the pragmatic attitude, which underwrites the science that Tyson claims to revere.

Tyson has famously dissed the philosophy of science, and indeed philosophy in general; his tweeting shows that he would greatly benefit from a philosophy class or two himself.

Contra Cathy O’Neil, The ‘Ivory Tower’ Does Not ‘Ignore Tech’

In ‘Ivory Tower Cannot Keep On Ignoring Tech,’ Cathy O’Neil writes:

We need academia to step up to fill in the gaps in our collective understanding about the new role of technology in shaping our lives. We need robust research on hiring algorithms that seem to filter out people with mental health disorders…we need research to ensure that the same mistakes aren’t made again and again. It’s absolutely within the abilities of academic research to study such examples and to push against the most obvious statistical, ethical or constitutional failures and dedicate serious intellectual energy to finding solutions. And whereas professional technologists working at private companies are not in a position to critique their own work, academics theoretically enjoy much more freedom of inquiry.

There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology — and specifically, the algorithms that are responsible for so many decisions — in our lives. That’s not surprising. Which academic department is going to give up a valuable tenure line to devote to this, given how much academic departments fight over resources already?

O’Neil’s piece is an unfortunate continuation of a trend: castigating academia for its lack of social responsibility, all the while ignoring the work academics do in precisely those domains where their absence is supposedly felt.

In her Op-Ed, O’Neil ignores science and technology studies, a field of study that “takes seriously the responsibility of understanding and critiquing the role of technology,” and many of whose members are engaged in precisely the kind of studies she thinks should be undertaken at this moment in the history of technology. Moreover, there are fields of academic study such as the philosophy of science, the philosophy of technology, and the sociology of knowledge, all of which take very seriously the task of examining and critiquing the conceptual foundations of science and technology; such inquiries are not merely elucidatory, they are very often critical and skeptical. Such disciplines, then, produce work that makes both descriptive and prescriptive claims about the practice of science, and about the social, political, and ethical values that underwrite what may seem like purely ‘technical’ decisions pertaining to design and implementation. The humanities are not alone in this regard: most computer science departments now require a class in ‘Computer Ethics’ as part of the requirements for their major. (Indeed, I designed one such class here at Brooklyn College and taught it for a few semesters.) And of course, legal academics have, in recent years, started to pay attention to these fields and to incorporate them into their writings on ‘algorithmic decision-making,’ ‘algorithmic control,’ and so on. (The work of Frank Pasquale and Danielle Citron is notable in this regard.) If O’Neil is interested, she could dig deeper into the philosophical canon and read works by critical theorists like Herbert Marcuse and Max Horkheimer, who mounted rigorous critiques of scientism, reductionism, and positivism. Lastly, O’Neil could read my co-authored work Decoding Liberation: The Promise of Free and Open Source Software, a central claim of which is that transparency, not opacity, should be the guiding principle for software design and deployment. I’d be happy to send her a copy if she so desires.