Tesla’s ‘Irma Update’ Shows The Dangers Of Proprietary Software

By now, you know the story. Tesla magically (remotely) updated the software of its cars during Hurricane Irma:

Tesla remotely sent a free software update to some drivers across Florida over the weekend, extending the battery capacity of cars and giving extra range to those fleeing Hurricane Irma.

According to reports, the update temporarily unlocked the full-battery potential for 75-kilowatt-hour Model S sedans and Model X SUVs, adding around 30 to 40 miles to their range.

“Cars with a 75-kilowatt-hour battery pack were previously software limited to 210 miles of driving range per single charge and will now get 249 miles, the full range capacity of the battery,” the company wrote on a blog.

As is evident from this description, the software regulating battery life is ‘autonomous’ of the user; the user cannot change it or tweak it in any way to reflect changing needs or driving conditions (like, say, the need to drive to a distant point in order to escape a potentially life-threatening change in the weather). In short, the software that runs on Tesla’s cars is not ‘free’–not in the sense that you have to pay money for it, but in the sense that you cannot do what you, as the user of the software, might want to do with it: share it, copy it, modify it. If users need ‘help,’ they must wait for the benevolent corporation to come to their aid.

We, as software users, are used to this state of affairs. Most of the software we use is indeed not ‘free’ in this sense: the source code is kept a trade secret and cannot be inspected to figure out how it does what it does; the binary executables are copyrighted and cannot be copied; and the software’s algorithms are patented. You cannot read the code, you cannot change it to better reflect your needs, and you cannot make copies of something you ‘own’ to give it to others who might need it. As software users eventually come to realize, you don’t ‘own’ proprietary software in the traditional sense of the term; you license it for a limited period of time, subject to many constraints, some reasonable, others not.

In an interview with 3:AM Magazine about my book Decoding Liberation: The Promise of Free and Open Source Software, I noted some of the political implications of the way software is regulated by law. The following exchange sums up the issues at play:

3:AM: One aspect of the book that was particularly interesting to me was your vision of a world full of code, a cyborg world where ‘distinctions between human and machine evanesce’ and where ‘personal and social freedoms in this domain are precisely the freedoms granted or restricted by software.’ Can you say something about what you argued for there?

SC: I think what we were trying to get at was that it seemed the world was increasingly driven by software, which underwrote a great deal of the technology that extends us and makes our cyborg selves possible. In the past, our cyborg selves were constructed by things like eyeglasses, pencils, abacuses and the like—today, by smartphones, wearable computers, tablets and other devices like them. These are all driven by software. So our extended mind, our extended self, is very likely to be largely a computational device. Who controls that software? Who writes it? Who can modify it? Look at us today, tethered to our machines, unable to function without them, using software written by someone else. How free can we be if we don’t have some very basic control over this technology? If the people who write the software are the ones who have exclusive control over it, then I think we are giving up some measure of freedom in this cyborg society. Remember that we can enforce all sorts of social control over people by writing it into the machines that they use for all sorts of things. Perhaps our machines of tomorrow will come with porn filters embedded in the code that we cannot remove; perhaps with code in the browsers that marks off portions of the Net as forbidden territory; perhaps our reading devices will not let us read certain books; perhaps our smartphones will not let us call certain numbers; perhaps prosthetic devices will not function in ‘no-go zones’; perhaps the self-driving cars of tomorrow will not let us drive faster than a certain speed; the control possibilities are endless. The more technologized we become and the more control we hand over to those who can change the innards of the machines, the less free we are. What are we to do? Just comply? This all sounds very sci-fi, but then, so would most of contemporary computing to folks fifty years ago. We need to be in charge of the machines that we use, that are our extensions.

We, in short, should be able to hack ourselves.

Tesla’s users were not free during Irma; they were at the mercy of the company, which in this case, came to their aid. Other users, of other technologies, might not be so fortunate; they might not be the masters of their destiny.

Report On Brooklyn College Teach-In On ‘Web Surveillance And Security’

Yesterday, as part of ‘The Brooklyn College Teach-In & Workshop Series on Resistance to the Trump Agenda,’ I facilitated a teach-in on the topic of ‘web surveillance and security.’ During my session I made note of some of the technical and legal issues at play in these domains, and how technology and law have conspired to ensure that: a) we live in a regime of constant, pervasive surveillance; b) current legal protections–including the disastrous ‘third-party doctrine‘ and the rubber-stamping of governmental surveillance ‘requests’ by FISA courts–are simply inadequate to safeguard our informational and decisional privacy; c) there is no daylight between the government and large corporations in their use and abuse of our personal information. (I also pointed my audience to James Grimmelmann‘s excellent series of posts on protecting digital privacy, which began the day after Donald Trump was elected and continued right up to the inauguration. In those posts, Grimmelmann links to ‘self-defense’ resources provided by the Electronic Frontier Foundation and Ars Technica.)

I began my talk by describing how the level of surveillance desired by secret police organizations of the past–like the East German Stasi, for instance–was now available to the NSA, CIA, and FBI, because of social networking systems; our voluntary provision of every detail of our lives to these systems is a spook’s delight. For instance, the photographs we upload to Facebook will, eventually, make their way into the gigantic corpus of learning data used by law enforcement agencies’ facial recognition software.

During the ensuing discussion I remarked that traditional activism directed at increasing privacy protections–or the enacting of ‘self-defense’ measures–should be part of a broader strategy aimed at reversing the so-called ‘asymmetric panopticon‘: citizens need to demand ‘surveillance’ in the other direction, back at government and corporations. For the former, this would mean pushing back against the current classification craze, which sees an increasing number of documents marked ‘Secret,’ ‘Top Secret,’ or some other risible security level–and which results in absurd sentences being levied on those who, like Chelsea Manning, violate such constraints; for the latter, this entails demanding that corporations offer greater transparency about their data collection, usage, and analysis–and are not able to easily rely on the protection of trade secret law in claiming that these techniques are ‘proprietary.’ This ‘push back,’ of course, relies on changing the nature of the discourse surrounding governmental and corporate secrecy, which is all too often able to offer facile arguments that link secrecy and security, or secrecy and business strategy. In many ways, this might be the most onerous challenge of all; all too many citizens are still persuaded by the ludicrous ‘if you’ve done nothing illegal you’ve got nothing to hide’ and ‘knowing everything about you is essential for us to keep you safe (or sell you goods)’ arguments.

Note: After I finished my talk and returned to my office, I received an email from one of the attendees who wrote: