Resisting Big Data: Interfering With ‘Collaboration,’ Nonconsensually

Consider the various image-sharing databases online: Facebook’s photo stores, Instagram, Flickr. These contain trillions of photographs, petabytes of fragile digital data, growing daily, without limit; every day, millions of users worldwide upload the images they capture on their phones and cameras to the cloud, there to be stored, processed, enhanced, shared, tagged, commented on. And to be used as training data for facial recognition software–the stuff that identifies your ‘friends’ in your photos in case you want to tag them.

This gigantic corpus of data is a mere court-issued order away from being used by the nation’s law enforcement agencies to train their own facial surveillance software–to be used, for instance, in public space cameras, port-of-entry checks, correctional facilities, prisons, etc. (FISA courts can be relied upon to issue warrants in response to any law enforcement agency request; and internet service providers and media companies respond with great alacrity to government subpoenas.) Openly used and deployed, that is. With probability one, the NSA, FBI, and CIA have already ‘scraped’ these image data stores, using a variety of methods, and used them in the manner indicated. We have actively participated, and continue to participate, in the construction of the world’s largest and most sophisticated image surveillance system. We supply the data by which we may be identified; those who want to track our movements and locations use this data to ‘train’ their artificial agents to surveil us, to report on us if we misbehave, trespass, or don’t conform to whichever spatial or physical or legal or ‘normative’ constraint happens to direct us at any given instant. The ‘eye’ watches; it relies for its accuracy on what we have ‘told’ it, through our images and photographs.
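To make the mechanism concrete, here is a minimal sketch, in Python, of how tagged uploads could feed a face recognizer. Everything in it is illustrative: embed_face is a stand-in for a deep face-embedding network, the per-user photo collections are invented, and the classifier is an ordinary scikit-learn nearest-neighbor model, not any platform’s actual pipeline.

    import hashlib
    import numpy as np
    from pathlib import Path
    from sklearn.neighbors import KNeighborsClassifier

    def embed_face(photo: Path) -> np.ndarray:
        # Placeholder embedding: a real pipeline would run a deep
        # face-embedding network here. We derive a deterministic
        # pseudo-embedding from the filename so the sketch runs.
        seed = int(hashlib.sha256(str(photo).encode()).hexdigest()[:8], 16)
        return np.random.default_rng(seed).standard_normal(128)

    def train_recognizer(tagged_photos: dict) -> KNeighborsClassifier:
        # Every tag is a labeled example: your photos teach the model
        # what you look like and, implicitly, what everyone else does not.
        X, y = [], []
        for user, photos in tagged_photos.items():
            for photo in photos:
                X.append(embed_face(photo))
                y.append(user)
        clf = KNeighborsClassifier(n_neighbors=1)
        return clf.fit(np.stack(X), y)

    # Hypothetical per-user photo collections, as a tagging feature sees them.
    uploads = {
        "you":         [Path("you_1.jpg"), Path("you_2.jpg")],
        "your_friend": [Path("friend_1.jpg"), Path("friend_2.jpg")],
    }
    recognizer = train_recognizer(uploads)
    # Pointed at a public-space camera still, the same model becomes the 'eye'.
    print(recognizer.predict(embed_face(Path("you_1.jpg")).reshape(1, -1)))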

Now imagine a hacktivist programmer who writes a Trojan horse that infiltrates such photo stores and destroys all their data–permanently, for backups are also taken out. This is a ‘feat’ that is certainly technically possible; encryption will not prevent a drive from being formatted; and security measures of all kinds can be breached. Such an act of ‘hacktivism’ would be destructive; it would cause the loss of much ‘precious data’: memories and recollections of lives and the people who live them, all gone, irreplaceable. Such an act of destruction would be justified, presumably, on the grounds that to do so would be to cripple a pernicious system of surveillance and control. Remember that your photos don’t train image recognition systems to recognize just you; they also train them not to recognize someone else as you. Our collaboration does not just hurt us; it hurts others; we are complicit in the surveillance and control of others.

I paint this admittedly unlikely scenario to draw attention to a few interesting features of our data collection and analysis landscape: a) we participate, by conscious action and political apathy, in the construction and maintenance of our own policing; b) we are asymmetrically exposed, because our surveillers enjoy maximal secrecy while we can draw on none; c) collective, organized resistance is so difficult to generate that the most effective political action might be a quasi-nihilist act of loner ‘civil disobedience’–if you do not cease and desist from ‘collaborating,’ the only choice left to others still concerned about their freedom from surveillance might be to nonconsensually interrupt such collaboration.

Proprietary Software And Our Hackable Elections

Bloomberg reports that:

Russia’s cyberattack on the U.S. electoral system before Donald Trump’s election was far more widespread than has been publicly revealed, including incursions into voter databases and software systems in almost twice as many states as previously reported. In Illinois, investigators found evidence that cyber intruders tried to delete or alter voter data. The hackers accessed software designed to be used by poll workers on Election Day, and in at least one state accessed a campaign finance database….the Russian hackers hit systems in a total of 39 states

In Decoding Liberation: The Promise of Free and Open Source Software, Scott Dexter and I wrote:

Oversight of elections, considered by many to be the cornerstone of modern representational democracies, is a governmental function; election commissions are responsible for generating ballots; designing, implementing, and maintaining the voting infrastructure; coordinating the voting process; and generally insuring the integrity and transparency of the election. But modern voting technology, specifically that of the computerized electronic voting machine that utilizes closed software, is not inherently in accord with these norms. In elections supported by these machines, a great mystery takes place. A citizen walks into the booth and “casts a vote.” Later, the machine announces the results. The magical transformation from a sequence of votes to an electoral decision is a process obscure to all but the manufacturers of the software. The technical efficiency of the electronic voting process becomes part of a package that includes opacity and the partial relinquishing of citizens’ autonomy.

This “opacity” has always meant that the software used to, quite literally, keep our democracy running has its quality and operational reliability vetted, not by the people or their chosen representatives, but only by the vendor selling the code to the government. There is no possibility of, say, a fleet of ‘white-hat’ hackers–concerned citizens–putting the voting software through its paces, checking for security vulnerabilities and points of failure: the kinds that hostile ‘black-hat’ hackers, working for a foreign entity like Russia, could exploit. These concerns are not new.

Dexter and I continue:

The plethora of problems attributed to the closed nature of electronic voting machines in the 2004 U.S. presidential election illustrates the ramifications of tolerating such an opaque process. For example, 30 percent of the total votes were cast on machines that lacked ballot-based audit trails, making accurate recounts impossible….these machines are vulnerable to security hacks, as they rely in part on obscurity….Analyses of code very similar to that found in these machines reported that the voting system should not be used in elections as it failed to meet even the most minimal of security standards.

There is a fundamental political problem here:

The opaqueness of these machines’ design is a secret compact between governments and manufacturers of electronic voting machines, who alone are privy to the details of the voting process.

The solution, unsurprisingly, is one that calls for greater transparency; the use of free and open source software–which can be copied, modified, shared, and distributed by anyone–emerges as an essential requirement for electronic voting machines.

The voting process and its infrastructure should be a public enterprise, run by a non-partisan Electoral Commission with its operational procedures and functioning transparent to the citizenry. Citizens’ forums demand open code in electoral technology…that vendors “provide election officials with access to their source code.” Access to this source code provides the polity an explanation of how voting results are reached, just as publicly available transcripts of congressional sessions illustrate governmental decision-making. The use of FOSS would ensure that, at minimum, technology is held to the same standards of openness.
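To make concrete what ‘open code’ buys here, consider a minimal, hypothetical sketch in Python: a tally anyone can re-run, over a hash-chained ballot record that makes silent alteration detectable. The record format is invented for illustration; it is not any vendor’s or election commission’s actual design.

    import hashlib
    from collections import Counter

    def append_ballot(chain: list, choice: str) -> None:
        # Each record's digest folds in the previous one, so a ballot
        # silently altered after the fact breaks every later digest.
        prev = chain[-1]["digest"] if chain else "genesis"
        digest = hashlib.sha256(f"{prev}|{choice}".encode()).hexdigest()
        chain.append({"choice": choice, "digest": digest})

    def verify_chain(chain: list) -> bool:
        # Any citizen (or 'white-hat' auditor) can recompute every link.
        prev = "genesis"
        for entry in chain:
            expected = hashlib.sha256(f"{prev}|{entry['choice']}".encode()).hexdigest()
            if entry["digest"] != expected:
                return False
            prev = expected
        return True

    def tally(chain: list) -> Counter:
        # With open code, the transformation from votes to totals is
        # inspectable by all, not just the manufacturer.
        return Counter(entry["choice"] for entry in chain)

    # Usage: record a few ballots, verify the record, count the votes.
    chain: list = []
    for vote in ["alice", "bob", "alice"]:
        append_ballot(chain, vote)
    assert verify_chain(chain)
    print(tally(chain))  # Counter({'alice': 2, 'bob': 1})

The hash chain matters less than the fact that verify_chain and tally are public: the transformation from votes to totals stops being a ‘great mystery’ when any citizen can read, run, and audit the code.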

So long as our voting machines run secret, proprietary software, our electoral process remains hackable–not just by Russian hackers but also by anyone who wishes to subvert the process to help realize their own political ends.

Apple’s ‘Code Is Speech’ Argument, The DeCSS Case, And Free Software

In its ongoing battle with federal law enforcement agencies over its refusal to unlock an iPhone, Apple has mounted a ‘Code is Speech’ defense, arguing that “the First Amendment prohibits the government from compelling Apple to make code.” This has provoked some critical commentary, including an article by Neil Richards, which argues that Apple’s argument is “dangerous.”

Richards alludes to some previous wrangling over the legal status of computer code, but does not name names. Here is an excerpt from my book Decoding Liberation: The Promise of Free and Open Source Software (co-authored with Scott Dexter) that makes note of a relevant court decision and offers arguments for treating code as speech protected under the First Amendment. (To fully flesh out these arguments in their appropriate contexts, do read Chapters 4 and 5 of Decoding Liberation. I’d be happy to mail PDFs to anyone interested.)

Michelle Maltais’ Cyber-Weapon Fantasy About ‘War Without Bloodshed’

What is it about technology that makes so many, warriors and armchair enthusiasts alike, imagine that it will make war, somehow, less bloody, less brutal, less inhumane? That never-ending and most curious of seductions is again visibly on display in Michelle Maltais’ article ‘Cyber Missiles Mean War Without Bloodshed’ (Los Angeles Times, June 2, 2012). Like most demonstrations of this destined-to-be-benighted hope, it is equal parts laughable naiveté, dangerous cynicism, and unquestioning acceptance of the pronouncements of techno-optimists.

Maltais begins with a line that should have given her pause:

What do you need to disrupt nuclear facilities of your enemy? A thumb drive.

Maltais imagines, as she would like us to, I’m sure, that disruption of nuclear facilities merely means their peaceful grinding to a halt. But what if that disruption entails a catastrophic chain reaction instead? Or perhaps some other mishap that releases toxic, radioactive materials? The fallacy here is to imagine that the cyber-weapon will work precisely as intended, calmly, sanguinely, operating without collateral damage. But war, remember, is that place where, always without fail, ‘the best-laid schemes of men gang aft agley.’ This precautionary note could be cited for almost every single imagined use of cyber weapons.

But the precautionary note need not always concern cyber-weapon malfunction. Maltais quotes ‘Phil Lieberman, a security consultant and chief executive of Lieberman Software in Los Angeles’ as saying:

You’re seeing an evolution of warfare that’s really intriguing…[W]arfare where no one is dying.

Fallacy Numero Dos: Cyber weapons may not merely conduct disruptive warfare. Perhaps they could make guided missiles go awry, make planes crash, or bring about any number of other catastrophic failures in systems whose guidance software is susceptible to invasive hacking. These might entail loss of human life as well.

These remarks, however, are overshadowed by what might be the central confusion implicitly on display in the orgy of technophilic presumption that runs through Maltais’ article: that cyber-attacks will be responded to with acquiescence, or with similarly oriented weapons, thus conjuring up an image of a world populated by belligerents content to merely knock out each other’s communication systems–and similar targets–happily trading software-coded potshots at each other.

Au contraire. Cyber attacks, if sufficiently potent, are likely to be considered casus belli by those on the receiving end. Their mode and method of retaliation might not be of the cyber variety. It might, you know, involve weapons that go boom, that shred skin and flesh and bone, and, yes, cause bloodshed. Maltais and those she quotes imagine that software will be met with software. But warfare is often asymmetrical. Do we–in this time and age, in this era of the suicide bomber, the IED, the nail bomb, and the improvised Molotov cocktail, all of which daily go up against awesome, mechanized and computerized armies–really need to be reminded of that? Perhaps we do. And perhaps we also need the rude awakening that only war can bring–because apparently, when it comes to war, nothing dispels technological fantasies like those currently on display quite like the shedding of real blood and the return home of not-to-be-photographed body bags.