Resisting Big Data: Interfering With ‘Collaboration,’ Nonconsensually

Consider the various image-sharing databases online: Facebook’s photo stores, Instagram, Flickr. These contain trillions of photographs, petabytes of fragile digital data, growing daily, without limit; every day, millions of users worldwide upload the images they capture on their phones and cameras to the cloud, there to be stored, processed, enhanced, shared, tagged, commented on. And to be used as learning data for facial recognition software–the stuff that identifies your ‘friends’ in your photos in case you want to tag them.

This gigantic corpus of data is a mere court-issued order away from being used by the nation’s law enforcement agencies to train their own facial surveillance software–to be used, for instance, in public space cameras, port-of-entry checks, correctional facilities, prisons, etc. (FISA courts can be relied upon to issue warrants in response to any law enforcement agency’s request; and internet service providers and media companies respond with great alacrity to government subpoenas.) Openly used and deployed, that is. With probability one, the NSA, FBI, and CIA have already ‘scraped’ these image data stores, using a variety of methods, and used them in the manner indicated. We have actively participated and collaborated, and continue to do so, in the construction of the world’s largest and most sophisticated image surveillance system. We supply the data by which we may be identified; those who want to track our movements and locations use this data to ‘train’ their artificial agents to surveil us, to report on us if we misbehave, trespass, or don’t conform to whichever spatial or physical or legal or ‘normative’ constraint happens to direct us at any given instant. The ‘eye’ watches; it relies for its accuracy on what we have ‘told’ it, through our images and photographs.
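
To make the mechanism concrete, here is a minimal, hypothetical sketch of how tagged photos become labeled training data for a face-identification model. The names, the toy embedding function, and the nearest-neighbor classifier are all invented for illustration; real systems use deep networks and corpora vastly larger than this, but the logic–your tags supply the labels, your photos supply the examples–is the same.

```python
# Hypothetical sketch only: every photo a user tags is, in effect, a labeled
# training example for a face-identification model. The embedding function
# below is a stand-in for a real face-recognition network.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def face_embedding(person):
    # Pretend each person's face maps to a characteristic 128-dimensional
    # vector, plus some photo-to-photo noise.
    centers = {"alice": 0.0, "bob": 5.0}
    return centers[person] + rng.normal(scale=0.3, size=128)

# Users upload and tag photos; tags become labels, photos become examples.
labels, embeddings = [], []
for person in ("alice", "bob"):
    for _ in range(20):
        labels.append(person)
        embeddings.append(face_embedding(person))

model = KNeighborsClassifier(n_neighbors=3).fit(embeddings, labels)

# A new, untagged image -- say, a frame from a public-space camera -- can now
# be matched against the people the uploaded photos 'told' the model about.
print(model.predict([face_embedding("alice")]))  # -> ['alice']
```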

Now imagine a hacktivist programmer who writes a Trojan horse that infiltrates such photo stores and destroys all their data–permanently, for backups are also taken out. This is a ‘feat’ that is certainly technically possible; encryption will not prevent a drive from being formatted; and security measures of all kinds can be breached. Such an act of ‘hacktivism’ would be destructive; it would cause the loss of much ‘precious data’: memories and recollections of lives and the people who live them, all gone, irreplaceable. Such an act of destruction would be justified, presumably, on the grounds that to do so would be to cripple a pernicious system of surveillance and control. Remember that your photos don’t train image recognition systems to recognize just you; they also train them not to recognize someone else as you; our collaboration does not just hurt us, it hurts others; we are complicit in the surveillance and control of others.

I paint this admittedly unlikely scenario to draw attention to a few interesting features of our data collection and analysis landscape: a) we participate, by conscious action and political apathy, in the construction and maintenance of our own policing; b) we are asymmetrically exposed because our surveillers enjoy maximal secrecy while we can draw on none; c) collective, organized resistance is so difficult to generate that the most effective political action might be a quasi-nihilist act of loner ‘civil disobedience’–if you do not cease and desist from ‘collaborating,’ the only choice left to others still concerned about their freedom from surveillance might be to nonconsensually interrupt such collaboration.

Report On Brooklyn College Teach-In On ‘Web Surveillance And Security’

Yesterday, as part of ‘The Brooklyn College Teach-In & Workshop Series on Resistance to the Trump Agenda,’ I facilitated a teach-in on the topic of ‘web surveillance and security.’ During my session I made note of some of the technical and legal issues at play in these domains, and how technology and law have conspired to ensure that: a) we live in a regime of constant, pervasive surveillance; b) current legal protections–including the disastrous ‘third-party doctrine’ and the rubber-stamping of governmental surveillance ‘requests’ by FISA courts–are simply inadequate to safeguard our informational and decisional privacy; c) there is no daylight between the government and large corporations in their use and abuse of our personal information. (I also pointed my audience to James Grimmelmann’s excellent series of posts on protecting digital privacy, which began the day after Donald Trump was elected and continued right up to the inauguration. In those posts, Grimmelmann links to ‘self-defense’ resources provided by the Electronic Frontier Foundation and Ars Technica.)

I began my talk by describing how the level of surveillance desired by secret police organizations of the past–like the East German Stasi, for instance–was now available to the NSA, CIA, and FBI because of social networking systems; our voluntary provision of every detail of our lives to these systems is a spook’s delight. For instance, the photographs we upload to Facebook will, eventually, make their way into the gigantic corpus of learning data used by law enforcement agencies’ facial recognition software.

During the ensuing discussion I remarked that traditional activism directed at increasing privacy protections–or the enacting of ‘self-defense’ measures–should be part of a broader strategy aimed at reversing the so-called ‘asymmetric panopticon’: citizens need to demand ‘surveillance’ in the other direction, back at government and corporations. For the former, this would mean pushing back against the current classification craze, which sees an increasing number of documents marked ‘Secret,’ ‘Top Secret,’ or some other risible security level–and which results in absurd sentences being levied on those who, like Chelsea Manning, violate such constraints; for the latter, this entails demanding that corporations offer greater transparency about their data collection, usage, and analysis–and are not able to easily rely on the protection of trade secret law in claiming that these techniques are ‘proprietary.’ This ‘push back,’ of course, relies on changing the nature of the discourse surrounding governmental and corporate secrecy, which is all too often able to offer facile arguments that link secrecy and security, or secrecy and business strategy. In many ways, this might be the most onerous challenge of all; all too many citizens are still persuaded by the ludicrous ‘if you’ve done nothing illegal you’ve got nothing to hide’ and ‘knowing everything about you is essential for us to keep you safe (or sell you goods)’ arguments.

Note: After I finished my talk and returned to my office, I received an email from one of the attendees who wrote:

Self-Policing In Response To Pervasive Surveillance

On Thursday night, in the course of conversation with some of my Brooklyn College colleagues, I confessed to having internalized a peculiar sort of ‘chilling effect’ induced by a heightened sensitivity to our modern surveillance state. To wit, I said something along the lines of “I would love to travel to Iran and Pakistan, but I’m a little apprehensive about the increased scrutiny that would result.” When pressed to clarify by my companions, I made some rambling remarks that roughly amounted to the following. Travel to Iran and Pakistan–Islamic nations highly implicated in various foreign policy imbroglios with the US and often accused of supporting terrorism–is highly monitored by national law enforcement and intelligence agencies (the FBI, CIA, NSA); I expected to encounter some uncomfortable moments on my arrival back in the US thanks to questioning by customs and immigration officers (with a first name like mine–which is not Arabic in origin but is, in point of fact, a very common and popular name in the Middle East–I would expect nothing less). Moreover, given the data point that my wife is Muslim, I would expect such attention to be heightened (data mining algorithms would establish a ‘networked’ connection between us, and given my wife’s own problems when flying, I would expect such a connection to make me appear more ‘suspicious’); thereafter, I could expect increased scrutiny every time I traveled (and perhaps in other walks of life, given the extent of data sharing between various governmental agencies).
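
To illustrate the kind of ‘networked’ link analysis I was gesturing at, here is a minimal, hypothetical sketch: a toy graph of people connected by households, workplaces, or phone records, in which extra scrutiny propagates to the neighbors of anyone already flagged. The names, relationships, and scoring rule are all invented for illustration; real systems are vastly more elaborate, and entirely opaque.

```python
# Hypothetical sketch: a toy relationship graph in which extra scrutiny
# propagates to anyone directly connected to an already-flagged person.
# All names, edges, and rules here are invented for illustration.
import networkx as nx

G = nx.Graph()
G.add_edge("author", "spouse", relation="household")
G.add_edge("author", "colleague", relation="employer")
G.add_edge("spouse", "relative_abroad", relation="phone")

# Someone who already attracts extra screening when flying.
already_flagged = {"spouse"}

def scrutiny_score(person):
    # Crude rule: one point per directly connected flagged contact.
    return sum(1 for neighbor in G.neighbors(person) if neighbor in already_flagged)

for person in G.nodes:
    if person not in already_flagged and scrutiny_score(person) > 0:
        print(f"{person}: selected for additional screening "
              f"({scrutiny_score(person)} flagged contact(s))")
```

Even this crude rule selects both the ‘author’ and the ‘relative_abroad’ for attention solely because of whom they are connected to, which is the point of the anecdote above.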

It is quite possible that all of the above sounds extremely paranoid and misinformed, and my worries a little silly, but I do not think it is wholly devoid of truth. The technical details are not too wildly off the mark; increased scrutiny after such travel is a common occurrence for many travelers deemed ‘suspicious’ for unknown reasons; and so on. The net result is a curious sort of self-policing on my part: as I look to make travel plans for the future I will, with varying degrees of self-awareness about my motivations, prefer other destinations and locales. I will have allowed myself to be subject to an invisible set of constraints not actually experienced (except indirectly, in part, as in my wife’s experiences when flying).

This sort of ‘indirect control’ might be pervasive surveillance’s most pernicious effect.

Note: My desire to travel to Iran and Pakistan is grounded in some fairly straightforward desires: Iran is a fascinating and complex country, host to an age-old civilization, with a rich culture and a thriving intellectual and academic scene; Pakistan is of obvious interest to someone born in India, but even more so to someone whose ethnic background is Punjabi, for part of the partitioned Punjab is now in Pakistan (as I noted in an earlier post about my ‘passing for Pakistani,’ “my father’s side of the family hails from a little village–now a middling town–called Dilawar Cheema, now in Pakistan, in Gujranwala District, Tehsil Wazirabad, in the former West Punjab.”)

Is “Black Lives Matter” Aiding And Abetting Criminals?

This is a very serious question and deserves a serious answer. It is so serious that the New York Times has asked: Is “police reticence in the face of such protests, some led by groups like Black Lives Matter, causing crime to rise in some cities”? The first answers are in. Those honorable folk, “the heads of the F.B.I. and the Drug Enforcement Administration said they believed that this so-called Ferguson Effect seemed to be real.” (The Ferguson Effect, which sounds like an atmospheric condition that produces high winds and heavy rain, is capable of creating law and order crises.)

In general, whenever black folk get uppity, crime increases. See, for instance, the wave of crime that spread through the American Deep South after the Civil War during the Reconstruction Era when freed slaves went on a rampage, killing, raping, and looting. Some folks blame that on white racists worried about the imbalance in the old power equations of the American South, but we should remind ourselves that the folks conducting those terrorist campaigns were riding around on horses while wearing white robes and hoods, so we will never, I mean never, know whether they were white or not.

We need not debate this question for too long. The FBI and the DEA–fine, upstanding defenders of civil liberties, and really, the first folks we should check in with when it’s time to evaluate political protest conducted by minorities–would never speak falsely on such matters. Besides, they have better things to do–like entrapping young Muslims in terrorist plots, arresting folks smoking that dangerous chemical, marijuana, and listening to the phone conversations of, and reading the emails of, American citizens. (Some pedant will say I should be talking about the NSA, but in this post-9/11 intelligence-sharing era, what’s the difference?)

We should be curious, though, about what such “police reticence” amounts to. Perhaps it means the following. Police officers will not be able to: fire sixteen bullets–known as ‘emptying a clip’, I’m told–at black teenagers walking on a highway, even ones with knives; come scrambling out of a car and begin firing, assaulting-a-Pacific-Beach style, at a twelve-year-old playing with a toy gun in a children’s playground; shoot black men in wheelchairs; drive around a city with a ‘suspect’ in a paddy wagon, and then beat him to death; place sellers of illicit cigarettes in fatal strangleholds; shoot black men in the back, whether during an undercover drug sting or after a traffic stop; shoot black men who have knocked on doors seeking help; search, randomly and roughly, hundreds of thousands of young black men and women in their neighborhoods for looking suspicious.

The ultimate ramifications of such handicapping of our armed forces–sorry, police–are, as yet, only poorly understood, but the contours of the resultant landscape are perhaps visible. Black folks will once again walk the streets; they will stay out late at night; they will go into white neighborhoods and mingle with the populace there. Of all the chilling effects of this new police caution, the last one, surely, is the most chilling. Black folks will be set free among us. The horror.

The NSA’s Bullrun Around Encryption

A few weeks ago, over at The Washington Spectator, I wrote a post on the NSA, which mentioned its historical–and historic–struggles with the pioneers of encryption:

[W]hen the NSA got wind of academic research on cryptography, its agents approached those working on such research and “suggested” that all such research be vetted by the NSA. Roughly, the NSA’s instructions to encryption researchers were: keep us apprised of what you are doing and run it by us for clearance before you release it to other academics.

It might have been the first time that a powerful covert government agency had suggested that academic research be controlled and monitored in this fashion: the NSA wanted nothing less than a monopoly on cryptography research. Given the NSA’s resistance to encryption reaching the masses, it’s a miracle we have it facilitating e-commerce today.

…[T]he NSA [and] the FBI…became more aggressive in attempting to prosecute those who made encryption software public.

For instance, the 1991 release of PGP (Pretty Good Privacy), a data encryption tool by developer Phil Zimmermann, was regarded as the “export” of a deadly weapon. It triggered a criminal investigation of Zimmermann that was ultimately dropped without charges.

…We should not imagine that because the battle to bring encryption and privacy to the masses was won in the past that all future battles will be.

And today, I awoke to read this:

The National Security Agency is winning its long-running secret war on encryption, using supercomputers, technical trickery, court orders and behind-the-scenes persuasion to undermine the major tools protecting the privacy of everyday communications in the Internet age, according to newly disclosed documents.

The agency has circumvented or cracked much of the encryption, or digital scrambling, that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, and automatically secures the e-mails, Web searches, Internet chats and phone calls of Americans and others around the world, the documents show.

…Beginning in 2000, as encryption tools were gradually blanketing the Web, the N.S.A. invested billions of dollars in a clandestine campaign to preserve its ability to eavesdrop. Having lost a public battle in the 1990s to insert its own “back door” in all encryption, it set out to accomplish the same goal by stealth.

The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products.

This is perhaps the most stunning revelation to have come from Edward Snowden yet. Privacy advocates have always suggested the use of encryption as a privacy-enhancing tool; these revelations show the NSA is winning the battle against it as well.
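
For readers who have never looked under the hood, here is a minimal sketch of the kind of public-key encryption that tools like PGP brought to the masses; it uses the open-source Python ‘cryptography’ library purely for illustration and is not a description of any particular product named in the reporting. The value of such a scheme rests entirely on the private key, and on the software handling it, remaining uncompromised–which is precisely what ‘entry points’ built into products undermine.

```python
# Illustrative sketch of public-key encryption using the open-source
# 'cryptography' library; key size and padding are typical choices.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient generates a key pair; the public key can be shared openly,
# the private key must stay secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can encrypt; only the private key decrypts.
message = b"a perfectly ordinary, private communication"
ciphertext = public_key.encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message
```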

The NSA has now marked itself out as a truly distinctive agency: one that will stop at no measure–legal or not–to achieve its goal of complete surveillance. The almost perfectly asymmetrical relationship with secrecy that it has demanded, and often successfully created, has been one of its most astonishing achievements. This latest effort shows just how far it is willing to go.

Thus far, I’ve only read two news reports on Bullrun, the NSA’s anti-encryption program; I hope to write more on it once I’ve had a chance to read more about its details.

Drop The Whistle; Shoot A Black Kid Instead (or Torture Prisoners)

Chelsea Manning has been sentenced to jail for thirty-five years for committing the heinous crime of whistleblowing. Manning knows that she didn’t just commit a crime; she committed the wrong sort of crime:

Manning spoke to reporters after the hearing to admit [her] disappointment at the sentence, telling those gathered, “I look back to that fateful day and wish I’d just left those files on my computer and gone out and shot a black kid instead….my life would be a lot less complicated if I’d only taken the life of a young person from a different ethnic background, instead of sending some documents to a website.”

Legal experts have expressed support for the 35-year sentence given to Manning, by explaining that members of the public don’t actually understand how the law works. Former lawyer Simon Williams explained, “Illegally taking information you don’t have the right to access, and using it for your own purposes, is only ‘properly’ illegal if you’re not a government agency. Governments can do what they like with information they’re not allowed to have–if nothing else, Prism has taught us that. Whereas absolutely anyone can shoot a black teenager to death, obviously.”

Manning has learnt these simple facts the hard way, but that doesn’t mean that those young folks who have been following her trial have to as well. They will, in particular, have hopefully internalized the following facts about the system of justice prevalent in this great nation of ours.

First, a career in high finance can ensure the penalty-free satisfaction of desires, even if their fulfillment runs afoul of ethical and legal considerations. If unbridled earning with no regard for the immiseration of others is your thing, then you will do well to pay attention to the so-called financial crisis of 2008 and its aftermath. Giddy, reckless speculation and irresponsible, unregulated banking will never, ever get you sent to jail. This career option is best for those seeking to maximize income while not being unduly worried about penalties. Indeed, it might earn you the admiring sobriquet of ‘indispensable’ from government officials.

Second, if finance seems dull, and big bucks seem passé, and your taste in pleasures is influenced by Sade, then consider a career in law enforcement or counter-intelligence instead. Brutal interrogations, torture, and unchecked surveillance can induce sufficient frisson to satisfy even the most jaded. As before, there will be little fear of moral disapproval or legal penalty. This career choice, while not as lucrative as banking, does provide admiration from those who will regard you as a defender of their liberties and a fighter for freedom everywhere.

Lastly, if your career options are settled and you are looking for easy entertainment, then, as Manning indicates, consider shooting black teenagers instead, especially those who wear clothes which, despite being worn by countless white teenagers, will always be regarded as symbols of black criminality. This course of action might even net you a book deal, or a moonlighting gig as spokesperson for the National Rifle Association.

Parents would do well to inculcate these principles early in their children.

Note: I have edited this post to use Manning’s name and pronouns of choice. Good luck to her.

The Asymmetric Panopticon

As I’ve noted before on this blog–in unison with many other commentators–the ‘if you’ve got nothing to hide, then you shouldn’t mind the government spying on you’ argument is among the dumbest to be made in defense of the NSA’s surveillance program. A related argument is the ‘we don’t have privacy anyway, so quit tilting at windmills.’

A composite assumption of sorts emerges from these: that the citizenry has no privacy, and no reasonable expectation of any, in today’s most notable sphere of personal, political, and economic interaction–the Internet–and thus should be prepared to accept essentially unlimited scrutiny of its activities by the government and even by private corporations.

These assumptions, along with the wholesale swallowing of governmental and corporate rationales for secrecy–shadowy external threats and proprietary imperatives, respectively–have led to a rather dangerous panopticon: we are visible at all times, under a steady and constant gaze, to these ever-powerful entities, but they, and their internal machinations, are not. (As I noted in my post on Bill Keller last year, it has also led to incompetent journalists asserting that those who demand transparency about the government should disclose details about their personal lives.)

There is nothing remotely symmetric about this arrangement.

On the governmental end, more material than ever before is rated ‘Classified’ or ‘Top Secret,’ thus ensuring that those who strive to make it available to the public eye face–as may be seen in the cases of Julian Assange, Chelsea Manning, or Edward Snowden–prosecution and public ridicule. It is worth remembering that the government’s classification of material as ‘Top Secret,’ which is the basis for the legal prosecution of whistleblowers, is never up for contestation. Thus, one strategy for making transparency harder and whistleblowing more dangerous is simply to classify huge amounts of material that way. It helps, too, to mount a furious barrage of accusations of treason and worse against the whistleblower. (A related strategy makes it harder to observe and record the work of law enforcement officers: New York’s S.2402 bill will, if nothing else, make it much more dangerous to videotape police officers in action.)

On the corporate end, opacity is ensured by a bewildering combination of trade secrets, non-disclosure agreements, proprietary recipes, business methods, and the like; these ensure that those who collect data about us are almost always working in the shadows, away from the public eye, their machinations and strategies and imperatives poorly understood.

So, we find ourselves at this pass: we are told that we have no privacy and should not expect any, but those who want our data and use it to control the contours of our lives have all the privacy they need and want, and then some; we are told that if we have nothing to hide, we have nothing to fear, but those who collect our data surreptitiously are allowed to hide what they do. (Frank Pasquale’s forthcoming book The Black Box Society: Technologies of Search, Reputation, and Finance will analyze and highlight this alarming state of affairs. As Pasquale points out, transparency should be a two-way street; data disclosure agreements should require the collectors to make themselves and their methods known and visible.)

The tables have been turned and we are pinned beneath them. We cower, while our data collectors strut and preen.