Obfuscation: Exploiting data entropy

Posted on February 17, 2014

Thanks to Seda Gürses for her efforts in organizing such a wonderful day. I am still wearing my camo souvenir!

Saturday’s ISTC for Social Computing Obfuscation Symposium at NYU brought together experts in law, policy, arts, activism, engineering, computer science, design and anthropology to discuss case studies and demos exploring obfuscation – strategies whereby ‘individuals, groups or communities hide, protect themselves, protest or enact civil disobedience, especially in the context of monitoring, aggregated analysis, and profiling in (digital) space’. Speakers debated the ethical and technical costs of sending confusing, diverting, or misleading information as a ‘weapon of the weak’ in an era of inevitable surveillance and big data, as well as new priorities for consumer protection rights when organizations adopt obfuscation techniques in, for example, the realm of privacy policies.

The day began with a brief overview of the term obfuscation and the benefits of the framework as defined by Helen Nissenbaum and Finn Brunton in ‘Vernacular resistance to data collection and analysis: A political theory of obfuscation’ (2011). Their point in the paper is that obfuscation is about confusing rather than hiding; it is a chance for ordinary users to ‘take control’ when faced with encounters that are defined by information asymmetry.

Obfuscation works as an opportunity to buy time, create plausible deniability, provide cover, foil profiling and ultimately elude surveillance in situations that amount to ‘data tyranny’. Obfuscation is a more promising alternative to the common defeatism invoked by arbitrary monitoring power, i.e. when users simply ‘suck it up’ and/or ‘hope for the best’ in their transactions.

Communication scholar Joseph Turow’s complementary take on this position highlighted that organizations also routinely engage in obfuscation practices to mislead. Obfuscation is not just a technique of those without power, then. It is a means for a company to avoid ‘anything marginal to its own primary resource considerations’. Turow talked about privacy policies as a kind of phatic speech: the mere convention and the fact of presentation can have a reassuring effect on the user even if the content of a policy makes little sense or holds no binding obligation. Privacy policies often boil down to being a ‘tough luck contract’, yet they raise significant new questions (for communication studies in particular) about what is an audience and what is a public. Michael Warner is a useful reference here.

The technical sessions taught me a lot of new things to be concerned about! e.g. the practices of font probing, proxy piercing and fingerprinting. Demos of AdNauseam, Anonymouth and Vortex showcased obfuscation tools already in development. But not all of the talks focused on technology. Throughout the day, conversations regularly turned to nature and history to situate the practices under analysis.

Finn Brunton drew attention to the rare spider that sets up a decoy spider in its web to stay safe from predators. It’s a resonant illustration of how to produce confusing messages or signals when concealment is impossible. Following Hanna Rose Shell’s work on camouflage, my mind turned to the idea of exposure more broadly. Especially in open fields. I kept thinking of scarecrows! Which also made me wonder at the ways obfuscation has facilitated different modes of production. It was Finn’s mention of ‘foot dragging’ and ‘go slows’ in the industrial era that sent me in this direction.

Nick Montfort and Susan Stryker rounded out the day talking about code: in Nick’s case, the sophistication and knowingness that can underwrite the most simple and poetic computational expressions; in Susan’s, state-imposed gender distinctions that set punishing terms for legibility and recognition. Laura Kurgan’s investigation of satellite imagery was a lovely counterpoint to Susan’s talk, in the sense that it conveyed the historical assumptions that shape surveillance norms. The blessing, as a result of concerted activism, is that such norms may pass.

As the day closed, I was most struck by how much obfuscation helps to reveal the very strong relationship between knowledge and sight in Western culture. At times, these parallels between visibility and legibility, recognition and comprehension seemed a little easy. I was much more interested in comments made by Finn, in conclusion, about how obfuscation might be used to secure a model of identity that is much less human-centric than we have managed in the past. This is surely true when we find ourselves needing protection from the way we perceive an algorithm to be reading us. Mark Andrejevic calls this threat the ‘drone logic’ of a ballistic imaginary.

I also left pondering Rachel Law’s observation that, in the move to algorithmic living – when college health insurance policies are calculated based on your father’s heart medication purchases – we are essentially post-Internet. We are no longer in the driver’s seat, logging on to discover information at random. As tailored Google search results only serve to prove, sharing information now creates opacity. Every time you send something to a friend or colleague you enclose them in your own data bubble. Recreating and demanding the objectivity of information, in the face of data filters, is a matter of exploiting the weaknesses of ‘data entropy’.
