Sunday, October 23, 2022

Digital dark sousveillance: Becoming NULL vs. DROPping the table [Rohan - core post #5]

This week’s texts seemed to be concerned with two different types of surveillance. On one hand, Browne, while reacting to “the new surveillance” (p. 14), is primarily concerned with surveillance conceptually and across eras. Meanwhile, Andrejevic, Vaidhyanathan, and Gaboury are specifically interested in the implications of digital surveillance through datafication, or dataveillance. While a clear distinction between the two is not necessary, I found it helpful in order to evaluate Gaboury’s argument about null values in SQL as a mechanism of dark sousveillance.

Browne draws on Steve Mann’s Veillance Plane to distinguish between sousveillance, surveillance, and other forms of veillance. Her reading of the plane tells us that surveillance refers to a situation in which an entity in a position of power (often an organization) observes individuals, and that sousveillance refers to the converse relationship, in which an individual with less power observes an entity with more power. The prototypical example of sousveillance is portable or wearable technology, such as George Holliday recording the police beating Rodney King.

Browne further defines “dark sousveillance” as “a site of critique” on a third dimension which “takes form in antisurveillance, countersurveillance, and other freedom practices” (p. 21). According to Mann’s diagram, counterveillance refers to a situation in which both surveillance and sousveillance are forbidden and antisurveillance refers to a situation in which surveillance is forbidden and sousveillance is absent or irrelevant. The common characteristic is that surveillance is not only absent but in fact impossible; this is why Browne refers to them as “freedom practices” against racializing surveillance.

What does dark sousveillance look like in terms of digital surveillance through datafication? Is it even possible? Gaboury seems to indicate that “becoming NULL” constitutes a dark sousveillance practice because it obfuscates visibility and thus “removes [oneself] from the productive logic of the system that would seek to identify it” (p. 154). However, I would argue that “becoming NULL” fails a crucial condition of dark sousveillance: escaping surveillance.

Obfuscating particular fields in a database through NULL values does not actually constitute the anti-surveillance displayed on Mann’s Veillance Plane. Surveillance is still happening; it’s just being obfuscated. For it to constitute anti-surveillance, the obfuscation must be comprehensive and built into the practice by design. While this may be possible in analog surveillance, I would argue that SQL is a poor example in digital surveillance: even an obfuscated observation remains legible, because a record still exists in the database. Accordingly, database admins and analysts have developed practices to retain that legibility. For example, in the past, I have imputed data from other observations that share common characteristics (e.g., multiple people living in the same household); retained meaningful NULL values (e.g., when NULL represents exclusion from a campaign); and assigned default values for a required field. In the latter case, the COALESCE function is used specifically to overwrite NULL values and render all database rows legible. The point here is that “becoming NULL” seems less powerful than the datafication impulse, even when retaining legibility requires sacrificing accuracy.
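To make this concrete, here is a minimal sketch using SQLite from Python; the table and column names are hypothetical, not drawn from any real system discussed above. It shows that a row with a NULL field still exists in the table, and that COALESCE restores its legibility by substituting a default value:

```python
import sqlite3

# Hypothetical toy table; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE profiles (id INTEGER PRIMARY KEY, zip_code TEXT)")
cur.executemany(
    "INSERT INTO profiles (id, zip_code) VALUES (?, ?)",
    [(1, "60601"), (2, None), (3, "60637")],  # row 2 has "become NULL"
)

# The NULL row is obfuscated but still present: still a surveilled record.
count = cur.execute("SELECT COUNT(*) FROM profiles").fetchone()[0]
print(count)  # 3 -- the NULL row is counted like any other

# COALESCE overwrites NULL with a default, trading accuracy for legibility.
rows = cur.execute(
    "SELECT id, COALESCE(zip_code, 'UNKNOWN') FROM profiles ORDER BY id"
).fetchall()
print(rows)  # [(1, '60601'), (2, 'UNKNOWN'), (3, '60637')]
```

The point of the sketch is that nothing about the NULL value resists the query; the default value is applied at read time, and every row comes back legible.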

How else might dark sousveillance look online? It seems nearly impossible given the possibilities for tracking digital fingerprints. Instead, obfuscation or avoidance must happen further upstream: not at the database level, but at the point of data collection, by blocking ads, cookies, and other trackers and avoiding transactions that leave digital traces. Here Andrejevic’s text is instructive because he focuses on population-level, as opposed to individual-level, subject formation. Vaidhyanathan’s example of Google Street View illustrates the difference: once there’s space allocated for your image or your property, there’s no way to completely nullify that record on an individual level (and attempting to render your record NULL risks triggering the Streisand Effect). The only way to avoid Google’s surveillance is for the population to refuse the system altogether. Or, to extend Gaboury’s example, dark sousveillance calls not for becoming NULL individually, but for a movement to DROP the table altogether.
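The contrast can be sketched in the same terms (again with hypothetical table and column names): setting a field to NULL leaves the record in place, while DROP TABLE removes the structure that holds records at all.

```python
import sqlite3

# Illustrative sketch; the table stands in for any datafied record system.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE street_view (id INTEGER PRIMARY KEY, image BLOB)")
cur.execute("INSERT INTO street_view (id, image) VALUES (1, x'00')")

# Individual refusal: the field is blanked, but the record persists.
cur.execute("UPDATE street_view SET image = NULL WHERE id = 1")
remaining = cur.execute("SELECT COUNT(*) FROM street_view").fetchone()[0]
print(remaining)  # 1 -- still one surveilled row

# Collective refusal: no table, no records, nothing left to surveil.
cur.execute("DROP TABLE street_view")
tables = cur.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
print(tables)  # [] -- the structure itself is gone
```

In other words, becoming NULL edits a value inside the system's schema, while dropping the table refuses the schema itself, which is the distinction the post's closing metaphor turns on.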
