03 2018
Digital Cuts
Cutting flesh-technology-information-amalgams
In 2015, British artist James Bridle released the browser plugin Citizen Ex, which documents how our ‘data doubles’ or ‘data shadows’ pass through different jurisdictions while we surf the net.[1] The aim of the project was to draw attention to a new form of temporary or even ephemeral citizenship — algorithmic citizenship — that emerges from the logics of transnational connectivity. Algorithmic citizenship does not grant the rights of ordinary citizenship, but it can have serious consequences for people using the Internet, including with respect to freedom of expression, data privacy, and youth protection.
In 2001, activists of the noborder action camp in Strasbourg were already dedicating their attention to algorithmic allocations.[2] The relationship between data and bodies was the basis of an intervention that began with the observation that migrants’ data can easily cross EU borders and circulate within the Schengen Area while the people themselves remain physically stuck in camps or beyond EUropean borders. Tracing such localizations of the division between embodied subjects and their assigned data, guerilla communication activists from the camp headed for the data center of the Schengen Information System (SIS) and dug up a suitable network cable in order, allegedly, to acquire data from the SIS with a laptop and rearrange it with the help of a plugin. With a cable, the activists set themselves the task of gaining access to data doubles otherwise beyond their reach.
In 2011, Viennese activist Max Schrems used European data protection legislation to force Facebook to hand over the data the social network had accumulated about him.[3] He received a CD with more than 1,000 pages of information. The CD contained not only material he had provided himself, such as postings, selfies, and chats, or data about his Facebook friends, but also a kind of shadow profile: data he had long since deleted and masses of metadata, such as locations, IP addresses, and the computers he had used.[4] Ever since, Facebook has offered its inhabitants a standard download feature to access a copy of (some of) the data stored about them.
All three examples address the complex relationship between people and their so-called data doubles, that is, the conglomeration of data trails provided willingly as much as unwillingly. These are data trails over which the inhabitants of digital technecologies have long since lost control, and which in part lead an obscure life of their own.
Today, fingerprint scanners in the hotspots of border regimes and social media like Facebook have turned into interfaces between embodied subjects and data doubles. Data doubles influence rights of entry and residence permits, credit ratings, and the selection of postings, news, and advertising we encounter on social media platforms. They feed predictive policing software and the kill lists of the drone war. Data doubles and embodied subjects are frequently discussed as being hybrid or cyborgian. But relatively little attention is paid to the fact that they are, in many cases, treated as separable—which becomes especially evident in border regimes—and have indeed proven to be divisible. Questions of divisibility and of in/voluntary participation have been debated under the concepts of dividuum and dividuation.[5] In these terms, we are necessarily divided, always involved, and constituted in different processes of sharing or participating—whether these are imaginary, affective, physical, or otherwise. Without losing sight of this assemblage, we want to focus on the cuts that shape and consolidate the relationship between the embodied subject and the data double. In so doing, we direct attention not only to categories of hybridity and amalgamation through division, but most importantly to the agency of division itself, to the cuts or acts of dividing and their implications in their surrounding contexts. Finally, we will outline some interventions that irritate, run contrary to, or reveal this form of standardizing fragmentation, understood as the adherence of regimes to different cuts.
Flesh-Technology-Information-Amalgams
In feminist Science & Technology Studies, intersections of bodies and technology have long been negotiated as cyborgian, as assemblages, and as the result of specific practices of boundary making.[6] These concepts have been taken up and advanced in theories of the “surveillant assemblage”. The surveillant assemblage abstracts human bodies from their territorial settings, turns them into data flows, and (re)assembles them as “data doubles”.[7] It turns bodies into Harawayan cyborgs, “flesh-technology-information-amalgam[s]”.[8] Data doubles function as a kind of additional self, which influences access to resources and can be the target of marketing practices and governmental power techniques.[9] Although data doubles may claim to refer back to specific individuals, they exceed the logic of representation and ultimately need to be understood as a mechanism of social sorting.[10]
The abstraction of bodies into data flows gives rise to a new possibility of governing, which the Belgian legal scholar Antoinette Rouvroy has termed algorithmic governmentality.[11] This kind of governmentality no longer targets specific individuals, but rather addresses potentials and possible behavior, infra-individual data, and supra-individual profiles (that is, data doubles) by means of risk management, data mining, or big data applications.[12] It avoids confrontation with human subjects and instead appeals to profiles and the alleged potential modes of behavior that can be extracted from them (such as potential crime, visa overstay, consumption, etc.).[13] It can, however, have lasting effects on individual subjects. Algorithmic governmentality is an attempt to tame the future in advance by reducing the virtual dimension of that which unpredictably happens in the now to calculable formulas and profiles, which then guide the actions to be taken.[14]
In the pre-digital era it was already possible to move stacks of paper from one desk to another, or files from one authority to the next, and to treat them as representations of certain subjects. Yet the internal dynamics of data doubles, the incomprehensible amounts of data, and the linking of information given knowingly or unknowingly, willingly or unwillingly, are specific to digital technecologies and would hardly be possible without big data or the merging of databases. Raw data is turned into deterritorialized signals, which, in contrast to older statistical logics, do not create knowledge about the world in need of interpretation, but rather are taken directly from the digital world and hold the promise of absolute objectivity.[15] Here, knowledge is not so much generated as discovered. The mechanisms that make it possible to separate the embodied subject from its data double, or to treat it as separable within digital environments, need to be examined more closely.
Agential Cuts – Digital Cuts
Within the described technecologies, various human and nonhuman actors separate embodied subjects and data doubles in different material-discursive practices and apparatuses: these include, for example, operators of fingerprinting scanners and fingerprint data bases, software that creates profiles for advertisers, algorithms that route data flows through certain servers, or artificial intelligences that produce demarcations within flesh-technology-information-amalgams.
Feminist scholar Karen Barad holds that subjects, objects, and agencies of observation are always entangled with one another: they intra-act.[16] Only in these intra-actions do the boundaries of bodies and subjects materialize, and these boundaries always need to be understood as temporary and local. In order to be able to describe phenomena despite this entanglement, Barad introduces the notion of an “agential cut”, which is meant to allow a temporary and local separation between observer, observed, and agencies of observation.[17] As this concept of objectivity does not include traditional ontological exteriority, agential separability has to be created by agential cuts in order to allow for an objective perspective.[18] Barad is therefore initially concerned with entanglements that involve formations of materiality, agency, and topological changes. These are intrinsically linked to questions of human and nonhuman boundary making practices. Entities emerge as a result of agential cuts within phenomena, consequently producing new phenomena.[19] Her theory is therefore not about absolute distinctions and differences as such, but rather about meaningful and material cuts that do not suspend entanglement: “Causality is an entangled affair: it is a matter of cutting things together and apart (within and as part of phenomena).”[20] Bodies materialize differentially, and because agential cuts not only produce knowledge about bodies but also shape them, objectivity means taking responsibility for the materializations the cuts produce.
The cuts that help Barad realize feminist concepts of objectivity seem to be mirrored in a digital form in the boundary making between embodied subject and data double – although in most instances certainly not with the aim of realizing feminist concepts of objectivity. We nonetheless believe that Barad’s concept of agential cuts—precisely because they introduce temporary and local boundaries between ontologically inseparable, intra-acting components—makes it possible to describe these phenomena, which have so far been somewhat neglected by theorists. We therefore understand digital cuts as temporary-local separations or divisions of otherwise hybrid or interdependently functioning components of flesh-technology-information-amalgams. The cuts can be enacted by human and nonhuman actors alike and are linked to specific truth claims, as they process allegedly objective and real data traces. Raw data, on the one hand, is usually interpreted as an immanent digital reality and not as knowledge produced by material-discursive practices. On the other hand, digital cuts are temporary-local separations that create phenomena, which are then subject to different regimes and their respective claims to objectivity, while the cut itself is not considered a form of truth production. Digital cuts can separate data doubles from embodied subjects or perform cuts within data doubles. The concept of the digital cut is suitable for describing phenomena in which these separations happen knowingly and willingly, as well as phenomena in which the divisions happen by force, such as the enrollment of biometric information in the databases of migration control. Digital cuts can be made by human and nonhuman actors such as artificial intelligences. With the help of these cuts, flesh-technology-information-amalgams can be subordinated to different legal, technological, or biopolitical regimes and processed according to their respective logics.
Beyond the common accentuations of hybridity and amalgamation, it is therefore necessary to investigate where, and with what consequences, these linkages are broken up again: in some cases, such as the biometric data of migrants stuck in “hotspots”, which is able to travel transversally through EUrope, a cut retains the reference to a specific individual, while in other cases, such as when potentials are negotiated, a disengagement from specific subjects is programmatic. One example of such disengagement involves counter-terrorism measures such as risk alerts, where specific surnames, religious affiliations, language skills, or travel routes, etc., can turn into risk potentials. Here, in the name of security, the focus is not on concrete individuals but on fragmented elements of a supposed risk. The potentially dangerous, dividuated subject is assembled from an amalgam of partial elements of other subjects and objects.[21] In some situations, the interfaces between embodied subject and data double simultaneously prove to be the agency that performs a cut; in others—for example, in the case of intelligence surveillance or in the social network analysis of the drone wars—interfaces like social media have little to do with the cuts. Sometimes the cuts are rooted in the internal logics of specific technologies: the algorithmic citizenships examined by Bridle, for example, are generated by the internal logics of routing. The effects of cuts range from existential threats to the constitution of one’s own life to banal recommendations concerning films and products on Netflix or Amazon.
Becoming Machine and Digital Sanctuary Cities
The interventions presented at the outset of this text address different levels of dividing flesh-technology-information-amalgams by means of law, civil disobedience, art, and technology: Bridle’s plugin documents the medial logic of routing, which time and again digitally splits citizenship into ever new, algorithmically generated sub-citizenships and subordinates data doubles to constantly changing jurisdictions. The project attempts to make certain cuts visible and raises awareness of these processes. Schrems’ legal action against Facebook intervened by legal means in the non-transparent practice of data collection and has allowed at least partial access to formerly inaccessible aspects of data doubles on Facebook. The noborder camp activists attempted to intervene on a symbolic level with an act of civil disobedience and guerilla communication, and managed to draw attention to the digital cuts of the EU migration and border regime.
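To make this routing logic a little more tangible, the following minimal Python sketch maps the hosts contacted during a browsing session onto the countries their servers resolve to. It is offered in the spirit of Citizen Ex, not as its actual code; the GeoLite2 database path and the list of hosts are assumptions for the sake of illustration.

```python
# Minimal sketch (not Citizen Ex's actual implementation): estimate an
# "algorithmic citizenship" by mapping the hosts a browser contacts onto
# the countries their servers resolve to.
# Assumes a local MaxMind GeoLite2 country database ('GeoLite2-Country.mmdb').

import socket
from collections import Counter

import geoip2.database   # pip install geoip2
import geoip2.errors

def jurisdictions(hosts, db_path="GeoLite2-Country.mmdb"):
    """Return the share of contacted hosts located in each country."""
    counts = Counter()
    reader = geoip2.database.Reader(db_path)
    for host in hosts:
        ip = socket.gethostbyname(host)              # DNS lookup
        try:
            country = reader.country(ip).country.iso_code or "??"
        except geoip2.errors.AddressNotFoundError:
            country = "??"
        counts[country] += 1
    reader.close()
    total = sum(counts.values())
    return {country: n / total for country, n in counts.items()}

# Hosts visited during one (hypothetical) browsing session.
print(jurisdictions(["example.org", "wikipedia.org", "bbc.co.uk"]))
```

The plugin itself works on real traffic inside the browser; the sketch only illustrates how quickly a handful of lookups yields a provisional, constantly shifting distribution of jurisdictions.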
Other approaches attempt to intervene more directly in flesh-technology-information-amalgams in order to make digital cuts obsolete. Starting from the logics of data, they play with becoming machine: various browser plugins or bots like TrackMeNot, AdNauseam, or MakeInternetNoise produce automated requests or clicks for users and thus fill data doubles with random data, which complicates at least some forms of tracking and profiling. The disorder and multiplication achieved via random number generators appropriate dividuation, not to suspend temporary-local divisions, but to subvert their truth claims (in the sense of social sorting). The cuts become meaningless.
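The obfuscation principle at work in such plugins can be sketched in a few lines: a script that periodically issues searches for randomly chosen terms, so that any profile distilled from the query log is diluted with noise. This is a hedged illustration of the general idea, not the plugins' actual code; the search endpoint and the word list are placeholders.

```python
# Minimal sketch of query obfuscation in the spirit of TrackMeNot:
# periodically send searches for randomly chosen terms so that a profile
# built from the query log loses its signal.
# The endpoint and the word list are placeholders, not the plugins' real ones.

import random
import time
import urllib.parse
import urllib.request

DECOY_TERMS = [
    "weather tomorrow", "bicycle repair", "opera tickets", "soldering iron",
    "train schedule", "lentil soup recipe", "museum opening hours",
]

SEARCH_URL = "https://duckduckgo.com/html/?q={query}"  # placeholder endpoint

def send_decoy_query():
    """Issue one search request for a randomly chosen decoy term."""
    term = random.choice(DECOY_TERMS)
    url = SEARCH_URL.format(query=urllib.parse.quote(term))
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request, timeout=10) as response:
        response.read()   # the reply is discarded; only the request matters
    return term

if __name__ == "__main__":
    while True:
        print("decoy query:", send_decoy_query())
        time.sleep(random.uniform(30, 300))  # irregular pauses look less mechanical
```

The actual plugins run inside the browser and draw their decoy terms from changing sources, but the underlying gesture is roughly the same: the data double is flooded with noise until the cuts that sort it lose their meaning.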
One concept that at least addresses accountability and responsibility for digital cuts within border regimes is that of Digital Sanctuary Cities. Some cities in the USA, which have limited their cooperation with federal immigration authorities and offer city services to people without papers in order to protect illegalized migrants, have also committed themselves to the protection of data doubles and have developed approaches for dealing responsibly with the phenomenon of digital cuts.[22] A Sunlight Foundation white paper proposes some basic principles for digital sanctuaries, including, not least, accountability for information collection, the limitation or avoidance of data collection, the regular deletion of sensitive data, anonymization where possible, "notice and consent" opportunities, as well as limits on data-sharing and the merging of databases.[23] Digital Sanctuary Cities highlight the role of local politics and city authorities in the state regulation of migration. They try to extend care practices to data doubles and to deal responsibly with digital cuts at the local level. This is premised on the insight that municipal data on people, even if it was not initially collected for the purpose of surveilling migrants, can easily be used for surveillance and control measures. Access to schools, work, housing, or healthcare is always accompanied by forms of data collection that can be dangerous for some, or influence their deportability. One point to be emphasized here is that data protection alone is not enough and that the collection of data on city residents has to be limited in general. Furthermore, it is necessary to educate local authorities, libraries, medical offices, and property managers about what “sensitive data” on people actually is, and under what circumstances, with what transparency, and with what possible consequences data should be collected and stored. It is thus not only a matter of limiting the capture of data, but also of appropriate management, which entails the regular deletion of data as well as restrictions on data disclosures and on the merging of databases.
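What principles like regular deletion and anonymization could look like as a concrete municipal routine is sketched below in deliberately simplified form. The database layout, the salted-hash pseudonymization, and the 90-day retention window are hypothetical assumptions for illustration, not taken from the white paper.

```python
# Deliberately simplified sketch of two of the principles discussed above:
# regular deletion of old records and pseudonymization of what is kept.
# Table name, columns, salt, and the 90-day retention window are hypothetical.

import hashlib
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # assumed retention window

def enforce_retention(db_path="city_services.db"):
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    con = sqlite3.connect(db_path)
    with con:
        # 1. Delete records older than the retention window.
        con.execute("DELETE FROM service_visits WHERE created_at < ?", (cutoff,))
        # 2. Pseudonymize the records that remain: replace the name with a
        #    salted hash and drop the street address entirely.
        rows = con.execute("SELECT rowid, name FROM service_visits").fetchall()
        for rowid, name in rows:
            pseudonym = hashlib.sha256(("city-salt:" + name).encode()).hexdigest()[:16]
            con.execute(
                "UPDATE service_visits SET name = ?, address = NULL WHERE rowid = ?",
                (pseudonym, rowid),
            )
    con.close()
```

A salted hash is only pseudonymization, not anonymization: whoever holds the salt can re-identify names, so stricter measures would be needed wherever re-identification has to be ruled out.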
Digital Sanctuary Cities address the collection, storage, and dissemination of data with an analytic perspective sensitive to issues of power, border regimes, legal and social injustice, and precarity, which are all closely linked to digital cuts. These interventions can be understood as measures that make it possible to comprehend these cuts as specific boundary making practices and as constitutive of everyday reality. They not only pull cuts out of black boxes, but also bring them into spaces of negotiation. And there would be much to negotiate: should the rights of data doubles, such as their possibility for transnational circulation, not be far more closely linked to the rights of embodied subjects—as it is exactly the entanglement between embodied subject and data double that enforces the movement of the one and the stasis of the other? Should the right to bodily integrity not also hold true for our biometric data, and so on? With the notion of digital cuts we want to add a theoretical approach to a topic already prevalent in artistic and political practice. This should make it possible to at least partially open some black boxes and introduce their contents into democratic negotiation. Taking digital cuts seriously means creating cyborgian alliances and partnerships in which normative ordering principles like the law are just as important to involve as disorder-causing bots or anonymization tools for local authorities in Sanctuary Cities. And it might sometimes entail digging for cables in order to start a conversation with your own data double.
[1] Bridle, James: „Algorithmic Citizenship“ (2015), http://citizen-ex.com/citizenship [15.01.2018].
[2] See Hamm, Marion: „Ar/ctivism in Physical and Virtual Spaces“ (2003), http://transversal.at/transversal/1203/hamm/en [15.01.2018]; Schmidt, Jürgen: „another war is possible // volXtheater“ (2003), http://transversal.at/transversal/1203/schmidt/en [15.01.2018].
[3] Coscarelli, Joe: „One Man's War against Facebook on the European Front“, New York Mag (October 2011), http://nymag.com/daily/intelligencer/2011/10/one_mans_war_against_facebook.html [15.01.2018].
[4] “Europe vs. Facebook”, http://europe-v-facebook.org/EN/en.html [15.01.2018].
[5] See Deleuze, Gilles: „Postscript on Control Societies“, in: Id.: Negotiations 1972–1990, New York 1995, p. 169-182; Raunig, Gerald: Dividuum. Los Angeles 2016 [2014]; Ott, Michaela: Dividuationen. Theorien der Teilhabe. Berlin 2014.
[6] Suchman, Lucy: „Feminist STS and the Sciences of the Artificial“, in: Edward J. Hackett, Olga Amsterdamska, Michael Lynch and Judy Wajcman (eds.): The Handbook of Science and Technology Studies, Cambridge, Mass. / London 2008, p. 139-163, p. 150; Haraway, Donna: „A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century“, in: Ead.: Simians, Cyborgs, and Women. The Reinvention of Nature. New York 1991, p. 149-181.
[7] Haggerty, Kevin D.; Ericson, Richard: „The surveillant assemblage“, British Journal of Sociology, 51, 4, (2000), p. 605-622, p. 606.
[8] Ibid., p. 611.
[9] Ibid., p. 613.
[10] Ibid., p. 614; Lyon, David (ed.): Surveillance as Social Sorting: Privacy, Risk and Automated Discrimination, London / New York 2005.
[11] Rouvroy, Antoinette: „The end(s) of critique. Data behaviourism versus due process”, in: Mireille Hildebrandt, Katja de Vries (eds.): Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, Abingdon, Oxon 2013, p. 143-167.
[12] Ibid., p. 152, 161.
[13] Ibid., p. 152.
[14] Ibid., p. 152.
[15] See ibid., p. 147.
[16] Barad, Karen: Meeting the Universe Halfway. Quantum Physics and the Entanglement of Matter and Meaning. Durham / London 2007, p. 197; Kämpf, Katrin M.; Mergl, Matthias: „Freeze! Eine queere Objektivitätsbricolage aus Karen Barads Epistem-Ontologie“, in: Nina Degele, Sigrid Schmitz, Marion Mangelsdorf, Elke Gramespacher (eds.): Gendered Bodies in Motion. Opladen / Farmington Hills 2010, p. 103-114.
[17] Barad, Karen: Meeting the Universe Halfway. Quantum Physics and the Entanglement of Matter and Meaning. Durham / London 2007, p. 148.
[18] Ibid., p. 140.
[19] See ibid., p. 148.
[20] Barad, Karen: Verschränkungen, Berlin 2015, p. 182; see also Barad, Karen: Meeting the Universe Halfway. Quantum Physics and the Entanglement of Matter and Meaning. Durham / London 2007, p. 394.
[21] See Amoore, Louise: The Politics of Possibility. Risk and Security Beyond Probability. Durham / London 2013, p. 131 and passim.
[22] Misra, Tanvi: “The new 'digital' sanctuaries”, CityLab (14.11.2017), https://www.citylab.com/equity/2017/11/new-digital-sanctuary-cities/541008 [15.01.2018]. For a critical take on Sanctuary Cities, which can also be interpreted as a form of governing migration, see Mancina, Peter: In the Spirit of Sanctuary: Sanctuary-City Policy Advocacy and the Production of Sanctuary-Power in San Francisco, California. Diss. Vanderbilt University, Nashville, Tennessee 2016, http://etd.library.vanderbilt.edu/available/etd-07112016-193322/unrestricted/Mancina.pdf.pdf [15.01.2018].
[23] Sunlight Foundation: „Protecting Data, Protecting Residents. 10 Principles for Responsible Municipal Data Management“ (02/2017), https://sunlightfoundation.com/wp-content/uploads/2017/02/Protecting-data-protecting-residents-whitepaper.pdf [15.01.2018].