A partial testimony of a highly automated subject

Talk given at the event The Multiple Arts of Schematism in the Depths of the Soul, 26 May 2023, at V2 Rotterdam.

>> Write a talk called ‘A partial testimony of a highly automated subject’ in the style of Miriam Rasch – is the prompt I gave myself. What follows is a collage of earlier work, translated and reworked into English. <<

Sitting still seems to be all that is left.
      When I move, devices, apps and algorithms start quietly humming. Apparently, I produce something. Data.
      Data production requires movement, lots of movement. I don’t want to do it anymore; I no longer want to take part. So, I am sitting on my sofa staring outside, sixty to seventy kilogrammes of motionless opt-out.
      There is a camera down the street. I’ve no idea whether it belongs to the municipality, the person living there or the housing corporation. It looks over its shoulder at someone’s front door. A bigger, white camera ostentatiously keeps watch at the school on the other side of the square, clearly meant to catch the eye.
      The street I live in is quiet, and yet a police van drives down it at least once a day. Have we been designated somewhere as risky? Does our postcode flash red in the systems? The city is called smart these days; it knows what is good for itself.
      Data thrives when there is action, clustering around hives of activity, around roads, stations, airports. Faces are scanned, gestures are classified – nervous, aggressive, neutral – telephone signals are tracked and screened. I can’t avoid being translated into data when I go outside, lending a hand in body and mind to the machinery of datafication. If I don’t want that, then I’ll have to go back indoors.

I try taking a data selfie. A self-portrait consisting of the traces of myself I can retrieve online.
      But I get bored by the trivial details. Instead of recognition – yep, that’s me – or insight – yikes, that’s me – my data selfie gives me a sense of stupid alienation.
      In the depths of my settings on Facebook (which I quit long ago) I found a mosaic of snapshots: the 24 most important categories the company uses to sell me to advertisers. The first tile characterising my profile was the Lowlands festival. Next to it were tiles labelled philosopher, book trade, and something with animals.
      Then: fun fair, button, photo roll.
      Fun fair, button, photo roll?
      These three categories, together one-eighth of who I’m supposed to be, conjured up a surreal picture in my mind: with my hair like a rollercoaster wrapped round my head, my eye a button and my lips an overexposed film. I felt a little duped, or at least not taken seriously. The portrait seemed at most comical, but above all uninspired.

      ***

‘Google knows more about you than your own spouse.’
‘Amazon knows more about you than your best friend or your doctor.’
‘Facebook knows you better than your own mother.’
‘You are what you like.’

The headlines give me the chills. 150 likes or a few dozen clicks would be enough for online platforms to determine with great accuracy who you are, what you hope for and what you fear. With their data sets and algorithms, they would influence choices and behaviour without you realising it, and know better than you do what you want. The implication is that you would do better to leave important decisions to them too.
      My chills are followed by scepticism. What is the nature of the knowledge these platforms excel at, I wonder. Is it really comparable to the knowledge that my loved ones have about me? And is such knowledge sufficient to make better choices than I could on my own?
      Well, in the first place, this is all about predicting buying behaviour – with buying behaviour, in these circles, also including voting behaviour. Your supposed taste in clothes and political preferences are offered to the highest bidder, who can then bombard you with personalised ads.
      So, this knowledge would seem to have not so much to do with what I want as with what I can be seduced into. For the tech companies mentioned in the headlines, this kind of knowledge is surely the most valuable there is, but whether the same is true for my loved ones remains questionable.

It is not that difficult to make predictions about preferences, by the way. No matter how hard I try not to let my political beliefs show in my classes, for instance, my students turn out to predict my voting behaviour pretty accurately. But that does not mean they know me better than my partner or my parents do.
      Besides, what does Facebook really think it knows about my relationship with my partner or my mother?
      The ‘knowing’ of the algorithms is like a road leading from A to B. It goes only one way. Their collection and interpretation of information is a process of appropriation – of aggression even, of trampling over, siege and seizure. Unlawfully, they have gained access to my mind (they cannot wait to enter my body too, with their chips and implants) and now they flaunt the spoils.

      ***

They won’t mind if I see myself as the marionette that we meet in the Netflix documentary The Social Dilemma, or as Harari’s ‘hackable animal’. And damn me, sometimes I do feel like a fully automated… automaton.

Getting as much data as possible requires as little friction as possible. Friction disrupts the dream of having a ‘monopoly on reality production’. Dataism therefore follows the laws of frictionless design. Videos that play automatically, lights that adjust to the mood of the moment, voice assistants that help you get through the day. Technology should be so easy to use that you forget there is a device or app involved at all. If the apparatus retreats into an unobtrusive presence that is registered only passively, it may truly be called successful.
      We could all use a hand; indeed, friction occurs especially where humans themselves do not work smoothly. If we don’t articulate clearly enough, then surely our assistant won’t know what to do. If you’re always turned away in photos, of course the automatic tagging won’t work. If you do not make your preferences explicit with hearts and thumbs, how could you be sufficiently profiled?
      A simple example of frictionless design is Gmail’s suggested replies. After all, the most stubborn friction we encounter every day must be communication itself. Now a response is written with a single click: ‘Excellent! Got it, thanks!’ LinkedIn teaches me to say: ‘Kudos!’ Or just send a thumbs up. The suggestions are predictions based on our collective past responses. ‘Excellent!’ is an echo of our own voice that Gmail sends back to us. The algorithm mimics us first, and in turn we mimic the algorithm. Meanwhile, the standardisation of communication teaches us what the correct answers are, how being enthusiastic and friendly is the norm!
      Frictionless design in its purest form is design that completely takes over communication: apps that make contacts and appointments, the way Uber and Airbnb do between customer and driver or host. They make sure you no longer have to worry about potentially annoying interactions with other human beings.

      ***

But it’ll all only work with sufficient prior input. So please, behave, speak up, cry out, like, join, mute and retweet. I do, I will, I accept.
      In the end, all I am is my behaviour, the sum of all my tiny behaviours, like the movements of my thumb and eyes, my heart rate. I am an open book, readable from the top of my skin, the depth of my health apps.
      My body and the data that can be mined from it function as the pathway to understanding. They allow for predicting and controlling and manipulating that body, the very same body that was mined in the first place, and thus for predicting and controlling and manipulating the world, me in the world.
      It is a catch-22: I’m made an accomplice to my own submission.
      I’m vaguely reminded of that rather quirky technology of the polygraph (more popularly known as the lie detector). Even though it’s still conjured up sometimes as a threat issued by the president of the United States of America, or in the movies, it seems a bit of an awkward object, not to be taken too seriously in its implications. If the person testifying lies, so claims the polygraph, this comes with certain emotions, like the fear of being caught, and these emotions translate into physiological activity. Heartbeat, sweat, galvanic skin responses. Just measure these reactions on the outside and you can determine what goes on inside. You will know the truth, even if you are not told the truth. The body is a source of data, and the data don’t lie.
      This truth about ourselves turns out to be rather boring, no? It was sitting there all along, waiting to be decrypted. We were just too stupid to notice. Turned inwards in old-fashioned Innenschau – introspection – we missed what was in plain sight, like Poe’s purloined letter.

      ***

The soul isn’t deep, but poured out already into data sets, in profiles tied and knotted together across systems, talking to one another in languages I don’t understand. The depths, if ever I had them, shapeshift away from me, rolling in waves into the machine.
      Automated technology becomes autonomous; autonomous beings become automatons. What a mess.

One autonomy is not the other. What counts as proof that I as a human cannot be autonomous is often used as an argument in favour of the autonomy of technology. Predictability robs humans of free will, but in the case of technology it is precisely a reason to call it autonomous. If a car or drone is programmed and automated to perfection, and thus can be fully trusted, it can be called autonomous.
      Opacity – which seems almost the opposite of predictability – is another such characteristic. For a long time, it was self-evident to attribute to humans an inner life that eluded explanation and understanding, and in which autonomous considerations and decisions had to be located (because where else?). But today, the inscrutability of one’s inner life and the impenetrability of the processes and mechanisms that move us are seen rather as proof that one cannot be autonomous. Again, for technology it is the other way around: opacity is what makes it autonomous.

The algorithm promises to relieve me of my doubts and resolve my inner conflicts before they even arise. It promises frictionless autonomy. It will offer me my supreme autonomous decision on a silver platter, without me having to dig into my inner self or relate to something as tawdry as ‘the world’.
      Isn’t that all the easier? Instead of descending into the depths of my soul, I have myself read out on the surface. The patterns detected in my behaviour are like a placeholder for my personality. The more consistent the data, the more coherent the character. It will tell me what to do because I do it. Self-knowledge reduced to predictability.
      Dare I ask what pattern recognition and consistency in themselves have to do with knowledge or autonomy? Suppose my profile shows me someone I don’t want to be. It reflects back behaviour that is not in line with my values. For example, I am critical of the power of monopolies, preach about privacy, write about the disastrous belief called dataism, and yet there I am, diligently keeping up profiles, tracking steps or handing out scores to taxi drivers, hotels, or books I’ve read. Am I my behavioural patterns or my values? Isn’t it the inconsistency between the two that tells me the most about myself?
      In the friction between behaviour and value, between intention and execution, between confirmation and change – somewhere there, the possibility of being autonomous must lie. Somewhere there, meaning: in the distance between there and here. It won’t work without some notion of an inner world.

The Turkish Netflix series Bir Başkadır gives a poetic representation of this. Meryem, my near namesake, is a young, single woman from a traditional family, working as a cleaning lady and caring for her brother and his family. After a series of unexplained fainting spells, she is sent to a psychiatrist. There she talks incessantly, not about herself, but about all the others who are part of her life. She seems to – unknowingly? – testify to her own invisibility.
      The fact that someone is finally listening to her seems to please her. She becomes afraid that she will be forbidden to continue with the sessions; the people around her see therapy as modern nonsense. So, when the hodja, the neighbourhood’s religious leader, asks how it’s going, she tells him that she has not started seeing the therapist yet.
      It is a wonderful moment. Meryem’s quest for autonomy begins, one might conclude, with a lie. That lie creates an inner domain from which she can examine herself and her place in the world. Rather than using that space to dig up stories from her childhood or put her emotions on the table, she looks around her. That gaze betrays her growing autonomy. It emphasises the distance between here and there. (Wytske Versteeg: ‘Those who want to be seen by others will have to look themselves.’)

Confession: I know where they come from, the button, the fun fair, the photo roll. But I’m not going to tell you.

      ***

Rather than in pure knowledge, I’m interested in the dirt on my sleeves, on the hems of my clothes dragging on the ground as I walk, like a Victorian lady tracking the heath. I track not some core or true self, but sidetracks and tangents. I’m not trying to be authentic, nor am I in search of authenticity. I’m in search of what is lost or not yet existing. What is out there.
      The gap between me and my data profile is what John Cheney-Lippold calls ‘the else’. The else is more than a gap; it is something in itself. It’s not just a lack of information, an inescapable myopia or a reflection of something absent-present; it manifests itself, in words, details, a thing. The alienation that the else presents to me is productive. Productive of reality.
      What escapes from data is ultimately not just ‘the subjective’ – dreams, fears and desires, feelings like love or confusion – nor is it about grand notions like faith and justice. Ultimately, it is really about things escaping themselves. Opposed to Total Datafication is the material, physical world that keeps imposing itself, that which is left over and behind when you try to reduce the multiplicity of reality to a common denominator, that which covers you head to toe as soon as you try to neatly pigeonhole it – some surplus.
      Surplus formed by excrement, pus, and blood, by drizzle, soot, and Sahara dust, by medication residues flowing from the tap, plastic particles in bottled water, sky that no longer darkens, a city without silence. Indiscreet stuff that somehow manages to escape storage, control, and manipulation, that eludes prediction.
      Stuff that, in the words of the writer Tom McCarthy, traces ‘what has been excluded from it: the amputation-scar of an occurrence that, in its marked absence, seeps and stains and saturates an area’s surface all the more.’ It is the stuff, he says with a rather pervasive image, that is like cheese, the ‘corpse of milk’. It is ‘the fifth quarter: not the fourth but the fifth, the one that’s surplus to a thing’s integrity, to mathematics itself, a remainder.’
      This untameable surplus can be called, with some sense of drama, the revenge of things. Counted, measured, categorised? That may be, but the fifth quarter of things does not allow itself to be controlled or smoothed out. It attracts attention through ugliness or nastiness (and everyone and everything is also ugly and nasty). It is the opposite of frictionless design, of Kudos!
      Rather than an automaton, I am a thing, oozing and smelly, opening up to no-thing.

      ***

In The Employees by Olga Ravn, we are invited to listen to the testimonies of humans and human-like robots who together populate the Six-Thousand Ship (in Danish: menneskelig, menneskelignende – human, human-like; in Dutch: niet menselijk, maar mens-gelijk – not human, but human-like). They conduct research on a planet that looks as much like Earth as the humanoids look like humans. There are things there too, living things, with smells and a certain haptic signature. These things shatter the binary opposition of human and machine. Where first there were dreams and desires on one side – the human side – and the harsh promise of immortality and updateability on the other – the humanoid one – in the end everyone is in doubt about who is who, who is human and who machine. The machine starts to recollect what can’t be remembered; the human longs for an algorithm that could be updated. All ask: who am I?
      Whether human or humanoid, the employees testify to the gap within them, which makes them vulnerable. And of course, in this vulnerability lies their power.
      When they are asked about their experiences with these strange, ectoplasmic objects, it becomes clear that even if the mystery remains, the act of testifying individualises one. You could even read their statements as a form of therapy. Remarkably, however, the individualisation is also a deepened entanglement with the other. It is followed by communion. And finally: riot, rebellion.

Dori Laub remarks that testimony – to war, to trauma, to a crime, to life – is fragmented and unpredictable, and also a new articulation of perspective. It speaks of that which can’t be put into existing words; that’s why it stumbles and breaks, why the language must be bricolaged, translated, reworded, cut and pasted.
      The interviewing committee on the spaceship points to something more: a testimony cannot exist without someone listening – even if the listener is a committee out to police. Dori Laub even says: the listener precedes the testimony.
      So, say I am an employee. You don’t know whether I’m human or human-like, and it doesn’t matter, that distinction is obsolete. You listen to me. What do you hear? And what then comes up from the depths of your soul?

