by Hans Schnitzler
‘First we shape our tools, thereafter they shape us’, wrote media theorist Marshall McLuhan a few decades ago. In times of data and algorithms this means that consumers and citizens alike are slowly but surely being shaped by smart systems.
For example: judges in the US tend to use computer programs to calculate whether a suspect is likely to reoffend. The bigger this risk, the heavier the verdict. As a recent investigation by the journalism platform ProPublica showed, African-American citizens received much higher risk scores than their white countrymen, regardless of whether they had ever been convicted of a criminal offense before. And in China a social credit system is being built in which citizens are rated and rewarded according to preferred modes of behavior.
These examples show the true techno-deterministic face of data and algorithms. We are being confronted with a grand new narrative in which life processes are understood as data processes and human beings are seen primarily as information-generating machines: biochemical algorithms whose meaning and outcome can be deciphered by sequences of zeros and ones and reconstructed through spreadsheets. Hence we are setting up a societal constellation ruled by correlations – that is, relations of probability – which never reveal why something is the case.
This new ideology is called dataism. It is a highly problematic ideology, because the “truth” of dataism sucks most meaning out of life; it is blind to the weight and implications of social contexts and personal narratives. It suggests absolute knowing, but implies absolute non-knowing; it replaces the ambiguity of why with an unrelenting it-is-so.
Despite this, data and algorithms, however prejudiced or deterministic they might be, function as fortune-tellers: they can predict actions before these actions have ever taken place; they can foresee the birth of a terrorist even before the potential terrorist is aware of his or her disruptive potential.
The temptation to hold citizens or consumers accountable for their possible intentions, or to make them pay for future behavior, is hard to resist – for insurers, lawmakers and law enforcers alike.
And so, slowly but surely, we are shifting the social and legal focus from actuality to potentiality, from actual deeds to intentions, from doing to thinking, and from facts to fiction. This is the central feature of so-called surveillance societies.
The consequences of this reality reach much further than worries about our privacy alone. To give an idea: in this whole process of ‘prepression’ – a term coined by the Dutch sociologist Willem Schinkel to signify practices in which prevention and repression go hand in hand – the presumption of innocence, one of the pillars of the rule of law, comes under pressure. Worse still, it tends toward a perverse reversal: you are guilty until proven innocent. And here we stumble onto the essence of terror as the German philosopher G.W.F. Hegel understood it: a situation in which suspicion immediately converts into conviction, and prosecution into liquidation.
Now more than ever, and before we further externalize our faculties – our capacity to act and make autonomous choices – to machines, we need some kind of consensus or common understanding of human dignity or, in more general terms, of what it means to be human. In which shape do we want to be?
This is the real challenge with which dataism confronts us. To accept this challenge means that techies and thinkers, designers and ethicists, have to work and think in close collaboration. They have to work together to figure out what it means to shape and to be shaped.
About the author
Hans Schnitzler is a philosopher, author, columnist and speaker. He is the author of “Het digitale proletariaat” (2015) and “Kleine filosofie van de digitale onthouding” (2017), both published by De Bezige Bij. He is a columnist for Follow the Money and a former columnist at de Volkskrant. His essays and columns have been published in NRC, NRCNext, Trouw and elsewhere.
Photo: Michiel van Nieuwekerk