🇸🇪 Plattformssamhället

In recent years, digital platforms have emerged as one of the most central concepts in the digital economy. Platforms enable a wealth of new, efficient ways of organising society, but they also rest on an element of governance, since human action must adapt to the computer code. A handful of platform-based companies (Google, Facebook, Apple, Amazon, Microsoft) have gained enormous global influence, with not only users but a whole range of other societal actors becoming dependent on the services of these giants.

At the same time, recent years have seen a host of smaller companies appear (many of them newly founded so-called startups, financed by venture capital) whose business models are based on various kinds of platforms. Even many of these smaller platform companies are, in many respects, dependent on the giants.

Together with Stefan Larsson, I put together an anthology in 2018 aimed at gathering a number of astute Swedish voices on issues related to platforms. The publication of the anthology Plattformssamhället (Fores) in early 2019 coincided with the internationally influential book The Platform Society (Oxford University Press) by the leading Dutch scholars José van Dijck, Thomas Poell and Martijn de Waal, which in particular highlights the rapid rise of platforms in sectors that in Europe have long been publicly funded but in the US are distinctly privately owned (urban transport, news production and dissemination, health care and education). In the autumn of 2018 I served as an external reviewer of the Dutch book, and the work on the Swedish platform book benefited from this international outlook.

Digital platforms can have progressive and even life-changing (beneficial) effects, but they can equally be used to control, manipulate and surveil people. The major platform companies have global reach, yet there is reason to assume that jurisdictions and political systems will remain national. Herein lies a range of challenges, several of which are addressed in the book.

Andersson Schwarz, J. & S. Larsson (Eds., 2019). Plattformssamhället: Den digitala utvecklingens politik, innovation och reglering. Stockholm: Fores.

Link

🇸🇪 Journalistikens roll i den nya medieekologin

In a short span of time, internet use has come to be dominated by a set of privately owned, advertising-funded digital platforms with great influence worldwide: Facebook, YouTube, Twitter, and so on. A strikingly large share of media use takes place on, or adjacent to, platforms of this kind.

This fact constitutes a new context and forms a backdrop to the study of mass media and journalism that is important both for structural analysis (economic flows, material conditions, power relations, etc.) and for analysing the conditions of knowledge production in this media landscape (truth claims, the privilege of problem formulation, monopolies of representation, etc.). Even media organisations that are not themselves active on these internet platforms have come to be affected, as they compete with the platforms and with the media organisations that are directly active on them.

Against this background, in this textbook chapter I take a media-ecological perspective on the conditions of journalism in the contemporary digital media landscape. By treating social media circulation and editorial media circulation as two distinct systems, which are nevertheless not separate but constantly interconnected and in many ways intertwined, we can better understand the conditions of internet-mediated journalism in our time.

Andersson, J. (2019). Journalistikens roll i den nya medieekologin. In: M. Karlsson & J. Strömbäck (Eds.) Handbok i journalistikforskning. Lund: Studentlitteratur. 409-422.

Link

🇬🇧 Umwelt and individuation: Digital signals and technical being

This chapter, which forms part of a deep and existentially far-reaching anthology on Digital Existence, is essentially a plea for a more responsive, cooperative information infrastructure. I address this by taking Facebook as an example.

Today’s digital landscape is quite literally premised on a theory of information that was in fact intended for machines: Claude Shannon’s mathematical theory of communication from 1948. Thus, the digital imaginary of our time is unfortunately of a very rigid, mute, non-vitalist kind; essentially inhuman.

My chapter is an attempt at reaching towards a more integrated, dynamic, vitalistic, and inclusive theory of digital information, by adopting the theory of Gilbert Simondon, a French philosopher of technology writing in the 1950s.

Simondon affirms technology as a symbiotic process, enabling a utopian future where humans and digital infrastructures can be allowed to truly co-habit this planet, in contrast with today’s mainstream paradigm, which rather seems to stipulate an alienated relationship to technology, with humans in one corner and machines in the other. In Simondon’s theory, the individual is not a being but an act, and individuality is always an aspect of generation, ever-evolving, an ongoing genesis.

This stands in stark contrast to prevailing technocratic “solutions” (apps, platforms, databases) that are essentially systems of control, where users are deprived of genuine participation and are at best offered limited forms of co-creation, always on the terms set by the proprietors or owners in question. At worst, the participation allowed for users is only illusory. The very act of trying to encapsulate human being into predefined, finite and locked-down boxes, trying to “pin down” individuals and groups by recourse to palimpsests intended to “freeze” system states as if these were reliable and objective snapshots of human behaviour, is reductive and regressive at its core.

Believe it or not: these rather outlandish epistemological convictions actually lie at the root of today’s tech companies that base their business models on behavioural data, leading the operatives inside these companies to pretend that the signals gathered are truthful and representative renditions of human behaviour.

What is more, once these operatives implement new applications based on the data that they are constantly gathering and feeding into algorithmic systems of behavioural manipulation and control, these systems actually begin to actively shape the real world that they are interacting with.

Soon, sinister feedback loops emerge: by observing the behaviours that these algorithmic systems prescribe, indeed dictate, users learn to behave in specific ways in order to navigate the interface as expected. In doing so, they are enticed to make further interactions which will, in turn, be farmed into new, interesting content for other users to interact with: think of how Facebook users are compelled to publish and share content that is expected to be desirable among their peers.

More importantly, every move a user makes is monitored and recorded, enabling the corporation to interpret these signals and select the content and advertisements that, based on what the signals are read to indicate, it believes the user will find interesting.

Moreover, users arguably also adapt their own behaviours to suit the algorithmic infrastructure: in order to maintain peer visibility, they are compelled to design their posts in accordance with what the algorithmic interface tends to value as popular or recognizable to a large audience (Gillespie, 2014: 183). This precipitates a kind of built-in conformism; a popularity bias (Webster, 2014).

Algorithms indirectly construct culture by way of feedback loops like this. Individuals seem to act based on what they observe that these semi-automated systems seem to value.
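This rich-get-richer dynamic can be illustrated with a toy simulation. The sketch below is my own illustration, not taken from the chapter; the specific numbers (twenty posts, a feed where ninety percent of clicks go to the current top three) are arbitrary assumptions chosen only to make the mechanism visible. The platform ranks posts by accumulated clicks, users mostly click whatever is ranked highest, and small early advantages compound into a pronounced popularity bias.

```python
import random

def simulate_popularity_bias(n_posts=20, n_rounds=1000, seed=42):
    """Toy model of an algorithmic feedback loop: the platform ranks
    posts by accumulated clicks, and users mostly click what is ranked
    highest, so early random advantages compound over time."""
    rng = random.Random(seed)
    clicks = [1] * n_posts  # every post starts with a single click
    for _ in range(n_rounds):
        # the "feed": posts ordered by popularity so far
        ranking = sorted(range(n_posts), key=lambda p: clicks[p], reverse=True)
        if rng.random() < 0.9:
            choice = ranking[rng.randrange(3)]   # users mostly pick from the top 3
        else:
            choice = rng.randrange(n_posts)      # occasional serendipitous discovery
        clicks[choice] += 1
    return clicks

clicks = simulate_popularity_bias()
top_share = sum(sorted(clicks, reverse=True)[:3]) / sum(clicks)
print(f"share of all clicks captured by the top 3 posts: {top_share:.0%}")
```

Despite all posts starting out identical, the three posts that happen to get ranked first end up absorbing the overwhelming majority of clicks, which is the built-in conformism described above in miniature.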

My argument, in brief

There is a funny thing though.

Do you see how the humans in the loop always have to second-guess what the system would prefer or predict? Essentially, the corporation makes educated guesses based on the vast amounts of user signals it collects, and tries to construct target groups so as to increase the chances that the ads advertisers place actually engage the users. Users themselves, in turn, try to “game” the system so that they can reap as many benefits as possible from using it.

Researchers like Taina Bucher and John Cheney-Lippold have come to similar conclusions.

In order to understand all of this better, let us think of these media-technological systems as Umwelten for individuals to roam through. The concept of Umwelt was developed in the early 20th century by the Baltic German biologist Jakob von Uexküll, and refers to the cognized environment, the “self-centered world” that every organism lives in. All organisms experience life in terms of subjective reference frames: a bumblebee is at the center of its own world, much like the Facebook user is at the center of her own world, uniquely personalised for her, by Facebook the corporation™.

So, as users interact with environments-that-are-unique-to-them-and-only-them, they simultaneously give off signals. After all, this is an environment built on surveillance, all the way through. These signals are instantly harvested by the platform proprietors and read as indicative of the assumed internal states of these individuals.

The really clever thing about this argument, though, is that we can also think of the platform infrastructure’s intelligence as a form of technical Umwelt unto itself!

Facebook doesn’t magically “know” you, as if we were dealing with some kind of sentient fairy-tale being, a Leviathan of some kind (although some critical scholars certainly seem to want to frame it that way!). The platform operators and managers can only “see” what takes place in the direct interactions, the actual “clicks” and measurable movements made. This is, quite literally, all that the automated systems have to go on. A system is a sum of its inputs. It is by compiling signals, encoded in the form of “behavioural data,” that the engineers, behavioural scientists and marketing experts who build and maintain this infrastructure make their decisions.

Consequently, we should not underestimate the degree to which the actual operatives inside the platform corporations are informed by estimations that risk being highly reductive, if not outright blind to many aspects of human life.

A stunning addition!

After finishing this chapter in 2018, I was reminded of the concept of affordances, pioneered by the psychologist J.J. Gibson in 1979. It is a bit embarrassing that his work hadn’t actually crossed my mind before. I’m schooled in a field somewhat indebted to continental philosophy and the Frankfurt school, so the work of an American mid-20th-century psychologist hadn’t really cropped up on my radar.

But, conversely, Gibson himself made no reference to Umwelt either.

Andersson Schwarz, J. (2018). Umwelt and individuation: Digital signals and technical being. In: A. Lagerkvist (Ed.) Digital Existence. London & New York: Routledge. 61-80.

Paywalled / contact me for access