Platforms, algorithms and AI: Issues and hypotheses in the mediatization perspective

…the broadcast yourself of amateurs), and Twitter, which enabled organic agents/enunciators (individuals and groups) to manage their own "means of communication." Corporate technology companies are likely not media (a long debate, as is known), but that does not imply that they have not provided each agent/enunciator (more than half of the world's population) with the possibility of managing their own "means of communication." Since then, they have not stopped radically changing the public, private, and intimate space and the way of producing, publishing, receiving, and appropriating discourses produced by all kinds of institutional agents (from those generated by the Catholic Church or FIFA to the amateur user). In other words, they have given rise to a contemporary scene (nowadays, there are more means of communication than ever in history, with different statuses and much more "communication").

Algorithms: a few words about the complex field of research that goes from algorithms to A.I.

There seems to be no doubt that these developments are already affecting all levels of analysis, from micro to macro and vice versa. And that, as with other fields of research, there will be two approaches: one anthropocentric, which already seems dominant because it focuses directly on the problem of inequality,13 and another non-anthropocentric, which should focus more on the transformations that non-human-scale automatism introduces at all levels of semiosis (which it has transformed into a non-anthropocentric network). Since there is so much to say about this, and since so little has been said from the perspective of a mediatization focused on meaning, in this text we will limit ourselves to two observations.

The first concerns the level of "production of meaning." Although specific studies and publications are lacking, there is no doubt that it radically affects all levels. The main current

13 See Crawford: "A.I. is not a neutral computational technique that makes determinations without human direction. Its systems are embedded in social, political, cultural, and economic worlds, delineated by humans, institutions, and imperatives that determine what they do and how they do it. They are designed to discriminate, expand hierarchies, and encode narrow classifications. When applied in social contexts such as the police, the legal system, health care, and education, they can reproduce, optimize, and amplify structural inequalities. This is not fortuitous. A.I. systems are built to see and intervene in the world in ways that primarily benefit the states, institutions, and corporations they serve" (Crawford, 2022: 321).
