
CFP Special Issue of JEA 8(2), 2024: Algorithmic Governance at Work

Call for Papers

Algorithmic Governance at Work: Interrogating Labour Power and Contemporary Epistemologies of Social Control

Special Issue of the Journal of Extreme Anthropology 8(2), 2024

Edited by: Tereza Østbø Kuldova & Inger Marie Hagen




Announcement on the Journal of Extreme Anthropology website: https://journals.uio.no/JEA/announcement/view/459


The rise of big data, machine learning, and artificial intelligence has ushered in not only a new era of ‘surveillance capitalism’ (Zuboff, 2019), but also new modes of social control: new ways in which we are governed, managed, policed, risk assessed, punished, nudged, or made more efficient and productive. We are not only subjected to, but also subjectivated by (Kuldova, 2021), the proliferating and more or less opaque algorithmic architectures and ‘roboprocesses’ (Besteman & Gusterson, 2019) that seek to shape not only our behaviours and decisions, but the very horizons of our thought and of ‘common sense’. ‘Algorithmic governance’ (Kalpokas, 2019; Katzenbach & Ulbricht, 2019; Kuldova et al., 2021) has become ubiquitous, accelerating pre-existing visions of techno-managerial and seemingly apolitical ‘governance by numbers’ (Supiot, 2017), or, in other words, visions of ‘engineering’ the social and humanity (Frischmann & Selinger, 2018) that see the world as ultimately programmable and humans as imperfect and fallible in comparison to machine perfection.

 

While much critical literature has emerged by now – pointing to the ‘black-boxed’ nature of algorithms and ‘dirty data’ (O’Neil, 2016; Pasquale, 2015), to the injustices and harms caused by algorithmic systems and the ways in which they reproduce and even exacerbate class, caste, racial, gender, and other forms of inequality and prejudice (Benjamin, 2019; Mbadiwe, 2018), to the unaccountable powers of the new surveillant gaze of the state, of corporations, but also of peers (Andrejevic, 2005; Lyon, 2003; Strittmatter, 2020; Zuboff, 2019), and to the undermining of (data) privacy and civil liberties (Lageson, 2020) – few have sought to tackle the epistemologies underpinning algorithmic governance and the ways in which they materialize and reinforce pre-existing governance trends, shaping the limits of the thinkable, the frames of our analysis, and our possibilities for resistance.

 

Hence, much of the academic and activist critique has been translated into calls for more transparency, better and ‘non-biased’ datasets, algorithmic auditing and risk assessments, and various other policy instruments aligned with the logic of accountability, audit, risk mitigation, and so-called ‘good governance’ (Hansen, 2015; Kuldova, 2022; Shore & Wright, 2015). In other words, the solutions being proposed rest on the same logic of ‘governance by numbers’ (Supiot, 2017) that is built into these systems in the first place. We thus see calls for ‘regulation by design’ (Almada, 2023) of data-driven technologies, that is, for embedding regulatory and legal compliance into the very code of algorithmic architectures, effectively delegating both rule-making power and the power to translate law and regulation to software designers and other knowledge brokers and intermediaries. We see calls for the rapid datafication of law itself, with some even arguing for the ‘personalization’ of law, where different people would be subject to different rules (Ben-Shahar & Porat, 2021), thus effectively proposing to do away with foundational legal concepts such as equality before the law. Markets are already flooded with data-driven compliance products that promise to ensure legal compliance, from sanction screening to anti-corruption (Kuldova, 2022). We could go on, but what is clear is that ‘technosolutionism’ (Morozov, 2013) has come to dominate the field of governance itself: crime, social problems, security threats, workers, and even the climate are to be tackled through ever-expanding techno-managerial algorithmic infrastructures that claim to be powered by ‘pure’ data and various metrics of evidence. We are witnessing a revival of naïve positivism that seeks to make the world measurable and thus controllable, reducing us all to numbers in the process.

 

While the effects and injustices stemming from algorithmic systems are by now widely debated, the epistemologies that underpin these various modes of algorithmic governance are still largely glossed over, despite the immense epistemic power of these systems – and despite the likelihood that future battles will be epistemic: over how we know the world, whose knowledge is to count, and what kind of world this knowledge brings forth. These questions become all the more acute as we are, at every step, governed, nudged, evaluated, and profiled by what are effectively knowledge products: products with built-in theories of behaviour, of risk, and even of morality, of the good and the bad. Algorithmic governance, as already manifest in these few examples, operates at several levels, often simultaneously: (1) at the level of the material expression of hegemonic modes of governance, ordering, and understanding of the social, encoding particular epistemologies and amplifying their epistemic power; (2) at the level of infrastructural power and ‘extrastatecraft’ (Easterling, 2016), including that of algorithmic architectures which steer us in particular ways while foreclosing alternative paths; and (3) at the level of coding laws, regulations, and guidelines into algorithmic architectures, that is, the power to interpret and translate these into products and practice, be it the laws to which society, citizens, or employees are to submit, or the regulations with which the software itself must comply. In this special issue, we seek to understand how this form of power plays out in the workplace and in the context of the labour movement, such as in the work of trade unions or in new forms of resistance to algorithmic governance and management.

 

Workplace monitoring, algorithmic management systems, automated decision-making support systems, performance quantification, and similar technologies represent a new form of workplace governance (Kuldova, 2022). The Covid-19 pandemic turbo-charged the implementation of practices and regimes that were already underway, in response to emergency conditions (Green & Fazi, 2023). While some herald digital technology as the tool to transform working practices in productive, innovative, and environmentally friendly ways, critical voices question the growth of surveillance and the use of AI to measure, quantify, and intensify patterns of work (Altenried, 2020; Ball, 2010; De Angelis & Harvie, 2009; Kuldova, 2021; Noponen et al., 2023; Sempill, 2001). While a number of scholars have focused on platform labour and gig workers and on the negative effects of surveillance, fewer have questioned the challenge that the ‘epistemic power’ of managerialism (Kuldova & Nordrik, 2023), now embedded in algorithmic architectures, poses not only to worker resistance, trade unions, and the labour movement, but also to the very possibility of reconceptualizing work and labour more fundamentally – beyond the quest for better working conditions.

 

This special issue invites scholars in the humanities and critical social sciences to engage with epistemologies of work and labour in relation to techno-managerialism, algorithmic management, and workplace governance. We are particularly interested in qualitative research grounded in fieldwork, but we also welcome theoretical pieces.

 

Interested contributors are encouraged to submit an abstract of 300 words and a short bio by 15 April 2024 to the journal’s Editor-in-Chief, Tereza Østbø Kuldova, and co-editor, Inger Marie Hagen, at tkuld@oslomet.no & imhagen@oslomet.no.

 

The deadline for full submissions is 1 September 2024. Peer-reviewed article submissions are expected to be between 7,000 and 9,000 words, following the journal guidelines. Other non-peer-reviewed contributions, such as book reviews, essays, or interviews, are also welcome. For submission guidelines and other details, please visit: https://journals.uio.no/JEA

 

This special issue is created with the financial support of the Research Council of Norway under project no. 314486 – Digital Prism and the Nordic Model of Workplace Democracy under Pressure (DigiWORK).

 

The Journal of Extreme Anthropology is an international, peer-reviewed, interdisciplinary, and indexed journal that publishes articles in the fields of anthropology, the social sciences, and the humanities, specializing in extreme and challenging subjects, practices, and theory.


 

References

Almada, M. (2023). Regulation by Design and the Governance of Technological Futures. European Journal of Risk Regulation, 1-13.

Altenried, M. (2020). The Platform as Factory: Crowdwork and the hidden labour behind artificial intelligence. Capital & Class, 44(2), 145-158.

Andrejevic, M. (2005). The Work of Watching One Another: Lateral Surveillance, Risk, and Governance. Surveillance & Society, 2(4), 479-497.

Ball, K. (2010). Workplace Surveillance: An Overview. Labor History, 51(1), 87-106.

Ben-Shahar, O., & Porat, A. (2021). Personalized Law: Different Rules for Different People. Oxford University Press.

Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press.

Besteman, C., & Gusterson, H. (Eds.). (2019). Life By Algorithms: How Roboprocesses Are Remaking Our World. University of Chicago Press.

De Angelis, M., & Harvie, D. (2009). ‘Cognitive Capitalism’ and the Rat-Race: How Capital Measures Immaterial Labour in British Universities. Historical Materialism, 17, 3-30.

Easterling, K. (2016). Extrastatecraft: The Power of Infrastructure Space. Verso.

Frischmann, B., & Selinger, E. (2018). Re-engineering Humanity. Cambridge University Press.

Green, T., & Fazi, T. (2023). The Covid Consensus: The Global Assault on Democracy and the Poor – A Critique from the Left. Hurst.

Hansen, H. K. (2015). Numerical operations, transparency illusions and the datafication of governance. European Journal of Social Theory, 18(2), 203-220.

Kalpokas, I. (2019). Algorithmic Governance: Politics and Law in the Post-Human Era. Palgrave Macmillan.

Katzenbach, C., & Ulbricht, L. (2019). Algorithmic Governance. Internet Policy Review, 8(4), 1-18.

Kuldova, T. (2021). The cynical university: Gamified subjectivity in Norwegian academia. Ephemera: Theory & Politics in Organization, 21(3), 1-29.

Kuldova, T. Ø. (2022). Compliance-Industrial Complex: The Operating System of a Pre-Crime Society. Palgrave Pivot.

Kuldova, T. Ø., & Nordrik, B. (2023). Workplace Investigations, the Epistemic Power of Managerialism, and the Hollowing Out of the Norwegian Model of Co-determination. Capital & Class. https://doi.org/10.1177/03098168231179971

Kuldova, T. Ø., Wathne, C. T., & Nordrik, B. (2021). Editorial: Algorithmic Governance: Fantasies of Social Control. Journal of Extreme Anthropology, 5(1), i-v.

Lageson, S. E. (2020). Digital Punishment: Privacy, Stigma, and the Harms of Data-Driven Criminal Justice. Oxford University Press.

Lordon, F. (2014). Willing Slaves of Capital: Spinoza & Marx on Desire. Verso.

Lyon, D. (Ed.). (2003). Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination. Routledge.

Mbadiwe, T. (2018). Algorithmic Injustice. The New Atlantis: A Journal of Technology & Society, Winter, 3-28.

Morozov, E. (2013). To Save Everything, Click Here. Public Affairs.

Noponen, N., Feschenko, P., Auvinen, T., Luoma-aho, V., & Abrahamsson, P. (2023). Taylorism on Steroids or Enabling Autonomy? A Systematic Review of Algorithmic Management. Management Review Quarterly, Online First.

O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.

Pasquale, F. (2015). The Black Box Society: The Secret Algorithms that Control Money and Information. Harvard University Press.

Sempill, J. (2001). Under the Lens: Electronic Workplace Surveillance. Australian Journal of Labour Law, 14, 1-34.

Shore, C., & Wright, S. (2015). Audit Culture Revisited: Rankings, Ratings, and the Reassembling of Society. Current Anthropology, 56(3), 421-444.

Strittmatter, K. (2020). We Have Been Harmonised: Life in China’s Surveillance State. Old Street Publishing.

Supiot, A. (2017). Governance by Numbers: The Making of a Legal Model of Allegiance. Bloomsbury.

Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.


