The use of data to profile and make semi-automated decisions about ‘data subjects’ in matters of citizenship, targeted advertising, loan offers and predictions of criminal recidivism has become thoroughly normalised. However, the use of data to semi-automate tasks, and to analyse and measure work and productivity through algorithmic management, occurs within different social relations and holds different implications for working data subjects than does data collection and use among other populations. European privacy and data protection law, as well as occupational safety and health law, is heralded as the strongest in the world. Yet it insufficiently recognises that working data subjects occupy subjective positions distinct from those of other data subjects, and it does not account for the structural features of inequality at work within a capitalist data political economy. Recent findings from Moore et al.’s research for the European Agency for Safety and Health at Work (EU-OSHA) demonstrate that workers face significant risks in semi-automated environments where cognitive AI systems are increasingly integrated. This argument contributes to policy and legal studies, the sociology of work, and critical political economy research, showing that overlooking the analogue social relations of data production and the rampant semi-automation of both tasks and critical decision-making will significantly weaken the capacity to protect workers from data harms and to preserve worker agency.
Prof Phoebe V Moore, University of Essex, is an expert on the quantified self at work and on international policy related to monitoring technologies, data use and surveillance in workplaces. Moore has published several works on technology at work and workers’ rights, including The Quantified Self in Precarity: Work, Technology and What Counts, and is regularly consulted and commissioned by the European Commission, the European Agency for Safety and Health at Work, and the International Labour Organization on these topics. phoebevmoore.wordpress.com