Much of the literature produced in recent years has tried to systematize the various forms of platform work while considering how to (re)classify those working through digital platforms. While this contributes to our understanding of the nature of work in the digital economy and the extent to which current labour law is problematic for some platform workers, the gender dimension and its implications, especially the growing concern about discrimination between men and women in the digital economy, appear to have received little attention so far. A recent study showed that some female platform workers receive lower pay than their male counterparts. As some online platforms use algorithms to determine pay levels, the key question addressed here is the extent to which current EU gender equality law, and the principle of equal pay for women and men in particular, is adequate for protecting platform workers in a situation where work-related decisions are not taken by a human being but by an algorithm that is the potential source of discrimination. To understand how regulation should be ‘calibrated’ in cases where algorithms result in discrimination, the theory of classification bias can be helpful. It is assumed that the rationale for confining protection under the equal pay principle to a specific group of employees is not compelling enough to justify excluding platform workers who are classified as self-employed. This article starts with a brief examination of the challenges of working in the digital economy and then goes on to analyse the role of algorithms and their potential to discriminate on the basis of gender. It is argued that the theory of classification bias could be used to address discriminatory algorithmic decision-making. The theory is then applied to the EU’s principle of equal pay for women and men, suggesting some improvements in relation to platform work.