The advent and implementation of tools that rely on algorithms to make decisions has further penalized specific social categories by normalizing inequalities in the name of efficiency and rationalization. I tried to deconstruct this narrative by highlighting the risks that automated processes and predictive models bring with them, specifically in terms of reinforcing inequalities. The digital underclass comprises both those who lack sufficient skills to escape algorithmic “suggestions” and those who, because of their personal and social characteristics, are penalized by automated decision-making software. The consequences of algorithmic decision-making in citizens’ everyday lives reinforce social inequalities: existing socio-economic disadvantages deepen, since citizens belonging to the digital underclass are fully or partly excluded from both the social and digital realms, or are penalized by the algorithmic software used, among other things, for profiling or processing applications.

The first level concerns inequalities in knowledge: the different levels of understanding of how algorithms influence everyday life, and the different skills and creative techniques available to escape algorithms’ “suggestions”. The second concerns inequalities in the key datasets that serve as foundations for algorithms and AI systems: these datasets are highly skewed in terms of race, wealth, gender and disability, following the main axes of social inequalities, so that inequalities are deeply embedded in the way the system is built. The third level concerns inequalities in treatment: if a system is designed by a small elite using a biased dataset that encodes social hierarchies and inequalities, the price will be paid by the weaker segments of the population. Biased data lead algorithms to discriminate against citizens, treating them differently based on socio-demographic and socio-economic features. This unequal treatment acts in two ways: a) granting or restricting access to services for certain social categories only, and b) monitoring and punishing certain categories more than others.

These three levels of (new) digital inequalities are tied both to the main axes of social inequalities and to the rise of digital technologies, and they affect citizens’ lives and the social hierarchy. I concluded by remarking that the negative effects of algorithmic decision-making on vulnerable population groups should remain at the centre of the political agenda and of research.

Thanks to Nico Carpentier, Andrea Medrado, Bruce Girard, Valeria Zamisch and the whole IAMCR team for launching this series of webinars.