
Local data debiasing for fairness based on generative adversarial training



Aivodji, Ulrich, Bidet, François, Gambs, Sébastien, Ngueveu, Rosin Claude and Tapp, Alain. 2021. "Local data debiasing for fairness based on generative adversarial training". Algorithms, vol. 14, no. 3.

Aivodji-U-2021-23913.pdf - Published Version
Use licence: Creative Commons CC BY.



The widespread use of automated decision processes in many areas of our society raises serious ethical issues with respect to the fairness of the process and the possible resulting discrimination. To address this issue, we propose a novel adversarial training approach called GANSan for learning a sanitizer whose objective is to prevent the possibility of any discrimination (i.e., direct and indirect) based on a sensitive attribute by removing the attribute itself as well as its existing correlations with the remaining attributes. Our method GANSan is partially inspired by the powerful framework of generative adversarial networks (in particular Cycle-GANs), which offers a flexible way to learn a distribution empirically or to translate between two different distributions. In contrast to prior work, one of the strengths of our approach is that the sanitization is performed in the same space as the original data, modifying only the other attributes and as little as possible, thus preserving the interpretability of the sanitized data. Consequently, once the sanitizer is trained, an individual can apply it locally to their own profile before releasing it. Finally, experiments on real datasets demonstrate the effectiveness of the approach as well as the achievable trade-off between fairness and utility.
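To make the adversarial idea concrete, here is a heavily simplified sketch of sanitizer-versus-discriminator training. The paper's GANSan uses deep generative networks (Cycle-GAN-inspired); everything below — the toy data, the linear sanitizer, the logistic-regression discriminator, and the hyperparameters — is a hypothetical illustration of the general technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two features whose means leak a binary sensitive attribute s.
n = 400
s = rng.integers(0, 2, n)                         # sensitive attribute
X = rng.normal(0.0, 1.0, (n, 2)) + 1.5 * s[:, None]

W = np.eye(2)           # sanitizer: X -> X @ W, initialized to the identity
w = np.zeros(2)         # discriminator weights (logistic regression)
b = 0.0
alpha, lr = 2.0, 0.05   # fairness/utility trade-off and learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(300):
    Xs = X @ W                                    # sanitized data
    # Discriminator step: learn to predict s from the sanitized data.
    p = sigmoid(Xs @ w + b)
    w -= lr * Xs.T @ (p - s) / n
    b -= lr * np.mean(p - s)
    # Sanitizer step: stay close to X (utility) while *raising* the
    # discriminator's loss (fairness), weighted by alpha.
    p = sigmoid(Xs @ w + b)
    g_rec = 2.0 * X.T @ (Xs - X) / n              # grad of ||Xs - X||^2 / n
    g_adv = X.T @ ((p - s)[:, None] * w[None, :]) / n  # grad of the log-loss
    W -= lr * (g_rec - alpha * g_adv)

# After training, the between-group mean gap should shrink, i.e. the
# sanitized data carries less information about s.
gap_before = np.linalg.norm(X[s == 1].mean(0) - X[s == 0].mean(0))
gap_after = np.linalg.norm((X @ W)[s == 1].mean(0) - (X @ W)[s == 0].mean(0))
print(f"group gap: {gap_before:.2f} -> {gap_after:.2f}")
```

Raising `alpha` pushes the sanitized data further toward group indistinguishability at the cost of reconstruction error, mirroring the fairness/utility trade-off discussed in the abstract.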

Item Type: Peer reviewed article published in a journal
Author: Aïvodji, Ulrich
Affiliation: Autres
Date Deposited: 28 Jan 2022 20:48
Last Modified: 03 Mar 2022 16:09
