
Dandl, Susanne; Pfisterer, Florian and Bischl, Bernd (ORCID: https://orcid.org/0000-0001-6002-6980) (2022): Multi-objective counterfactual fairness. GECCO '22, Genetic and Evolutionary Computation Conference, Boston, Massachusetts, July 9-13, 2022. In: Proceedings of the Genetic and Evolutionary Computation Conference Companion, New York: Association for Computing Machinery, pp. 328-331.


Abstract

When machine learning is used to automate judgments, e.g. in areas like lending or crime prediction, incorrect decisions can lead to adverse effects for affected individuals. This occurs, e.g., if the data used to train these models is based on prior decisions that are unfairly skewed against specific subpopulations. If models are to automate decision-making, they must account for these biases to prevent perpetuating or creating discriminatory practices. Counterfactual fairness audits models with respect to a notion of fairness that asks for equal outcomes between a decision made in the real world and a counterfactual world where the individual subject to the decision comes from a different protected demographic group. In this work, we propose a method to conduct such audits without access to the underlying causal structure of the data generating process, by framing the audit as a multi-objective optimization task that can be efficiently solved using a genetic algorithm.
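
To make the framing in the abstract concrete, the following is a minimal, self-contained sketch of the general idea: search for counterfactual inputs in which the protected attribute is flipped, score each candidate on several objectives (change in model output, distance from the original features), and evolve the candidates with a simple genetic algorithm. All names here (toy_model, the feature layout, the GA settings) are illustrative assumptions and not the authors' implementation or code.

import random

random.seed(0)

def toy_model(x):
    """Stand-in classifier score in [0, 1]; x = (income, debt, protected)."""
    income, debt, protected = x
    z = 0.04 * income - 0.06 * debt + 0.5 * protected
    return 1.0 / (1.0 + 2.718281828 ** (-z))

def objectives(candidate, original):
    """Two objectives to minimize:
    (1) change in model output relative to the factual decision,
    (2) distance of the mutable features from the original individual."""
    o1 = abs(toy_model(candidate) - toy_model(original))
    o2 = sum(abs(c - o) for c, o in zip(candidate[:2], original[:2]))
    return o1, o2

def dominates(a, b):
    """Pareto dominance: a is no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evolve(original, pop_size=40, generations=50):
    """Toy genetic algorithm returning a Pareto front of counterfactuals."""
    flipped = 1 - original[2]  # counterfactual world: protected attribute flipped
    pop = [(original[0] + random.gauss(0, 10),
            original[1] + random.gauss(0, 10),
            flipped) for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(objectives(ind, original), ind) for ind in pop]
        # Keep the non-dominated candidates as parents.
        front = [ind for f, ind in scored
                 if not any(dominates(g, f) for g, _ in scored)]
        # Refill the population with mutated copies of Pareto parents.
        pop = list(front)
        while len(pop) < pop_size:
            p = random.choice(front)
            pop.append((p[0] + random.gauss(0, 5),
                        p[1] + random.gauss(0, 5),
                        flipped))
    return front

if __name__ == "__main__":
    factual = (55.0, 30.0, 0)  # (income, debt, protected group = 0)
    for cf in evolve(factual)[:5]:
        print(cf, objectives(cf, factual))

In this sketch, counterfactuals whose first objective stays near zero suggest the model's decision is insensitive to the protected attribute for that individual; the actual paper uses a more principled multi-objective formulation and algorithm than this toy loop.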
