Please use this identifier to cite or link to this resource: https://doi.org/10.21256/zhaw-25909
Publication type: Conference paper
Type of review: Peer review (publication)
Title: Enforcing group fairness in algorithmic decision making : utility maximization under sufficiency
Author(s): Baumann, Joachim
Hannák, Anikó
Heitz, Christoph
et al.: No
DOI: 10.1145/3531146.3534645
10.21256/zhaw-25909
Conference proceedings: Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency
Page(s): 2315–2326
Conference details: 5th ACM Conference on Fairness, Accountability, and Transparency (FAccT), Seoul, Republic of Korea, 21-24 June 2022
Publication date: 2022
Publisher / editing institution: Association for Computing Machinery, New York
ISBN: 978-1-4503-9352-2
Language: English
Keywords: Computing methodology; Machine learning; Applied computing; Decision analysis; Group fairness metrics; Algorithmic fairness; Prediction-based decision making
Subject area (DDC): 005: Computer programming, programs and data
658.403: Decision making, information management
Abstract: Binary decision making classifiers are not fair by default. Fairness requirements are an additional element to the decision making rationale, which is typically driven by maximizing some utility function. In that sense, algorithmic fairness can be formulated as a constrained optimization problem. This paper contributes to the discussion on how to implement fairness, focusing on the fairness concepts of positive predictive value (PPV) parity, false omission rate (FOR) parity, and sufficiency (which combines the former two). We show that group-specific threshold rules are optimal for PPV parity and FOR parity, similar to well-known results for other group fairness criteria. However, depending on the underlying population distributions and the utility function, we find that sometimes an upper-bound threshold rule for one group is optimal: utility maximization under PPV parity (or FOR parity) might thus lead to selecting the individuals with the smallest utility for one group, instead of selecting the most promising individuals. This result is counter-intuitive and in contrast to the analogous solutions for statistical parity and equality of opportunity. We also provide a solution for the optimal decision rules satisfying the fairness constraint sufficiency. We show that more complex decision rules are required and that this leads to within-group unfairness for all but one of the groups. We illustrate our findings based on simulated and real data.
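To make the fairness notions in the abstract concrete, the following is a minimal Python sketch (not the authors' implementation): it applies hypothetical group-specific decision thresholds to simulated risk scores and reports PPV and FOR per group. PPV parity and FOR parity hold when the respective values match across groups, and sufficiency requires both at once. All data, the score distributions, and the threshold values are illustrative assumptions.

# Minimal illustration (not the paper's code): group-specific threshold
# decisions and the resulting PPV / FOR per group.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: scores p = P(Y=1 | X), outcomes y drawn consistently
# with the scores, and a binary group attribute (0 = group a, 1 = group b).
n = 10_000
group = rng.integers(0, 2, size=n)
score = rng.beta(2 + group, 2, size=n)   # group-dependent score distributions
y = rng.binomial(1, score)

# Hypothetical group-specific thresholds; the paper studies how to choose such
# rules optimally (an upper-bound rule would instead select scores below a cutoff).
thresholds = {0: 0.55, 1: 0.65}
decision = score >= np.where(group == 0, thresholds[0], thresholds[1])

def ppv_and_for(d, y_true):
    """PPV = P(Y=1 | D=1) and FOR = P(Y=1 | D=0) of decisions d."""
    ppv = y_true[d].mean() if d.any() else float("nan")
    fomr = y_true[~d].mean() if (~d).any() else float("nan")
    return ppv, fomr

for g, name in [(0, "group a"), (1, "group b")]:
    mask = group == g
    ppv, fomr = ppv_and_for(decision[mask], y[mask])
    print(f"{name}: PPV = {ppv:.3f}, FOR = {fomr:.3f}")

# Equal PPV across groups: PPV parity. Equal FOR across groups: FOR parity.
# Sufficiency, as used in the paper, requires both simultaneously.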
URI: https://digitalcollection.zhaw.ch/handle/11475/25909
Full text version: Accepted version
License (according to publishing contract): License according to publishing contract
Department: School of Engineering
Organisational unit: Institut für Datenanalyse und Prozessdesign (IDP)
Published as part of the ZHAW project: Algorithmic Fairness in data-based decision making: Combining ethics and technology
Appears in collections: Publikationen School of Engineering

Files in this resource:
File: 2022_Baumann-Hannak-Heitz_Enforcing-group-fairness-algorithmic-decision-making_ACM.pdf (Accepted Version, 2.07 MB, Adobe PDF)
Baumann, J., Hannák, A., & Heitz, C. (2022). Enforcing group fairness in algorithmic decision making : utility maximization under sufficiency [Conference paper]. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 2315–2326. https://doi.org/10.1145/3531146.3534645
Baumann, J., Hannák, A. and Heitz, C. (2022) ‘Enforcing group fairness in algorithmic decision making : utility maximization under sufficiency’, in Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. New York: Association for Computing Machinery, pp. 2315–2326. Available at: https://doi.org/10.1145/3531146.3534645.
J. Baumann, A. Hannák, and C. Heitz, “Enforcing group fairness in algorithmic decision making : utility maximization under sufficiency,” in Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 2022, pp. 2315–2326. doi: 10.1145/3531146.3534645.
BAUMANN, Joachim, Anikó HANNÁK und Christoph HEITZ, 2022. Enforcing group fairness in algorithmic decision making : utility maximization under sufficiency. In: Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. Conference paper. New York: Association for Computing Machinery. 2022. S. 2315–2326. ISBN 978-1-4503-9352-2
Baumann, Joachim, Anikó Hannák, and Christoph Heitz. 2022. “Enforcing Group Fairness in Algorithmic Decision Making : Utility Maximization under Sufficiency.” Conference paper. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 2315–26. New York: Association for Computing Machinery. https://doi.org/10.1145/3531146.3534645.
Baumann, Joachim, et al. “Enforcing Group Fairness in Algorithmic Decision Making : Utility Maximization under Sufficiency.” Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, Association for Computing Machinery, 2022, pp. 2315–26, https://doi.org/10.1145/3531146.3534645.


All resources in this repository are protected by copyright, unless otherwise indicated.