Publication type: Conference paper
Type of review: Peer review (publication)
Title: A systematic approach to group fairness in automated decision making
Authors: Hertweck, Corinna
Heitz, Christoph
et al.: No
DOI: 10.1109/SDS51136.2021.00008
Proceedings: Proceedings of the 8th SDS
Pages: 1
Pages to: 6
Conference details: 8th Swiss Conference on Data Science, Lucerne, Switzerland, 9 June 2021
Issue Date: 7-Jul-2021
Publisher / Ed. Institution: IEEE
ISBN: 978-1-6654-3874-2
Language: English
Subjects: Algorithmic fairness; Group fairness; Statistical parity; Independence; Separation; Sufficiency
Subject (DDC): 006: Special computer methods
170: Ethics
Abstract: While the field of algorithmic fairness has brought forth many ways to measure and improve the fairness of machine learning models, these findings are still not widely used in practice. We suspect that one reason for this is that the field has produced a large number of fairness definitions, which are difficult to navigate. The goal of this paper is to provide data scientists with an accessible introduction to group fairness metrics and to give some insight into the philosophical reasoning for caring about these metrics. We do this by considering the sense in which socio-demographic groups are compared when making a statement about fairness.
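As a minimal illustration of one of the group fairness metrics named in the subject terms, the sketch below computes the statistical parity (independence) difference between two groups. The group labels and decision vectors are hypothetical, not taken from the paper:

```python
# Hedged sketch: statistical parity (the independence criterion) for a
# binary decision system. All data below is illustrative.

def selection_rate(decisions):
    """Fraction of positive decisions (e.g. applicants accepted)."""
    return sum(decisions) / len(decisions)

# Hypothetical binary decisions for two socio-demographic groups
group_a = [1, 0, 1, 1, 0, 1]  # selection rate 4/6
group_b = [1, 0, 0, 1, 0, 0]  # selection rate 2/6

# Statistical parity holds when the selection rates are (approximately)
# equal across groups; the difference quantifies the violation.
parity_diff = selection_rate(group_a) - selection_rate(group_b)
print(round(parity_diff, 3))  # 4/6 - 2/6 = 1/3, printed as 0.333
```

A difference of zero would mean the decision is independent of group membership; separation and sufficiency, also listed among the subjects, instead condition this comparison on the true outcome or on the prediction, respectively.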
Fulltext version: Accepted version
License (according to publishing contract): Licence according to publishing contract
Departement: School of Engineering
Organisational Unit: Institute of Data Analysis and Process Design (IDP)
Published as part of the ZHAW project: Socially acceptable AI and fairness trade-offs in predictive analytics
Appears in collections: Publikationen School of Engineering

Files in This Item:
File: 2021_Hertweck-Heitz_Group-fairness-automated-decision-making.pdf (Accepted version, 107.87 kB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.