Please use this identifier to cite or link to this item:
https://rfos.fon.bg.ac.rs/handle/123456789/1900

Title: Making hospital readmission classifier fair - What is the cost?
Authors: Radovanović, Sandro; Petrović, Andrija; Delibašić, Boris; Suknović, Milija
Keywords: Machine Learning; Hospital Readmission; Fairness; Bias Mitigation
Issue Date: 2019
Publisher: Faculty of Organization and Informatics, University of Zagreb, Varaždin
Abstract: Creating predictive models with machine learning algorithms is often understood as a task in which the data scientist feeds data to the algorithm without much intervention. With the rise of ethics in machine learning, predictive models also need to be made fair. In this paper, we inspect the effects of pre-processing, in-processing, and post-processing techniques for making predictive models fair. These techniques are applied to the hospital readmission prediction problem, where gender is considered a sensitive attribute. The goal of the paper is to check whether unwanted discrimination between female and male patients exists in the logistic regression model and, if it does, to alleviate the problem by making the classifier fair. We employed a logistic regression model that obtained AUC = 0.7959 and AUPRC = 0.5263. We show that the reweighting strategy offers a good trade-off between fairness and predictive performance: fairness is greatly improved without sacrificing much predictive performance. We also show that adversarial debiasing is a good technique that combines predictive performance and fairness, and that the Equality of Odds technique optimizes the Theil index.
URI: https://rfos.fon.bg.ac.rs/handle/123456789/1900
ISSN: 1847-2001
Appears in Collections: Radovi istraživača / Researchers’ publications
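The abstract covers three families of bias-mitigation techniques: pre-processing (reweighting), in-processing (adversarial debiasing), and post-processing (Equality of Odds). Below is a minimal, self-contained sketch, not the authors' code, of the reweighting pre-processing step applied to a logistic regression classifier, evaluated with AUC, AUPRC, and a simple group-fairness gap (statistical parity difference). The data is synthetic, and the column names (`gender`, `x1`, `x2`) and the weighting helper are hypothetical stand-ins for the hospital readmission dataset used in the paper.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, roc_auc_score
from sklearn.model_selection import train_test_split


def reweighting_weights(sensitive, label):
    """Kamiran-Calders style reweighting: weight each instance by
    P(s) * P(y) / P(s, y) so that the sensitive attribute and the label
    become statistically independent in the weighted training sample."""
    s = pd.Series(np.asarray(sensitive))
    y = pd.Series(np.asarray(label))
    weights = np.zeros(len(y))
    for sv in s.unique():
        for yv in y.unique():
            mask = ((s == sv) & (y == yv)).to_numpy()
            p_expected = (s == sv).mean() * (y == yv).mean()
            p_observed = mask.mean()
            if p_observed > 0:
                weights[mask] = p_expected / p_observed
    return weights


# Synthetic stand-in for the hospital readmission data: the outcome depends
# on the features and, undesirably, on gender, so the baseline classifier
# picks up the gender signal.
rng = np.random.default_rng(0)
n = 4000
gender = rng.integers(0, 2, n)          # 0/1-coded sensitive attribute
x1, x2 = rng.normal(size=n), rng.normal(size=n)
logit = 0.8 * x1 - 0.5 * x2 + 1.0 * gender - 0.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
X = pd.DataFrame({"x1": x1, "x2": x2, "gender": gender})

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
w = reweighting_weights(X_tr["gender"], y_tr)
reweighted = LogisticRegression(max_iter=1000).fit(X_tr, y_tr, sample_weight=w)

for name, model in [("baseline", baseline), ("reweighted", reweighted)]:
    proba = model.predict_proba(X_te)[:, 1]
    pred = (proba >= 0.5).astype(int)
    # Statistical parity difference: gap between the groups' positive
    # prediction rates (closer to 0 means a fairer classifier).
    rates = pd.Series(pred).groupby(X_te["gender"].to_numpy()).mean()
    print(f"{name}: AUC={roc_auc_score(y_te, proba):.4f} "
          f"AUPRC={average_precision_score(y_te, proba):.4f} "
          f"SPD={rates.max() - rates.min():.4f}")
```

The in-processing (adversarial debiasing) and post-processing (Equality of Odds) techniques mentioned in the abstract require more machinery; fairness toolkits such as IBM's AIF360 ship implementations of all three families, although the paper does not state which implementation was used.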