Clinical predictive models that include race as a predictor have the potential to exacerbate disparities in healthcare. Such models can be respecified to exclude race or optimized to reduce racial bias. We investigated the impact of such respecifications in a predictive model – UTICalc – which was designed to reduce catheterizations in young children with suspected urinary tract infections. To reduce racial bias, race was removed from the UTICalc logistic regression model and replaced with two new features. We compared the two versions of UTICalc using fairness and predictive performance metrics to understand the effects on racial bias. In addition, we derived three new models for UTICalc to specifically improve racial fairness. Our results show that, as predicted by previously described impossibility results, fairness cannot be simultaneously improved on all fairness metrics, and model respecification may improve racial fairness but decrease overall predictive performance.
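The core respecification described above, removing race from a logistic regression and comparing the resulting models on a fairness metric, can be illustrated with a minimal sketch. Everything here is hypothetical: the synthetic data, the feature names, and the choice of equal-opportunity difference (true-positive-rate gap between groups) as the fairness metric are stand-ins, not the actual UTICalc variables or the metrics used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical synthetic cohort standing in for UTICalc inputs;
# feature names and coefficients are illustrative only.
rng = np.random.default_rng(0)
n = 2000
race = rng.integers(0, 2, n)   # binary group indicator (illustrative)
fever = rng.normal(size=n)
age = rng.normal(size=n)
logit = 0.8 * fever - 0.5 * age + 0.4 * race
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_with_race = np.column_stack([fever, age, race])
X_without_race = np.column_stack([fever, age])  # race removed

def tpr_gap(model, X, y, group):
    """Equal-opportunity difference: |TPR(group=1) - TPR(group=0)|."""
    pred = model.predict(X)
    tprs = []
    for g in (0, 1):
        mask = (group == g) & (y == 1)
        tprs.append(pred[mask].mean())
    return abs(tprs[1] - tprs[0])

model_with = LogisticRegression().fit(X_with_race, y)
model_without = LogisticRegression().fit(X_without_race, y)

gap_with = tpr_gap(model_with, X_with_race, y, race)
gap_without = tpr_gap(model_without, X_without_race, y, race)
```

Note that a smaller TPR gap on one metric does not imply improvement on others; as the abstract states, known impossibility results prevent simultaneously satisfying all fairness criteria.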

More details

  • Use Cases: N/A
  • Limitations: N/A
  • Evidence: N/A
  • Owner's Insight: N/A

Warning: Not intended for clinical use. Assume outputs are unsafe and unvalidated. Use carefully.

  • Favorites: 0
  • Executions: 31

  • Clinical Informatics


Joshua Anderson
