**C.2 Dataset bias**: Have we examined the data for possible sources of bias and taken steps to mitigate or address these biases (e.g., stereotype perpetuation, confirmation bias, imbalanced classes, or omitted confounding variables)? | <ul><li>[A widely used commercial algorithm in the healthcare industry underestimates the care needs of black patients, assigning them lower risk scores than equivalently sick white patients.](https://www.nature.com/articles/d41586-019-03228-6)</li><li>[-- Related academic study.](https://science.sciencemag.org/content/366/6464/447)</li><li>[word2vec, trained on the Google News corpus, reinforces gender stereotypes.](https://www.technologyreview.com/s/602025/how-vector-space-mathematics-reveals-the-hidden-sexism-in-language/)</li><li>[-- Related academic study.](https://arxiv.org/abs/1607.06520)</li><li>[Women are less likely than men to be shown Google ads for high-paying jobs.](https://www.theguardian.com/technology/2015/jul/08/women-less-likely-ads-high-paid-jobs-google-study)</li></ul>
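One of the checks this item names, imbalanced classes, can be sketched in a few lines of Python. The `labels` column and class names below are hypothetical, and in practice you would run this over the actual outcome and sensitive-attribute columns of your dataset:

```python
from collections import Counter

def class_balance(labels):
    """Return each class's share of the dataset, largest first."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: n / total for cls, n in counts.most_common()}

# Hypothetical, heavily imbalanced binary outcome column.
labels = ["negative"] * 90 + ["positive"] * 10
print(class_balance(labels))  # {'negative': 0.9, 'positive': 0.1}
```

A severely skewed distribution here (or a skew that differs across demographic subgroups) is a signal to rebalance, reweight, or collect more data before training.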