
Commit 69e30d5 ("code review edits")
1 parent: 37016ab

10 files changed (+27, -16 lines)

README.md (+1)

@@ -158,6 +158,7 @@ Options:
 - [ ] **A.1 Informed consent**: If there are human subjects, have they given informed consent, where subjects affirmatively opt-in and have a clear understanding of the data uses to which they consent?
 - [ ] **A.2 Collection bias**: Have we considered sources of bias that could be introduced during data collection and survey design and taken steps to mitigate those?
 - [ ] **A.3 Limit PII exposure**: Have we considered ways to minimize exposure of personally identifiable information (PII) for example through anonymization or not collecting information that isn't relevant for analysis?
+- [ ] **A.4 Downstream bias mitigation**: Have we considered ways to enable testing downstream results for biased outcomes (e.g. collecting data on protected group status like race or gender)?

 ## B. Data Storage
 - [ ] **B.1 Data security**: Do we have a plan to protect and secure data (e.g., encryption at rest and in transit, access controls on internal users and third parties, access logs, and up-to-date software)?

deon/assets/checklist.yml (+2, -2; the `sections:` edit differs only in whitespace)

@@ -1,5 +1,5 @@
 title: Data Science Ethics Checklist
-sections:
+sections:
   - title: Data Collection
     section_id: A
     lines:
@@ -14,7 +14,7 @@ sections:
         line: Have we considered ways to minimize exposure of personally identifiable information (PII) for example through anonymization or not collecting information that isn't relevant for analysis?
       - line_id: A.4
         line_summary: Downstream bias mitigation
-        line: Have we considered ways to include demographic data (e.g. race, gender) where possible to enable testing downstream results for biased outcomes?
+        line: Have we considered ways to enable testing downstream results for biased outcomes (e.g. collecting data on protected group status like race or gender)?
   - title: Data Storage
     section_id: B
     lines:
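The checklist.yml diff above rewrites the text of the A.4 line. As a rough sketch of the schema this commit touches (a title, a list of sections, each with a list of lines carrying line_id, line_summary, and line), here is the fragment shown in the diff expressed as the equivalent parsed structure, with a hypothetical lookup helper; only section A and the new A.4 entry are reproduced:

```python
# Sketch of the checklist schema edited in this commit, as it would look
# after YAML parsing. Only the A.4 entry from the diff is included;
# find_line is a hypothetical helper, not part of deon itself.

checklist = {
    "title": "Data Science Ethics Checklist",
    "sections": [
        {
            "title": "Data Collection",
            "section_id": "A",
            "lines": [
                {
                    "line_id": "A.4",
                    "line_summary": "Downstream bias mitigation",
                    "line": (
                        "Have we considered ways to enable testing downstream "
                        "results for biased outcomes (e.g. collecting data on "
                        "protected group status like race or gender)?"
                    ),
                },
            ],
        },
    ],
}


def find_line(checklist, line_id):
    """Return the checklist item with the given line_id, or None."""
    for section in checklist["sections"]:
        for item in section["lines"]:
            if item["line_id"] == line_id:
                return item
    return None


print(find_line(checklist, "A.4")["line_summary"])
# prints "Downstream bias mitigation"
```

The nesting mirrors the YAML: each section groups its items under `lines`, and `line_id` is the stable key other assets reference.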

deon/assets/examples_of_ethical_issues.yml (+8, -8)

@@ -18,12 +18,12 @@
       url: https://www.wired.com/2007/12/why-anonymous-data-sometimes-isnt/
 - line_id: A.4
   links:
-    - text: Reports of credit bias in Apple card credit limits spark negative press and investigation.
-      url: https://www.wired.com/story/the-apple-card-didnt-see-genderand-thats-the-problem/
+    - text: In six major cities, Amazon's same day delivery service excludes many predominantly black neighborhoods.
+      url: https://www.bloomberg.com/graphics/2016-amazon-same-day/
     - text: Facial recognition software is significanty worse at identifying people with darker skin.
       url: https://www.theregister.co.uk/2018/02/13/facial_recognition_software_is_better_at_white_men_than_black_women/
     - text: -- Related academic study.
-      url: http://proceedings.mlr.press/v81/buolamwini18a.html
+      url: http://proceedings.mlr.press/v81/buolamwini18a.html
 - line_id: B.1
   links:
     - text: Personal and financial data for more than 146 million people was stolen in Equifax data breach.
@@ -60,7 +60,7 @@
   links:
     - text: Misleading chart shown at Planned Parenthood hearing distorts actual trends of abortions vs. cancer screenings and preventative services.
       url: https://www.politifact.com/truth-o-meter/statements/2015/oct/01/jason-chaffetz/chart-shown-planned-parenthood-hearing-misleading-/
-    - text: Georgia Dept. of Health graph of COVID-19 cases falsely suggests a steeper decline when dates are ordered by total cases rather than chronologically
+    - text: Georgia Dept. of Health graph of COVID-19 cases falsely suggests a steeper decline when dates are ordered by total cases rather than chronologically.
       url: https://www.vox.com/covid-19-coronavirus-us-response-trump/2020/5/18/21262265/georgia-covid-19-cases-declining-reopening
 - line_id: C.4
   links:
@@ -72,16 +72,18 @@
       url: https://www.bbc.com/news/magazine-22223190
 - line_id: D.1
   links:
-    - text: In six major cities, Amazon's same day delivery service excludes many predominantly black neighborhoods.
-      url: https://www.bloomberg.com/graphics/2016-amazon-same-day/
     - text: Variables used to predict child abuse and neglect are direct measurements of poverty, unfairly targeting low-income families for child welfare scrutiny.
       url: https://www.wired.com/story/excerpt-from-automating-inequality/
+    - text: Amazon scraps AI recruiting tool that showed bias against women.
+      url: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
     - text: Criminal sentencing risk asessments don't ask directly about race or income, but other demographic factors can end up being proxies.
       url: https://www.themarshallproject.org/2015/08/04/the-new-science-of-sentencing
     - text: Creditworthiness algorithms based on nontraditional criteria such as grammatic habits, preferred grocery stores, and friends' credit scores can perpetuate systemic bias.
       url: https://www.whitecase.com/publications/insight/algorithms-and-bias-what-lenders-need-know
 - line_id: D.2
   links:
+    - text: Apple credit card offers smaller lines of credit to women than men.
+      url: https://www.wired.com/story/the-apple-card-didnt-see-genderand-thats-the-problem/
     - text: Google Photos tags two African-Americans as gorillas.
       url: https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/#12bdb1fd713d
     - text: With COMPAS, a risk-assessment algorithm used in criminal sentencing, black defendants are almost twice as likely as white defendants to be mislabeled as likely to reoffend.
@@ -92,8 +94,6 @@
       url: https://www.liebertpub.com/doi/pdf/10.1089/big.2016.0047
     - text: Google's speech recognition software doesn't recognize women's voices as well as men's.
       url: https://www.dailydot.com/debug/google-voice-recognition-gender-bias/
-    - text: Amazon scraps AI recruiting tool that showed bias against women.
-      url: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
     - text: Google searches involving black-sounding names are more likely to serve up ads suggestive of a criminal record than white-sounding names.
       url: https://www.technologyreview.com/s/510646/racism-is-poisoning-online-ad-delivery-says-harvard-professor/
     - text: -- Related academic study.
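The examples file keys each list of links to a checklist item through its line_id, which is what this commit is shuffling: the Apple card example moves from A.4 to D.2, the Amazon same-day-delivery example moves from D.1 to A.4, and the Amazon recruiting example moves from D.2 to D.1. A minimal sketch of that lookup, assuming the parsed shape shown in the diff and reducing the data to two post-commit entries (`links_for` is a hypothetical helper, not deon's API):

```python
# Sketch: examples_of_ethical_issues.yml parses to a list of
# {line_id, links} records; each record attaches real-world examples
# to the checklist item that shares its line_id.

examples = [
    {
        "line_id": "A.4",
        "links": [
            {
                "text": "In six major cities, Amazon's same day delivery "
                        "service excludes many predominantly black "
                        "neighborhoods.",
                "url": "https://www.bloomberg.com/graphics/2016-amazon-same-day/",
            },
        ],
    },
    {
        "line_id": "D.2",
        "links": [
            {
                "text": "Apple credit card offers smaller lines of credit "
                        "to women than men.",
                "url": "https://www.wired.com/story/the-apple-card-didnt-see-genderand-thats-the-problem/",
            },
        ],
    },
]


def links_for(examples, line_id):
    """Collect every example link registered for one checklist line_id."""
    return [
        link
        for record in examples
        if record["line_id"] == line_id
        for link in record["links"]
    ]


print(len(links_for(examples, "A.4")))  # prints 1
```

Because the join key is `line_id` alone, moving an example between checklist items (as this commit does) only requires relocating its record in this file; the checklist YAML itself is untouched.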
