---
title: Ethics in Research
author: Neil Ernst
format:
  revealjs:
    theme: solarized
    smaller: true
    scrollable: true
    incremental: true
    footer: "©️ Neil Ernst"
---
## Learning Objectives
* Appreciate the need to develop an ethical framework for your research
* Learn about ethical implications of technology
* Acquire the Tri-Council certificate in human subjects research
## Notes
- Distinguish between the ethics of human subjects research, the ethics of the research goals themselves (the latter is usually not formally reviewed, for academic freedom reasons, but is subject to public opinion), and ethical conduct in research (plagiarism, faked data, self-citation, review circles).
- Ethics is rarely black and white. We have to balance competing priorities, and a decision may not seem unethical until later. The pressure in academia can feel intense! Reach out to university counselling resources or trusted professors if you feel you need some help.
## Human Subjects and Ethical Research
- Discuss why we need such a review.
- Neil's sample application.
- Some examples of ethics applications.
## Ethical Research Directions
I don't believe technology is value-neutral. Algorithms, for example, can encode the biases of the data and of the implementer. Many examples of seriously biased and/or destructive technology exist, ranging from the use of [IBM card readers as part of the Nazi genocide and Holocaust](https://www.huffpost.com/entry/ibm-holocaust_b_1301691), to atomic weapons, to, most recently, [biased search results and bail decisions](http://algorithmsofoppression.com/). Bitcoin mining consumes as [much energy as Switzerland](https://www.bbc.com/news/technology-48853230)!
There is a growing recognition (and pushback, to be fair) that technology, and research into it, has consequences. One example is how NeurIPS treated ethical directions in ML research[^1] by requiring a dedicated ethics statement assessing the approach. What are your thoughts on this?
## Black Mirror Exercise: speculate!
- Watch [a snippet (caution: language)](https://www.youtube.com/watch?v=YrpK90bHO2U)
- Get into small groups of three. I will pick a leader; the three of you will take the leader's research area and explore possible ethical implications, such as social media privacy, algorithmic bias, online harassment, intellectual property, and misinformation.
- Where could technology or a social system go next that would be worthy of a *Black Mirror* episode? What might be the Light Mirror equivalent?
## Ethical Research Behavior
As students at UVic you agree to abide by several [ethical](https://www.uvic.ca/sexualizedviolence/policy/index.php) [pledges](https://www.uvic.ca/students/academics/academic-integrity/). In addition, if you publish in venues such as Elsevier journals, ACM conferences, or IEEE venues, those organizations have their [own](https://sigchi.org/ethics/) [codes](https://conf.researchr.org/attending/icse-2020/Code+of+Conduct) of conduct and expectations. Some obvious DON'Ts:
## Don'ts
- Don't [fake data](https://retractionwatch.com/2013/06/28/diederik-stapel-settles-with-dutch-prosectors-wont-face-jail-time/).
- Don't plagiarize other people's work, ideas, tools etc.
- Don't violate [blinding and confidentiality](https://www.acm.org/publications/policies/pre-publication-evaluation) (e.g., by sharing unpublished material).
- Don't engage in [poor research practices like p-hacking](http://daniellakens.blogspot.com/2020/09/p-hacking-and-optional-stopping-have.html).
- Don't ignore [Codes of Conduct](https://conf.researchr.org/attending/esem-2023/ESEM+CoC).
- Don't [cite research just to increase your own citation scores](https://retractionwatch.com/2020/06/29/major-indexing-service-sounds-alarm-on-self-citations-by-nearly-50-journals/).
While these shortcuts can be tempting, even a faint hint that you engage in such practices can be destructive to your career (and to those of your collaborators!). It is hard to build a reputation but really easy to destroy it.
## Readings (before class)
* [Car Wars](https://doctorow.medium.com/car-wars-a01718a27e9e) by Cory Doctorow ([audio](https://archive.org/details/CarWars))
* [Does ACM's code of ethics change decision making?](https://dl.acm.org/doi/10.1145/3236024.3264833)
* An episode of Black Mirror or other near-future speculative fiction like Gattaca.
## Optional and Recommended
* [The Tuskegee Study](https://www.mcgill.ca/oss/article/history/40-years-human-experimentation-america-tuskegee-study)
* Casey Fiesler's Twitter feed (and the [post that inspired](https://howwegettonext.com/the-black-mirror-writers-room-teaching-technology-ethics-through-speculation-f1a9e2deccf4) this class)
* The story of Huixiang Chen, specifically [the investigation](https://www.sigarch.org/wp-content/uploads/2021/02/JIC-Public-Announcement-Feb-8-2021.pdf) [into his claims](https://twitter.com/JBalkind/status/1358855826170023936) (trigger warning: this content discusses suicide)
* [RetractionWatch](https://retractionwatch.com)
* [ACM Policies](https://www.acm.org/publications/policies)
[^1]: [What we learned from NeurIPS 2020 reviewing process](https://neuripsconf.medium.com/what-we-learned-from-neurips-2020-reviewing-process-e24549eea38f)