Dissertation, Bowling Green State University (2019)
In this dissertation, I construct scientifically and practically adequate moral analogs of cognitive heuristics and biases. Cognitive heuristics are reasoning “shortcuts” that are efficient but flawed. Such flaws yield systematic judgment errors, i.e., cognitive biases. For example, the availability heuristic estimates an event’s probability from how easily similar events come to mind. Since dramatic events, such as airplane crashes, are disproportionately easy to recall, this heuristic explains systematic overestimations of their probability (availability bias). The research program on cognitive heuristics and biases (e.g., Daniel Kahneman’s work) has been scientifically successful and has yielded useful error-prevention techniques, i.e., cognitive debiasing. I attempt to apply this framework to moral reasoning to yield moral heuristics and biases. For instance, a moral bias of unjustified differences in the treatment of particular animal species might be partially explained by a moral heuristic that dubiously infers animals’ moral status from their aesthetic features. While the basis for identifying judgments as cognitive errors is often unassailable (e.g., violation of the laws of logic), identifying moral errors seemingly requires appealing to moral truth, which, I argue, is problematic within science. Such appeals can be avoided by repackaging moral theories as mere “standards-of-interest” (à la non-normative metrics of purportedly right-making features/properties). However, standards-of-interest do not provide authority, which is needed for effective debiasing. Nevertheless, since each person deems their own subjective morality authoritative, subjective morality (qua standard-of-interest, not qua moral subjectivism) satisfies both scientific and practical concerns. As such, (idealized) subjective morality grounds a moral analog of cognitive biases: subjective moral biases (e.g., committed anti-racists unconsciously discriminating).
I also argue that “cognitive heuristic” is defined by its contrast with rationality. Consequently, heuristics explain biases, which are likewise so defined. However, this contrast with rationality is causally irrelevant to cognition, which frustrates the presumed usefulness of the kind heuristic in causal explanation. In the moral case, then, I jettison the role of causal explanation and tailor categories solely for contrastive explanation. Accordingly, “moral heuristic” is replaced with “subjective moral fallacy,” which is defined by its contrast with subjective morality and explains subjective moral biases. The resulting framework of subjective moral biases and fallacies can undergird future empirical research.