Being Rational and Being Wrong

Abstract

Do people tend to be overconfident in their opinions? Many think so. They’ve run studies to test whether people are calibrated: whether their confidence in their opinions matches the proportion of those opinions that are true. Under certain conditions, people are systematically “over-calibrated”—for example, of the opinions they’re 80% confident in, only 60% are true. From this observed over-calibration, it’s inferred that people are irrationally overconfident. My question: When—and why—is this inference warranted? Answering this question requires articulating a general connection between being rational and being right—something extant studies have not done. I show how to do so using the notion of deference. This provides a theoretical foundation to calibration research, but also reveals a flaw: the connection between being rational and being right is much weaker than is commonly assumed; as a result, rational people can often be expected to be miscalibrated. Thus we can’t test whether people are overconfident by simply testing whether they are over-calibrated; instead, we must first predict the expected rational deviations from calibration, and then compare those predictions to people’s performance. I show how in principle this can be done—and that doing so has the potential to overturn the standard interpretation of robust empirical effects. In short: rational people can be expected to be wrong more often than you might think.
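To make the abstract's central notion concrete, here is a minimal Python sketch (an illustration of mine, not code from the paper) of the calibration test it describes: simulate an agent whose opinions come true less often than her stated confidence predicts, then bin opinions by confidence level and compare each bin's hit rate to that level, mirroring the abstract's example of 80% confidence with only 60% truth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 10,000 binary opinions, each held at one of four
# stated confidence levels.
confidences = rng.choice([0.6, 0.7, 0.8, 0.9], size=10_000)

# An over-calibrated agent: each opinion is true 20 points less often
# than her confidence says (so 80% confidence yields ~60% truth,
# matching the abstract's example).
truths = rng.random(10_000) < (confidences - 0.2)

# The calibration test: within each confidence bin, compare the
# proportion of true opinions to the stated confidence.
for c in (0.6, 0.7, 0.8, 0.9):
    hit_rate = truths[confidences == c].mean()
    print(f"confidence {c:.0%}: {hit_rate:.1%} true (gap: {c - hit_rate:+.1%})")
```

On the paper's point, observing such gaps is the easy part; the inference from them to irrational overconfidence is what requires the further step of predicting how far rational agents should be expected to deviate from calibration, and only then comparing those predictions to observed performance.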

Author's Profile

Kevin Dorst
Massachusetts Institute of Technology
