  • Heuristics and Biases: The Psychology of Intuitive Judgment. Thomas Gilovich, Dale Griffin & Daniel Kahneman (eds.) - 2002 - Cambridge: Cambridge University Press.
    Is our case strong enough to go to trial? Will interest rates go up? Can I trust this person? Such questions - and the judgments required to answer them - are woven into the fabric of everyday experience. This book, first published in 2002, examines how people make such judgments. The study of human judgment was transformed in the 1970s, when Kahneman and Tversky introduced their 'heuristics and biases' approach and challenged the dominance of strictly rational models. Their work highlighted (...)
  • Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Amos Tversky & Daniel Kahneman - 1983 - Psychological Review 90 (4):293-315.
  • Why Most Published Research Findings Are False. John P. A. Ioannidis - 2005 - PLoS Med 2 (8):e124.
    Published research findings are sometimes refuted by subsequent evidence, says Ioannidis, with ensuing confusion and disappointment.
  • Judging Judgment. Bruce Bueno de Mesquita - 2010 - Critical Review: A Journal of Politics and Society 22 (4):355-388.
    Philip E. Tetlock and I agree that forecasting tools are best evaluated in peer-reviewed settings and in comparison not only to expert judgments, but also to alternative modeling strategies. Applying his suggested standards of assessment, however, certain forecasting models not only outperform expert judgments, but also have gone head-to-head with alternative models and outperformed them. This track record demonstrates the capability to make significant, reliable predictions of difficult, complex events. The record has unfolded, contrary to Tetlock's contention, not only in (...)
  • When and why do hedgehogs and foxes differ? Frank C. Keil - 2010 - Critical Review: A Journal of Politics and Society 22 (4):415-426.
    Philip E. Tetlock's finding that "hedgehog" experts are worse predictors than "foxes" offers fertile ground for future research. Are experts as likely to exhibit hedgehog- or fox-like tendencies in areas that call for explanatory, diagnostic, and skill-based expertise, as they did when Tetlock called on experts to make predictions? Do particular domains of expertise curtail or encourage different styles of expertise? Can we trace these different styles to childhood? Finally, can we nudge hedgehogs to be more like foxes? Current research can (...)
  • Tetlock and counterfactuals: Saving methodological ambition from empirical findings. Ian S. Lustick - 2010 - Critical Review: A Journal of Politics and Society 22 (4):427-447.
    In five works spanning a decade, Philip E. Tetlock's interest in counterfactuals has changed. He began with an optimistic desire to make social science more rigorous by identifying best practices in the absence of non-imagined controls for experimentation. Soon, however, he adopted a more pessimistic analysis of the cognitive and psychological barriers facing experts. This shift was brought on by an awareness that experts are not rational Bayesians who continually update their theories to keep up with new information, but instead (...)
  • Social functionalist frameworks for judgment and choice: Intuitive politicians, theologians, and prosecutors. Philip E. Tetlock - 2002 - Psychological Review 109 (3):451-471.
  • The trouble with experts. Paul J. Quirk - 2010 - Critical Review: A Journal of Politics and Society 22 (4):449-465.
    In his justly celebrated Expert Political Judgment, Philip E. Tetlock evaluates the judgment of economic and political experts by rigorously testing their ability to make accurate predictions. He finds that ability profoundly limited, implying that expert judgment is virtually useless, if not worse. He concludes by proposing a project that would seek to improve experts' performance by holding them publicly accountable for their claims. But Tetlock's methods severely underestimate the value of expert opinion. Despite their notorious disagreements, experts have highly (...)
  • Thinking the Unthinkable as a Radical Scientific Project. Steve Fuller - 2010 - Critical Review: A Journal of Politics and Society 22 (4):397-413.
    Philip Tetlock underestimates the import of his own Expert Political Judgment. It is much more than a critical scientific evaluation of the accuracy and consistency of political pundits. It also offers a blueprint for challenging expertise more generally, in the name of scientific advancement. “Thinking the unthinkable”, a strategy Tetlock employs when he gets experts to consider counterfactual scenarios that are far from their epistemic comfort zones, has had explosive consequences historically for both knowledge and morality by extending our sense of what is (...)
  • Have the experts been weighed, measured, and found wanting? Bryan Caplan - 2007 - Critical Review: A Journal of Politics and Society 19 (1):81-91.
    Tetlock's Expert Political Judgment is a creative, careful, and mostly convincing study of the predictive accuracy of political experts. My only major complaints are that Tetlock (1) understates the predictive accuracy of experts, and (2) does too little to discourage demagogues from misinterpreting his work as a vindication of the wisdom of the average citizen. Experts have much to learn from Tetlock's epistemological audit, but there is still ample evidence that, compared to laymen, experts are very good.
  • Putting political experts to the test. Zeljka Buturovic - 2010 - Critical Review: A Journal of Politics and Society 22 (4):389-396.
    In his remarkably meticulous and even-handed 2005 book, Expert Political Judgment, Philip E. Tetlock establishes that the only thing we can count on in the political experts' predictions is that they will underperform, in some cases significantly, the predictions made by mechanical statistical procedures, including random chance. Experts have many uses and Tetlock does not claim that they have no value. However, Tetlock zeroes in on experts' important political role as prognosticators. Tetlock does not attempt the impossible by trying to judge experts on (...)
  • Daniel Kahneman & Shane Frederick - 2002 - Cambridge University Press.