  • Prior Probabilities. Edwin T. Jaynes - 1968 - IEEE Transactions on Systems Science and Cybernetics 4 (3):227-241.
  • The Foundations of Statistics. Leonard Savage - 1954 - Wiley Publications in Statistics.
    Classic analysis of the subject and the development of personal probability; one of the greatest controversies in modern statistical thought.
  • Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory 26 (1):26-37.
  • The Logic of Decision. Richard C. Jeffrey - 1965 - New York, NY, USA: University of Chicago Press.
    "[This book] proposes new foundations for the Bayesian principle of rational action, and goes on to develop a new logic of desirability and probability."—Frederic Schick, _Journal of Philosophy_.
  • The status of the principle of maximum entropy. Abner Shimony - 1985 - Synthese 63 (1):35-53.
  • Theory of Probability: A Critical Introductory Treatment. Bruno de Finetti - 1970 - New York: John Wiley.
  • Information theory and statistical mechanics. Edwin T. Jaynes - 1957 - Physical Review 106:620-630.
  • Information Theory and Statistical Mechanics. II. Edwin T. Jaynes - 1957 - Physical Review 108 (2):171.
  • Probability Theory: The Logic of Science. Edwin T. Jaynes - 2002 - Cambridge: Cambridge University Press. Edited by G. Larry Bretthorst.
  • E. T. Jaynes, Papers on Probability, Statistics and Statistical Physics. [author unknown] - 1985 - Revue d'Histoire des Sciences 38 (2):179-180.
  • Probabilities over rich languages, testing and randomness. Haim Gaifman & Marc Snir - 1982 - Journal of Symbolic Logic 47 (3):495-548.
  • The Foundations of Statistics. Leonard J. Savage - 1954 - Synthese 11 (1):86-89.
  • 'Degree of Confirmation' and Inductive Logic. Hilary Putnam - 1963 - In Paul Arthur Schilpp (ed.), The Philosophy of Rudolf Carnap. La Salle, IL: Open Court. pp. 761-783.
  • Paradoxes of infinity and self-applications, I. Haim Gaifman - 1983 - Erkenntnis 20 (2):131-155.
  • The Foundations of Statistics. Leonard J. Savage - 1956 - Philosophy of Science 23 (2):166.
  • A Mathematical Theory of Communication. Claude Elwood Shannon - 1948 - Bell System Technical Journal 27 (3):379-423.
  • Reasoning with limited resources and assigning probabilities to arithmetical statements. Haim Gaifman - 2004 - Synthese 140 (1-2):97-119.
    There are three sections in this paper. The first is a philosophical discussion of the general problem of reasoning under limited deductive capacity. The second sketches a rigorous way of assigning probabilities to statements in pure arithmetic; motivated by the preceding discussion, it can nonetheless be read separately. The third is a philosophical discussion that highlights the shifting contextual character of subjective probabilities and beliefs.
  • Why I am not an objective Bayesian; some reflections prompted by Rosenkrantz. Teddy Seidenfeld - 1979 - Theory and Decision 11 (4):413-440.
  • A problem for relative information minimizers in probability kinematics. Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  • Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
  • Imprecision and indeterminacy in probability judgment. Isaac Levi - 1985 - Philosophy of Science 52 (3):390-409.
    Bayesians often confuse insistence that probability judgment ought to be indeterminate (which is incompatible with Bayesian ideals) with recognition of the presence of imprecision in the determination or measurement of personal probabilities (which is compatible with these ideals). The confusion is discussed and illustrated by remarks in a recent essay by R. C. Jeffrey.
  • In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting "common sense." This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a (...)
  • Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
  • Update. [author unknown] - 2000 - New Vico Studies 18:149-154.