Citations
  • (1 other version)An Introduction to Probability Theory and Its Applications, Volume 1.William Feller - 1968 - J. Wiley & Sons: New York.
    The nature of probability theory. The sample space. Elements of combinatorial analysis. Fluctuations in coin tossing and random walks. Combination of events. Conditional probability, stochastic independence. The binomial and the Poisson distributions. The Normal approximation to the binomial distribution. Unlimited sequences of Bernoulli trials. Random variables, expectation. Laws of large numbers. Integral valued variables, generating functions. Compound distributions. Branching processes. Recurrent events. Renewal theory. Random walk and ruin problems. Markov chains. Algebraic treatment of finite Markov chains. The simplest time-dependent stochastic (...)
  • A Mathematical Theory of Communication.Claude Elwood Shannon - 1948 - Bell System Technical Journal 27 (3):379–423.
  • The Well-Posed Problem.Edwin T. Jaynes - 1973 - Foundations of Physics 3 (4):477-493.
    Many statistical problems, including some of the most important for physical applications, have long been regarded as underdetermined from the standpoint of a strict frequency definition of probability; yet they may appear wellposed or even overdetermined by the principles of maximum entropy and transformation groups. Furthermore, the distributions found by these methods turn out to have a definite frequency correspondence; the distribution obtained by invariance under a transformation group is by far the most likely to be observed experimentally, in the (...)
  • Boltzmann's Approach to Statistical Mechanics.Sheldon Goldstein - unknown
    In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann’s analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann’s later work on the subject have little merit. Most twentieth century innovations – such as the identification of the state of a physical system with a probability distribution on its phase space, (...)
  • A new resolution of the Judy Benjamin Problem.Igor Douven & Jan-Willem Romeijn - 2011 - Mind 120 (479):637 - 670.
    A paper on how to adapt your probabilistic beliefs when learning a conditional.
  • A problem for relative information minimizers in probability kinematics.Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  • (2 other versions)Theory of Probability.B. O. Koopman - 1943 - Journal of Symbolic Logic 8 (1):34-35.
  • (1 other version)Information Theory and Statistical Mechanics. II.Edwin T. Jaynes - 1957 - Physical Review 108 (2):171.
  • (1 other version)A treatise on probability.J. Keynes - 1924 - Revue de Métaphysique et de Morale 31 (1):11-12.
  • (1 other version)A Treatise on Probability.J. M. Keynes - 1989 - British Journal for the Philosophy of Science 40 (2):219-222.
  • (1 other version)Prior Probabilities.Edwin T. Jaynes - 1968 - IEEE Transactions on Systems Science and Cybernetics (3):227-241.
  • Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy.J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory:26-37.
  • Information theory and statistical mechanics.Edwin T. Jaynes - 1957 - Physical Review 106:620–630.
  • A Treatise on Probability. [REVIEW]Harry T. Costello - 1923 - Journal of Philosophy 20 (11):301-306.
  • E. T. Jaynes, Papers On Probability, Statistics And Statistical Physics.[author unknown] - 1985 - Revue d'Histoire des Sciences 38 (2):179-180.
  • Theory of Probability.Harold Jeffreys - 1940 - Philosophy of Science 7 (2):263-264.
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
  • Why I am not an objective Bayesian; some reflections prompted by Rosenkrantz.Teddy Seidenfeld - 1979 - Theory and Decision 11 (4):413-440.
  • Entropy and uncertainty.Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
  • A Problem for Relative Information Minimizers, Continued.Bas van Fraassen - 1986 - British Journal for the Philosophy of Science 37 (4):453-463.
  • On Indeterminate Updating of Credences.Leendert Huisman - 2014 - Philosophy of Science 81 (4):537-557.
    The strategy of updating credences by minimizing the relative entropy has been questioned by many authors, most strongly by means of the Judy Benjamin puzzle. I present a new analysis of Judy Benjamin–like forms of new information and defend the thesis that in general the rational posterior is indeterminate, meaning that a family of posterior credence functions rather than a single one is the rational response when that type of information becomes available. The proposed thesis extends naturally to all cases (...)
  • On the a priori and a posteriori assessment of probabilities.Anubav Vasudevan - 2013 - Journal of Applied Logic 11 (4):440-451.
    We argue that in spite of their apparent dissimilarity, the methodologies employed in the a priori and a posteriori assessment of probabilities can both be justified by appeal to a single principle of inductive reasoning, viz., the principle of symmetry. The difference between these two methodologies consists in the way in which information about the single-trial probabilities in a repeatable chance process is extracted from the constraints imposed by this principle. In the case of a posteriori reasoning, these constraints inform (...)