  • Analogs of de Finetti's theorem and interpretative problems of quantum mechanics. R. L. Hudson - 1981 - Foundations of Physics 11 (9-10):805-808.
    It is argued that the characterization of the states of an infinite system of indistinguishable particles satisfying Bose-Einstein statistics which follows from the quantum-mechanical analog of de Finetti's theorem (2) can be used to interpret the nonuniqueness of the resolution into a convex combination of pure states of a quantum-mechanical mixed state.
  • An Introduction to Probability Theory and its Applications, Vol. I. William Feller - 1965 - Wiley.
    · Introduction: The Nature of Probability Theory· The Sample Space· Elements of Combinatorial Analysis· Fluctuations in Coin Tossing and Random Walks· Combination of Events· Conditional Probability· Stochastic Independence· The Binomial and Poisson Distributions· The Normal Approximation to the Binomial Distribution· Unlimited Sequences of Bernoulli Trials· Random Variables· Expectation· Laws of Large Numbers· Integral Valued Variables· Generating Functions· Compound Distributions· Branching Processes· Recurrent Events· Renewal Theory· Random Walk and Ruin Problems· Markov Chains· Algebraic Treatment of Finite Markov Chains· The Simplest Time-Dependent (...)
  • An Essay towards solving a Problem in the Doctrine of Chances. T. Bayes - 1763 - Philosophical Transactions 53:370-418.
  • Finite Exchangeable Sequences. P. Diaconis & D. Freedman - 1980 - The Annals of Probability 8:745-764.
  • La Prévision: Ses Lois Logiques, Ses Sources Subjectives [Foresight: Its Logical Laws, Its Subjective Sources]. Bruno de Finetti - 1937 - Annales de l'Institut Henri Poincaré 7 (1):1-68.
  • Probability, Induction and Statistics: The Art of Guessing. Bruno de Finetti - 1972 - New York: John Wiley.
  • Mathematical Foundations of Information Theory. Aleksandr Yakovlevich Khinchin - 1957 - New York: Dover Publications.
    First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
  • The Axiomatization of Randomness. Michiel van Lambalgen - 1990 - Journal of Symbolic Logic 55 (3):1143-1167.
    We present a faithful axiomatization of von Mises' notion of a random sequence, using an abstract independence relation. A byproduct is a quantifier elimination theorem for Friedman's "almost all" quantifier in terms of this independence relation.
  • The Constraint Rule of the Maximum Entropy Principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference, one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
  • Can the Maximum Entropy Principle Be Explained as a Consistency Requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • On the Concept of a Random Sequence. Alonzo Church - 1940 - Bulletin of the American Mathematical Society 46 (2):130-135.