Citations
  • The Entropy-Limit (Conjecture) for $\Sigma_2$-Premisses. Jürgen Landes - 2020 - Studia Logica 109 (2):1-20.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While the conjecture is (...)
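    To make the finite-domain case concrete, here is a minimal sketch (my own construction, not taken from the paper) that maximises Shannon entropy over a four-state space subject to a single hypothetical expectation constraint, using scipy:

```python
# Maximum entropy on a finite domain: four states with values 1..4,
# one hypothetical constraint E[X] = 3. The optimiser recovers the
# exponential-form distribution that maxent theory predicts.
import numpy as np
from scipy.optimize import minimize

states = np.array([1.0, 2.0, 3.0, 4.0])

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)  # guard against log(0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},          # normalisation
    {"type": "eq", "fun": lambda p: np.dot(p, states) - 3.0},  # E[X] = 3
]
p0 = np.full(4, 0.25)  # start from the maximally equivocal distribution
result = minimize(neg_entropy, p0, bounds=[(0, 1)] * 4,
                  constraints=constraints)
print(np.round(result.x, 4))  # the maximum entropy distribution
```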
  • Pure inductive logic with functions. Elizabeth Howarth & Jeffrey B. Paris - 2019 - Journal of Symbolic Logic 84 (4):1382-1402.
    We consider the version of Pure Inductive Logic which obtains for the language with equality and a single unary function symbol, giving a complete characterization of the probability functions on this language which satisfy Constant Exchangeability.
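    For reference, the Constant Exchangeability principle the characterization turns on is standardly stated as invariance of the probability function under permutations of the constant symbols:

```latex
% Constant Exchangeability (Ex): for any sentence \theta(a_1,\dots,a_n)
% and any permutation \sigma of the constant symbols,
w\bigl(\theta(a_1,\dots,a_n)\bigr)
  = w\bigl(\theta(a_{\sigma(1)},\dots,a_{\sigma(n)})\bigr)
```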
  • Six Problems in Pure Inductive Logic. J. B. Paris & A. Vencovská - 2019 - Journal of Philosophical Logic 48 (4):731-747.
    We present six significant open problems in Pure Inductive Logic, together with their background and current status, with the intention of raising awareness and leading ultimately to their resolution.
  • Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. J. Pearl, F. Bacchus, P. Spirtes, C. Glymour & R. Scheines - 1988 - Synthese 104 (1):161-176.
  • Equivocation Axiom on First Order Languages. Soroush Rafiee Rad - 2017 - Studia Logica 105 (1):121-152.
    In this paper we investigate some mathematical consequences of the Equivocation Principle, and the Maximum Entropy models arising from that, for first order languages. We study the existence of Maximum Entropy models for these theories in terms of the quantifier complexity of the theory and will investigate some invariance and structural properties of such models.
  • Information Theory and Statistical Mechanics. II. Edwin T. Jaynes - 1957 - Physical Review 108 (2):171.
  • Probabilistic Logics and Probabilistic Networks. Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler & Jon Williamson - 2010 - Dordrecht, Netherlands: Synthese Library.
    Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
  • Bayesian conditionalisation and the principle of minimum information. P. M. Williams - 1980 - British Journal for the Philosophy of Science 31 (2):131-144.
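    As a pointer for readers: the minimum information principle selects, among the probability functions satisfying a new constraint set C, the one closest to the prior P in relative entropy; conditionalisation drops out as the special case in which C fixes an event at probability 1 (my notation, not the paper's):

```latex
Q^{*} \;=\; \operatorname*{arg\,min}_{Q \in C} \; \sum_{i} Q(i)\,\log\frac{Q(i)}{P(i)}
% If C = \{Q : Q(A) = 1\}, then Q^{*}(\cdot) = P(\cdot \mid A),
% i.e. ordinary Bayesian conditionalisation.
```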
  • Probabilistic characterisation of models of first-order theories. Soroush Rafiee Rad - 2021 - Annals of Pure and Applied Logic 172 (1):102875.
    We study the probabilistic characterisation of a random model of a finite set of first-order axioms. (...)
  • Finite additivity, another lottery paradox and conditionalisation. Colin Howson - 2014 - Synthese 191 (5):1-24.
    In this paper I argue that de Finetti provided compelling reasons for rejecting countable additivity. It is ironical therefore that the main argument advanced by Bayesians against following his recommendation is based on the consistency criterion, coherence, he himself developed. I will show that this argument is mistaken. Nevertheless, there remain some counter-intuitive consequences of rejecting countable additivity, and one in particular has all the appearances of a full-blown paradox. I will end by arguing that in fact it is no (...)
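    The lottery at issue can be stated in two lines. If a ticket is drawn "at random" from countably many tickets, so that each ticket n receives the same probability c, countable additivity forces

```latex
1 \;=\; P\Bigl(\bigcup_{n=1}^{\infty}\{n\}\Bigr)
  \;=\; \sum_{n=1}^{\infty} P(\{n\})
  \;=\; \sum_{n=1}^{\infty} c ,
```

    which is impossible: the sum is 0 if c = 0 and diverges if c > 0. Merely finitely additive probabilities, which de Finetti favoured, allow c = 0 without contradiction.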
  • A Mathematical Theory of Communication. Claude Elwood Shannon - 1948 - Bell System Technical Journal 27 (3):379–423.
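    The quantity underlying most of the entries above is Shannon's entropy; a minimal sketch:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

print(entropy([0.25] * 4))            # 2.0 bits: uniform maximises H
print(entropy([0.7, 0.1, 0.1, 0.1]))  # lower: a more committal distribution
```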
  • Justifying Objective Bayesianism on Predicate Languages. Jürgen Landes & Jon Williamson - 2015 - Entropy 17 (4):2459-2543.
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
  • Towards the entropy-limit conjecture. Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of the sublanguage (...)
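    A toy illustration of the first strategy (my own construction, not taken from the paper): read the premiss "all individuals have property U, with probability 0.5" on the n-constant sublanguage as P(Ua_1 & ... & Ua_n) = 0.5. Maximum entropy then puts 0.5 on the all-U state description and spreads the remaining mass uniformly over the other 2^n - 1 state descriptions, and the probability of Ua_1 converges pointwise as n grows:

```python
# Entropy-limit strategy in miniature: track P(U a_1) on ever larger
# finite sublanguages under the premiss P(U a_1 & ... & U a_n) = 0.5.
for n in [1, 2, 3, 5, 10, 20]:
    total = 2 ** n                     # state descriptions of L_n
    other = 0.5 / (total - 1)          # mass on each non-all-U state
    satisfying = 2 ** (n - 1)          # states in which U a_1 holds
    p_ua1 = 0.5 + (satisfying - 1) * other
    print(n, round(p_ua1, 4))          # tends to 0.75 in the limit
```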
  • What does a conditional knowledge base entail? Daniel Lehmann & Menachem Magidor - 1992 - Artificial Intelligence 55 (1):1-60.
  • The Continuum of Inductive Methods. Rudolf Carnap - 1953 - Philosophy 28 (106):272-273.
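    For orientation, Carnap's continuum is the one-parameter family of inference methods c_λ which, for a language whose predicates partition individuals into t cells, assigns

```latex
c_{\lambda}\bigl(\text{individual } n{+}1 \text{ falls in cell } j \mid e\bigr)
  \;=\; \frac{n_j + \lambda/t}{\,n + \lambda\,},
```

    where e reports a sample of n individuals of which n_j fell in cell j; λ → 0 recovers the straight rule n_j/n, while λ → ∞ ignores the sample entirely.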
  • Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
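    Result 1 is easy to see in miniature (a sketch under my own toy setup, not Seidenfeld's notation): with a uniform prior over four states and the 0-1 constraint P({0,1}) = 1, MAXENT returns exactly the Bayesian posterior obtained by conditioning on the event:

```python
import numpy as np
from scipy.optimize import minimize

prior = np.full(4, 0.25)            # uniform prior over states 0..3
A = np.array([1.0, 1.0, 0.0, 0.0])  # indicator of the event {0, 1}

# Bayesian route: condition the uniform prior on the event A.
posterior = prior * A / np.dot(prior, A)

# MAXENT route: maximise entropy subject to the 0-1 constraint P(A) = 1.
def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

cons = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
        {"type": "eq", "fun": lambda p: np.dot(p, A) - 1.0}]
maxent = minimize(neg_entropy, np.full(4, 0.25),
                  bounds=[(0, 1)] * 4, constraints=cons).x

print(posterior)            # [0.5 0.5 0.  0. ]
print(np.round(maxent, 3))  # agrees with the conditioned posterior
```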
  • Objective Bayesian nets for integrating consistent datasets. Jürgen Landes & Jon Williamson - 2022 - Journal of Artificial Intelligence Research 74:393-458.
    This paper addresses a data integration problem: given several mutually consistent datasets each of which measures a subset of the variables of interest, how can one construct a probabilistic model that fits the data and gives reasonable answers to questions which are under-determined by the data? Here we show how to obtain a Bayesian network model which represents the unique probability function that agrees with the probability distributions measured by the datasets and otherwise has maximum entropy. We provide a general (...)
  • In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a (...)
  • Objective Bayesian Nets from Consistent Datasets. Jürgen Landes & Jon Williamson - unknown
    This paper addresses the problem of finding a Bayesian net representation of the probability function that agrees with the distributions of multiple consistent datasets and otherwise has maximum entropy. We give a general algorithm which is significantly more efficient than the standard brute-force approach. Furthermore, we show that in a wide range of cases such a Bayesian net can be obtained without solving any optimisation problem.
  • Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
  • The Uncertain Reasoner’s Companion. [REVIEW] J. B. Paris - 1997 - Erkenntnis 46 (3):397-400.
  • Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.