References
  • Objective Bayesianism and the maximum entropy principle. Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism (...)
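    See the sketch after this list for a minimal formal statement of the principle.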
  • In Defence of Objective Bayesianism. Jon Williamson - 2010 - Oxford University Press.
    Objective Bayesianism is a methodological theory that is currently applied in statistics, philosophy, artificial intelligence, physics and other sciences. This book develops the formal and philosophical foundations of the theory, at a level accessible to a graduate student with some familiarity with mathematical notation.
  • In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a (...)
  • From Bayesian epistemology to inductive logic. Jon Williamson - 2013 - Journal of Applied Logic 11 (4):468-486.
    Inductive logic admits a variety of semantics (Haenni et al., 2011, Part 1). This paper develops semantics based on the norms of Bayesian epistemology (Williamson, 2010, Chapter 7). §1 introduces the semantics and then, in §2, the paper explores methods for drawing inferences in the resulting logic and compares the methods of this paper with the methods of Barnett and Paris (2008). §3 then evaluates this Bayesian inductive logic in the light of four traditional critiques of inductive logic, arguing (i) (...)
  • Objective Bayesianism with predicate languages. Jon Williamson - 2008 - Synthese 163 (3):341-356.
    Objective Bayesian probability is often defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap’s continuum of inductive methods.
  • The two concepts of probability: The problem of probability. Rudolf Carnap - 1945 - Philosophy and Phenomenological Research 5 (4):513-532.
  • Some Aspects of Polyadic Inductive Logic. Jürgen Landes, Jeff Paris & Alena Vencovská - 2008 - Studia Logica 90 (1):3-16.
    We give a brief account of some de Finetti style representation theorems for probability functions satisfying Spectrum Exchangeability in Polyadic Inductive Logic, together with applications to Non-splitting, Language Invariance, extensions with Equality and Instantial Relevance.
  • Lectures on Inductive Logic. Jon Williamson - 2017 - Oxford, England: Oxford University Press.
    Logic is a field studied mainly by researchers and students of philosophy, mathematics and computing. Inductive logic seeks to determine the extent to which the premises of an argument entail its conclusion, aiming to provide a theory of how one should reason in the face of uncertainty. It has applications to decision making and artificial intelligence, as well as how scientists should reason when not in possession of the full facts. In this work, Jon Williamson embarks on a quest to (...)
  • Justifying Objective Bayesianism on Predicate Languages. Jürgen Landes & Jon Williamson - 2015 - Entropy 17 (4):2459-2543.
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
  • Probability Theory: The Logic of Science. Edwin T. Jaynes - 2002 - Cambridge: Cambridge University Press. Edited by G. Larry Bretthorst.
  • A survey of some recent results on Spectrum Exchangeability in Polyadic Inductive Logic. J. Landes, J. B. Paris & A. Vencovská - 2011 - Synthese 181 (S1):19-47.
    We give a unified account of some results in the development of Polyadic Inductive Logic in the last decade with particular reference to the Principle of Spectrum Exchangeability, its consequences for Instantial Relevance, Language Invariance and Johnson's Sufficientness Principle, and the corresponding de Finetti style representation theorems.
  • Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
  • Equivocation Axiom on First Order Languages. Soroush Rafiee Rad - 2017 - Studia Logica 105 (1):121-152.
    In this paper we investigate some mathematical consequences of the Equivocation Principle, and the Maximum Entropy models arising from that, for first order languages. We study the existence of Maximum Entropy models for these theories in terms of the quantifier complexity of the theory and will investigate some invariance and structural properties of such models.
  • Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
  • Towards the entropy-limit conjecture. Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of the sublanguage (...)
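Several of the entries above, in particular Landes & Williamson (2013), turn on the maximum entropy principle described in the first abstract. The following is a minimal formal sketch of that principle for a finite domain; the notation (Ω for the set of basic states, 𝔼 for the set of probability functions calibrated to the evidence, P† for the recommended belief function) is illustrative and not taken from the cited works.

```latex
% Minimal sketch of the maximum entropy principle on a finite domain.
% Omega, E and P^dagger are illustrative notation, not drawn from the cited works.
\[
  H(P) \;=\; -\sum_{\omega \in \Omega} P(\omega)\,\log P(\omega),
  \qquad
  P^{\dagger} \;\in\; \arg\max_{P \in \mathbb{E}} H(P),
\]
% where Omega is the finite set of basic states, E is the set of probability
% functions satisfying the evidential (calibration) constraints, and P^dagger
% is the belief function the principle recommends.
```

On a first-order predicate language, as in several of the entries above, the principle is standardly applied to finite sublanguages and the limit taken as the sublanguage grows; whether the available strategies for doing so agree is the question addressed in the entropy-limit conjecture entry.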