  • Bertrand’s Paradox and the Principle of Indifference.Nicholas Shackel - 2023 - Abingdon: Routledge.
    Events between which we have no epistemic reason to discriminate have equal epistemic probabilities. Bertrand’s chord paradox, however, appears to show this to be false, and thereby poses a general threat to probabilities for continuum sized state spaces. Articulating the nature of such spaces involves some deep mathematics and that is perhaps why the recent literature on Bertrand’s Paradox has been almost entirely from mathematicians and physicists, who have often deployed elegant mathematics of considerable sophistication. At the same time, the (...)
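Bertrand’s chord paradox, which this entry and several below address, can be made concrete with a short Monte Carlo sketch (illustrative only, not drawn from any of the cited works): three natural ways of choosing a "random" chord of the unit circle assign different probabilities to the chord being longer than the side of the inscribed equilateral triangle. The function name and method labels are assumptions for this sketch.

```python
import math
import random

def chord_longer_prob(method, trials=100_000, seed=0):
    """Estimate P(chord length > sqrt(3)) on the unit circle under three
    'uniform' chord-selection methods; sqrt(3) is the side length of the
    inscribed equilateral triangle."""
    rng = random.Random(seed)
    threshold = math.sqrt(3)
    hits = 0
    for _ in range(trials):
        if method == "endpoints":
            # pick two endpoints uniformly on the circumference
            a, b = rng.uniform(0, 2 * math.pi), rng.uniform(0, 2 * math.pi)
            length = 2 * math.sin(abs(a - b) / 2)
        elif method == "radial":
            # pick the midpoint's distance from the centre uniformly on a radius
            d = rng.uniform(0, 1)
            length = 2 * math.sqrt(1 - d * d)
        else:  # "midpoint": pick the chord's midpoint uniformly in the disc
            x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
            while x * x + y * y > 1:
                x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
            length = 2 * math.sqrt(1 - (x * x + y * y))
        hits += length > threshold
    return hits / trials
```

The three estimates cluster near 1/3, 1/2 and 1/4 respectively, which is exactly the puzzle: each method seems an equally legitimate reading of "random chord", yet the principle of indifference yields a different answer under each.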
  • Determining Maximal Entropy Functions for Objective Bayesian Inductive Logic.Juergen Landes, Soroush Rafiee Rad & Jon Williamson - 2022 - Journal of Philosophical Logic 52 (2):555-608.
    According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a limit in (...)
  • Objective Bayesian nets for integrating consistent datasets.Jürgen Landes & Jon Williamson - 2022 - Journal of Artificial Intelligence Research 74:393-458.
    This paper addresses a data integration problem: given several mutually consistent datasets each of which measures a subset of the variables of interest, how can one construct a probabilistic model that fits the data and gives reasonable answers to questions which are under-determined by the data? Here we show how to obtain a Bayesian network model which represents the unique probability function that agrees with the probability distributions measured by the datasets and otherwise has maximum entropy. We provide a general (...)
  • Aggregating agents with opinions about different propositions.Richard Pettigrew - 2022 - Synthese 200 (5):1-25.
    There are many reasons we might want to take the opinions of various individuals and pool them to give the opinions of the group they constitute. If all the individuals in the group have probabilistic opinions about the same propositions, there is a host of pooling functions we might deploy, such as linear or geometric pooling. However, there are also cases where different members of the group assign probabilities to different sets of propositions, which might overlap a lot, a little, (...)
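The two pooling functions named in the abstract above, linear and geometric pooling, can be sketched for the simple case the entry starts from, where all agents have credences in the same single proposition. This is a minimal illustration under assumed equal weights; the function names are mine, not Pettigrew's.

```python
import math

def linear_pool(credences, weights=None):
    """Weighted arithmetic mean of the agents' probabilities."""
    n = len(credences)
    weights = weights or [1 / n] * n
    return sum(w * p for w, p in zip(weights, credences))

def geometric_pool(credences, weights=None):
    """Weighted geometric mean, renormalised over {p, not-p} so the
    pooled values for the proposition and its negation sum to 1."""
    n = len(credences)
    weights = weights or [1 / n] * n
    num = math.prod(p ** w for w, p in zip(weights, credences))
    den = num + math.prod((1 - p) ** w for w, p in zip(weights, credences))
    return num / den

# Two agents with credences 0.9 and 0.5 in the same proposition:
# linear pooling gives 0.7; geometric pooling gives 0.75.
```

The harder problem the paper takes up begins where this sketch stops: when agents assign probabilities to different, only partially overlapping sets of propositions, neither formula applies directly.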
  • Epistemic Risk and the Demands of Rationality.Richard Pettigrew - 2022 - Oxford, UK: Oxford University Press.
    How much does rationality constrain what we should believe on the basis of our evidence? According to this book, not very much. For most people and most bodies of evidence, there is a wide range of beliefs that rationality permits them to have in response to that evidence. The argument, which takes inspiration from William James' ideas in 'The Will to Believe', proceeds from two premises. The first is a theory about the basis of epistemic rationality. It's called epistemic utility (...)
  • Combining probabilistic logic programming with the power of maximum entropy.Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
  • Bertrand's Paradox and the Maximum Entropy Principle.Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique (...)
  • On probabilistic inference in relational conditional logics.M. Thimm & G. Kern-Isberner - 2012 - Logic Journal of the IGPL 20 (5):872-908.
  • Defeasible Conditionalization.Paul D. Thorn - 2014 - Journal of Philosophical Logic 43 (2-3):283-302.
    The applicability of Bayesian conditionalization in setting one’s posterior probability for a proposition, α, is limited to cases where the value of a corresponding prior probability, P_PRI(α|∧E), is available, where ∧E represents one’s complete body of evidence. In order to extend probability updating to cases where the prior probabilities needed for Bayesian conditionalization are unavailable, I introduce an inference schema, defeasible conditionalization, which allows one to update one’s personal probability in a proposition by conditioning on a proposition that represents a (...)
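The standard Bayesian conditionalization that the abstract above generalises can be sketched concretely: given a prior over a finite set of worlds, conditioning on evidence E zeroes out the worlds outside E and renormalises the rest, so the posterior is P(w)/P(E) inside E. The representation of worlds and the function name here are illustrative assumptions.

```python
def conditionalize(joint, evidence):
    """Bayesian conditionalization over a finite set of worlds.

    `joint` maps worlds to prior probabilities; `evidence` is the set of
    worlds compatible with the evidence. Worlds outside the evidence get
    probability 0; the rest are rescaled by 1 / P(evidence)."""
    mass = sum(p for w, p in joint.items() if w in evidence)
    if mass == 0:
        # exactly the limitation the abstract points to: conditionalization
        # is undefined when the evidence has prior probability zero
        raise ValueError("evidence has prior probability zero")
    return {w: (p / mass if w in evidence else 0.0) for w, p in joint.items()}

# Worlds as (rain, sprinkler) pairs; conditioning on "rain" rescales the
# two rain-worlds by 1 / 0.4.
prior = {("r", "s"): 0.1, ("r", "ns"): 0.3, ("nr", "s"): 0.2, ("nr", "ns"): 0.4}
posterior = conditionalize(prior, {("r", "s"), ("r", "ns")})
```

Thorn's schema is aimed precisely at the case the `ValueError` branch flags and its relatives, where the needed prior is unavailable rather than merely zero.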
  • (1 other version)Interpretations of probability.Alan Hájek - 2007 - Stanford Encyclopedia of Philosophy.
  • Formal Epistemology Meets Mechanism Design.Jürgen Landes - 2023 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 54 (2):215-231.
    This article connects recent work in formal epistemology to work in economics and computer science. Analysing the Dutch Book Arguments, Epistemic Utility Theory and Objective Bayesian Epistemology we discover that formal epistemologists employ the same argument structure as economists and computer scientists. Since similar approaches often have similar problems and have shared solutions, opportunities for cross-fertilisation abound.
  • Tracking probabilistic truths: a logic for statistical learning.Alexandru Baltag, Soroush Rafiee Rad & Sonja Smets - 2021 - Synthese 199 (3-4):9041-9087.
    We propose a new model for forming and revising beliefs about unknown probabilities. To go beyond what is known with certainty and represent the agent’s beliefs about probability, we consider a plausibility map, associating to each possible distribution a plausibility ranking. Beliefs are defined as in Belief Revision Theory, in terms of truth in the most plausible worlds. We consider two forms of conditioning or belief update, corresponding to the acquisition of two types of information: learning observable evidence obtained by (...)
  • Reward versus risk in uncertain inference: Theorems and simulations.Gerhard Schurz & Paul D. Thorn - 2012 - Review of Symbolic Logic 5 (4):574-612.
    Systems of logico-probabilistic reasoning characterize inference from conditional assertions that express high conditional probabilities. In this paper we investigate four prominent LP systems, the systems O, P, Z, and QC. These systems differ in the number of inferences they license. LP systems that license more inferences enjoy the possible reward of deriving more true and informative conclusions, but with this possible reward comes the risk of drawing more false or uninformative conclusions. In the first part of the paper, we (...)
  • Causal Discovery and the Problem of Ignorance. An Adaptive Logic Approach.Bert Leuridan - 2009 - Journal of Applied Logic 7 (2):188-205.
    In this paper, I want to substantiate three related claims regarding causal discovery from non-experimental data. Firstly, in scientific practice, the problem of ignorance is ubiquitous, persistent, and far-reaching. Intuitively, the problem of ignorance bears upon the following situation. A set of random variables V is studied but only partly tested for (conditional) independencies; i.e. for some variables A and B it is not known whether they are (conditionally) independent. Secondly, Judea Pearl’s most meritorious and influential algorithm for causal discovery (...)
  • Common sense and maximum entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
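The Maximum Entropy Inference Process discussed in this and several neighbouring entries is easy to exhibit on a finite domain. A standard worked case (in the spirit of Jaynes' dice problem, not taken from Paris's paper) constrains a distribution over the faces of a die to have a given mean; the maximum-entropy solution has the exponential form p_i ∝ exp(λ·x_i), with λ fixed by the constraint. The bisection search below is one simple way to find λ; names are illustrative.

```python
import math

def maxent_given_mean(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution on a finite set of values subject to a
    fixed expected value. The solution is p_i proportional to exp(lam * x_i);
    since the resulting mean is increasing in lam, bisection finds it."""
    def mean_for(lam):
        ws = [math.exp(lam * x) for x in values]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, values)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    ws = [math.exp(lam * x) for x in values]
    z = sum(ws)
    return [w / z for w in ws]

# Die faces 1..6 with mean constrained to 4.5: the maxent distribution is
# tilted toward the high faces; with mean 3.5 (no real constraint) it is
# the uniform distribution, as the principle of indifference would give.
```

The common-sense principles Paris discusses single out exactly this inference process among all ways of picking a distribution consistent with such constraints.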
  • An overview of algorithmic approaches to compute optimum entropy distributions in the expert system shell MECore.Nico Potyka, Engelbert Mittermeier & David Marenke - 2016 - Journal of Applied Logic 19:71-86.
  • Bayesianism and language change.Jon Williamson - 2003 - Journal of Logic, Language and Information 12 (1):53-97.
    Bayesian probability is normally defined over a fixed language or event space. But in practice language is susceptible to change, and the question naturally arises as to how Bayesian degrees of belief should change as language changes. I argue here that this question poses a serious challenge to Bayesianism. The Bayesian may be able to meet this challenge however, and I outline a practical method for changing degrees of belief over changes in finite propositional languages.
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
  • Rationality As Conformity.Hykel Hosni & Jeff Paris - 2005 - Synthese 144 (2):249-285.
    We argue in favour of identifying one aspect of rational choice with the tendency to conform to the choice you expect another like-minded, but non-communicating, agent to make and study this idea in the very basic case where the choice is from a non-empty subset K of 2^A and no further structure or knowledge of A is assumed.
  • The Entropy-Limit (Conjecture) for Σ₂-Premisses.Jürgen Landes - 2020 - Studia Logica 109 (2):1-20.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While the conjecture is (...)
  • Invariant Equivocation.Jürgen Landes & George Masterton - 2017 - Erkenntnis 82 (1):141-167.
    Objective Bayesians hold that degrees of belief ought to be chosen in the set of probability functions calibrated with one’s evidence. The particular choice of degrees of belief is via some objective, i.e., not agent-dependent, inference process that, in general, selects the most equivocal probabilities from among those compatible with one’s evidence. Maximising entropy is what drives these inference processes in recent works by Williamson and Masterton though they disagree as to what should have its entropy maximised. With regard to (...)