Citations
  • Model Theory. Michael Makkai, C. C. Chang & H. J. Keisler - 1991 - Journal of Symbolic Logic 56 (3):1096.
  • Justifying Objective Bayesianism on Predicate Languages. Jürgen Landes & Jon Williamson - 2015 - Entropy 17 (4):2459-2543.
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
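The "maximally equivocal" probability functions discussed in this abstract can be illustrated with a toy calculation (the possible worlds and the evidence constraint below are invented for illustration, not taken from the paper): when evidence fixes a lower bound on one world's probability, the maximum entropy function meets the bound exactly and spreads the remaining mass uniformly over the other worlds.

```python
import math

def entropy(p):
    # Shannon entropy of a probability distribution (natural log).
    return -sum(x * math.log(x) for x in p if x > 0)

# Hypothetical evidence: P(w1) >= 0.6 over four possible worlds.
# The maximally equivocal function satisfying it sets P(w1) = 0.6
# and equivocates (divides the remaining 0.4 evenly) over w2-w4.
maxent = [0.6, 0.4 / 3, 0.4 / 3, 0.4 / 3]

# With no evidence at all, maximum entropy gives the uniform distribution.
uniform = [0.25] * 4

# Any other function satisfying the evidence is strictly less equivocal:
alternative = [0.7, 0.1, 0.1, 0.1]
```

Here `entropy(uniform)` exceeds `entropy(maxent)`, which in turn exceeds `entropy(alternative)`: the constraint costs some equivocation, but no compliant function does better than `maxent`.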
  • On Inductive Logic. Rudolf Carnap - 1945 - Philosophy of Science 12 (2):72-97.
    Among the various meanings in which the word ‘probability’ is used in everyday language, in the discussion of scientists, and in the theories of probability, there are especially two which must be clearly distinguished. We shall use for them the terms ‘probability₁’ and ‘probability₂’. Probability₁ is a logical concept, a certain logical relation between two sentences; it is the same as the concept of degree of confirmation. I shall write briefly “c” for “degree of confirmation,” and “c” for “the (...)
  • Equivocation Axiom on First Order Languages. Soroush Rafiee Rad - 2017 - Studia Logica 105 (1):121-152.
    In this paper we investigate some mathematical consequences of the Equivocation Principle, and the Maximum Entropy models arising from that, for first order languages. We study the existence of Maximum Entropy models for these theories in terms of the quantifier complexity of the theory and will investigate some invariance and structural properties of such models.
  • Bayesian Conditionalisation and the Principle of Minimum Information. P. M. Williams - 1980 - British Journal for the Philosophy of Science 31 (2):131-144.
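Williams's paper connects Bayesian conditionalisation to the minimum information principle: among all distributions compatible with the new evidence, the conditionalised posterior minimises the Kullback-Leibler divergence from the prior. A small numerical check of this relationship (the prior and the evidence are invented for illustration):

```python
import math

def kl(q, p):
    # Kullback-Leibler divergence D(q || p), natural log.
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

prior = [0.5, 0.3, 0.1, 0.1]

# Hypothetical evidence: world w4 is ruled out, i.e. the posterior must
# assign probability 1 to the event E = {w1, w2, w3}.
z = sum(prior[:3])
conditionalised = [x / z for x in prior[:3]] + [0.0]

# Some other distribution that also satisfies the evidence:
alternative = [0.4, 0.4, 0.2, 0.0]
```

`kl(conditionalised, prior)` comes out smaller than `kl(alternative, prior)`: conditionalisation is the minimum-information update among all evidence-compatible posteriors.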
  • Invariant Equivocation. Jürgen Landes & George Masterton - 2017 - Erkenntnis 82 (1):141-167.
    Objective Bayesians hold that degrees of belief ought to be chosen in the set of probability functions calibrated with one’s evidence. The particular choice of degrees of belief is via some objective, i.e., not agent-dependent, inference process that, in general, selects the most equivocal probabilities from among those compatible with one’s evidence. Maximising entropy is what drives these inference processes in recent works by Williamson and Masterton, though they disagree as to what should have its entropy maximised. With regard to (...)
  • Asymptotic Conditional Probabilities: The Non-Unary Case. Adam J. Grove, Joseph Y. Halpern & Daphne Koller - 1996 - Journal of Symbolic Logic 61 (1):250-276.
    Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order sentences. Given first-order sentences φ and θ, we consider the structures with domain {1,..., N} that satisfy θ, and compute the fraction of them in which φ is true. We then consider what happens to this fraction as N gets large. This extends the work on 0-1 laws that considers the limiting probability of first-order sentences, by considering asymptotic conditional (...)
  • From Statistical Knowledge Bases to Degrees of Belief. Fahiem Bacchus, Adam J. Grove, Joseph Y. Halpern & Daphne Koller - 1996 - Artificial Intelligence 87 (1-2):75-143.
  • Probabilistic Logic and Probabilistic Networks. R. Haenni, J.-W. Romeijn, G. Wheeler & J. Williamson - unknown
    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches to probabilistic logic fit into a simple unifying framework: logically complex evidence can be used to associate probability intervals or probabilities with sentences.
  • The Authors. [author unknown] - 1973 - Proceedings of the Heraclitean Society 1 (1).
  • Characterizing the Principle of Minimum Cross-Entropy Within a Conditional-Logical Framework. Gabriele Kern-Isberner - 1998 - Artificial Intelligence 98 (1-2):169-208.
  • Objective Bayesian Nets from Consistent Datasets. Jürgen Landes & Jon Williamson - unknown
    This paper addresses the problem of finding a Bayesian net representation of the probability function that agrees with the distributions of multiple consistent datasets and otherwise has maximum entropy. We give a general algorithm which is significantly more efficient than the standard brute-force approach. Furthermore, we show that in a wide range of cases such a Bayesian net can be obtained without solving any optimisation problem.
  • Combining Probabilistic Logic Programming with the Power of Maximum Entropy. Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
  • Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
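The limiting-case treatment described in this abstract can be sketched for the simplest situation (a hypothetical unary predicate Q with no constraining evidence): on a domain of size n, maximum entropy makes the instances Q(a_1), ..., Q(a_n) independent with probability 1/2 each, so the probability of the universal sentence ∀x Q(x) decays to 0 as the domain grows.

```python
def prob_forall(n):
    # Under the unconstrained maximum entropy distribution on a domain of
    # size n, each atomic sentence Q(a_i) independently gets probability
    # 1/2, so P(forall x Q(x)) = (1/2) ** n.
    return 0.5 ** n

# The limiting probability of the quantified sentence as n grows:
probs = [prob_forall(n) for n in (1, 5, 10, 50)]
```

The sequence is strictly decreasing towards 0, which is why non-trivial limiting probabilities for quantified sentences require evidence (constraints) in the finite-domain approximations.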
  • On the Applicability of the ‘Number of Possible States’ Argument in Multi-Expert Reasoning. Martin Adamčík - 2016 - Journal of Applied Logic 19:20-49.