  • On the Ecological and Internal Rationality of Bayesian Conditionalization and Other Belief Updating Strategies. Olav Benjamin Vassend - forthcoming - British Journal for the Philosophy of Science.
  • Probability and time. Marco Zaffalon & Enrique Miranda - 2013 - Artificial Intelligence 198 (C):1-51.
  • Learning as Hypothesis Testing: Learning Conditional and Probabilistic Information. Jonathan Vandenburgh - manuscript
    Complex constraints like conditionals ('If A, then B') and probabilistic constraints ('The probability that A is p') pose problems for Bayesian theories of learning. Since these propositions do not express constraints on outcomes, agents cannot simply conditionalize on the new information. Furthermore, a natural extension of conditionalization, relative information minimization, leads to many counterintuitive predictions, evidenced by the sundowners problem and the Judy Benjamin problem. Building on the notion of a `paradigm shift' and empirical research in psychology and economics, I (...)
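    For orientation, the rule the abstract criticizes, relative information minimization, is standardly formulated as picking the posterior that satisfies the learned constraint while staying as close as possible to the prior in Kullback-Leibler divergence. The following is the generic textbook statement, not notation drawn from the paper:

    \[
      q^{*} \;=\; \mathop{\mathrm{arg\,min}}_{q \in C} \; \sum_{i} q_i \log \frac{q_i}{p_i},
    \]

    where p is the prior and C is the set of distributions satisfying the new constraint (for instance, all q with a fixed value of q(B | A)).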
  • The Value of Biased Information. Nilanjan Das - 2023 - British Journal for the Philosophy of Science 74 (1):25-55.
    In this article, I cast doubt on an apparent truism, namely, that if evidence is available for gathering and use at a negligible cost, then it’s always instrumentally rational for us to gather that evidence and use it for making decisions. Call this ‘value of information’ (VOI). I show that VOI conflicts with two other plausible theses. The first is the view that an agent’s evidence can entail non-trivial propositions about the external world. The second is the view that epistemic (...)
  • The Principal Principle and subjective Bayesianism. Christian Wallmann & Jon Williamson - 2019 - European Journal for Philosophy of Science 10 (1):1-14.
    This paper poses a problem for Lewis’ Principal Principle in a subjective Bayesian framework: we show that, where chances inform degrees of belief, subjective Bayesianism fails to validate normal informal standards of what is reasonable. This problem points to a tension between the Principal Principle and the claim that conditional degrees of belief are conditional probabilities. However, one version of objective Bayesianism has a straightforward resolution to this problem, because it avoids this latter claim. The problem, then, offers some support (...)
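    As background for the abstract above, Lewis's Principal Principle is commonly glossed along the following lines; this is the standard textbook schema rather than the paper's own formulation:

    \[
      P\bigl(A \,\big|\, \langle \mathrm{ch}(A) = x \rangle \wedge E\bigr) \;=\; x,
    \]

    where ch(A) is the objective chance of A and E is any admissible evidence.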
  • Higher-Order Beliefs and the Undermining Problem for Bayesianism. Lisa Cassell - 2019 - Acta Analytica 34 (2):197-213.
    Jonathan Weisberg has argued that Bayesianism’s rigid updating rules make Bayesian updating incompatible with undermining defeat. In this paper, I argue that when we attend to the higher-order beliefs we must ascribe to agents in the kinds of cases Weisberg considers, the problem he raises disappears. Once we acknowledge the importance of higher-order beliefs to the undermining story, we are led to a different understanding of how these cases arise. And on this different understanding of things, the rigid nature of (...)
  • Diachronic Dutch Books and Evidential Import. J. Dmitri Gallow - 2019 - Philosophy and Phenomenological Research 99 (1):49-80.
    A handful of well-known arguments (the 'diachronic Dutch book arguments') rely upon theorems establishing that, in certain circumstances, you are immune from sure monetary loss (you are not 'diachronically Dutch bookable') if and only if you adopt the strategy of conditionalizing (or Jeffrey conditionalizing) on whatever evidence you happen to receive. These theorems require non-trivial assumptions about which evidence you might acquire---in the case of conditionalization, the assumption is that, if you might learn that e, then it is not the (...)
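    To make the diachronic Dutch book talk in the abstract concrete, here is a minimal numerical sketch of the classic Lewis-Teller style construction against an agent who plans to deviate from conditionalization. The prior values and variable names below are illustrative assumptions, and the construction is the standard textbook one, not a reproduction of Gallow's theorems.

```python
# A standard diachronic Dutch book construction, checked numerically.
# The agent plans to adopt credence q_H in H upon learning E, where
# q_H differs from her prior conditional credence p(H|E).
from itertools import product

p_E = 0.5          # prior probability of the evidence E (assumed)
p_H_given_E = 0.8  # prior conditional probability of H given E (assumed)
q_H = 0.6          # planned post-learning credence in H (not a conditionalizer)

p_HE = p_H_given_E * p_E  # prior probability of H & E

def net_payoff(h: bool, e: bool) -> float:
    """Agent's total net payoff across three trades, each fair by her own lights."""
    # Bet 1, bought at t0: pays $1 if H & E, price p(H & E).
    bet1 = (1.0 if (h and e) else 0.0) - p_HE
    # Bet 2, bought at t0: pays $q_H if not-E, price q_H * p(not-E).
    bet2 = (q_H if not e else 0.0) - q_H * (1 - p_E)
    # At t1, if E is learned, the agent sells a bet paying $1 if H for $q_H,
    # which is fair given her new credence q_H.
    t1_trade = (q_H - (1.0 if h else 0.0)) if e else 0.0
    return bet1 + bet2 + t1_trade

for h, e in product([True, False], repeat=2):
    print(f"H={h!s:5} E={e!s:5} net payoff = {net_payoff(h, e):+.3f}")
# Every case prints the same sure loss, -(p_H_given_E - q_H) * p_E = -0.100.
```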
  • Précis and replies to contributors for book symposium on accuracy and the laws of credence. Richard Pettigrew - 2017 - Episteme 14 (1):1-30.
    This book symposium on Accuracy and the Laws of Credence consists of an overview of the book’s argument by the author, Richard Pettigrew, together with four commentaries on different aspects of that argument. Ben Levinstein challenges the characterisation of the legitimate measures of inaccuracy that plays a central role in the arguments of the book. Julia Staffel asks whether the arguments of the book are compatible with an ontology of doxastic states that includes full beliefs as well as credences. Fabrizio Cariani raises (...)
  • Conditionalization Does Not Maximize Expected Accuracy. Miriam Schoenfield - 2017 - Mind 126 (504):1155-1187.
    Greaves and Wallace argue that conditionalization maximizes expected accuracy. In this paper I show that their result only applies to a restricted range of cases. I then show that the update procedure that maximizes expected accuracy in general is one in which, upon learning P, we conditionalize, not on P, but on the proposition that we learned P. After proving this result, I provide further generalizations and show that much of the accuracy-first epistemology program is committed to KK-like iteration principles (...)
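    In schematic terms, the contrast the abstract draws is between plain conditionalization on P and conditionalization on the proposition that P was learned (the symbols below are a gloss, not the paper's notation; ⟨learned P⟩ abbreviates that proposition):

    \[
      p_{\mathrm{new}}(\cdot) = p_{\mathrm{old}}(\cdot \mid P)
      \qquad\mbox{versus}\qquad
      p_{\mathrm{new}}(\cdot) = p_{\mathrm{old}}(\cdot \mid \langle \mbox{learned } P \rangle).
    \]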
  • The constraint rule of the maximum entropy principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
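    The 'constraint rule' at issue is, in its usual form, the prescription to choose the distribution of maximum entropy subject to the constraint that the expectation of an observable equals its empirical average. Schematically (a generic statement, not the paper's notation):

    \[
      \max_{p} \; H(p) = -\sum_{i} p_i \log p_i
      \quad\mbox{subject to}\quad
      \sum_{i} p_i \, f(x_i) = \bar{f}, \qquad \sum_{i} p_i = 1,
    \]

    where \bar{f} is the observed sample mean of f; it is this identification of expectation with sample average that the paper examines.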
  • Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • Finite additivity, another lottery paradox and conditionalisation. Colin Howson - 2014 - Synthese 191 (5):1-24.
    In this paper I argue that de Finetti provided compelling reasons for rejecting countable additivity. It is ironical therefore that the main argument advanced by Bayesians against following his recommendation is based on the consistency criterion, coherence, he himself developed. I will show that this argument is mistaken. Nevertheless, there remain some counter-intuitive consequences of rejecting countable additivity, and one in particular has all the appearances of a full-blown paradox. I will end by arguing that in fact it is no (...)
  • Rational Belief and Probability Kinematics. Bas C. Van Fraassen - 1980 - Philosophy of Science 47 (2):165-187.
    A general form is proposed for epistemological theories, the relevant factors being: the family of epistemic judgments, the epistemic state, the epistemic commitment, and the family of possible epistemic inputs. First a simple theory is examined in which the states are probability functions, and the subject of probability kinematics introduced by Richard Jeffrey is explored. Then a second theory is examined in which the state has as constituents a body of information and a recipe that determines the accepted epistemic judgments (...)
  • Changing minds in a changing world. Wolfgang Schwarz - 2012 - Philosophical Studies 159 (2):219-239.
    I defend a general rule for updating beliefs that takes into account both the impact of new evidence and changes in the subject’s location. The rule combines standard conditioning with a shifting operation that moves the center of each doxastic possibility forward to the next point where information arrives. I show that well-known arguments for conditioning lead to this combination when centered information is taken into account. I also discuss how my proposal relates to other recent proposals, what results it (...)
  • On the Revision of Probabilistic Belief States. Craig Boutilier - 1995 - Notre Dame Journal of Formal Logic 36 (1):158-183.
    In this paper we describe two approaches to the revision of probability functions. We assume that a probabilistic state of belief is captured by a counterfactual probability or Popper function, the revision of which determines a new Popper function. We describe methods whereby the original function determines the nature of the revised function. The first is based on a probabilistic extension of Spohn's OCFs, whereas the second exploits the structure implicit in the Popper function itself. This stands in contrast with (...)
  • Contemporary Approaches to Statistical Mechanical Probabilities: A Critical Commentary - Part I: The Indifference Approach. Christopher J. G. Meacham - 2010 - Philosophy Compass 5 (12):1116-1126.
    This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The first of these articles provides a brief sketch of statistical mechanics, and discusses the indifference approach (...)
  • An Objective Justification of Bayesianism II: The Consequences of Minimizing Inaccuracy. Hannes Leitgeb & Richard Pettigrew - 2010 - Philosophy of Science 77 (2):236-272.
    One of the fundamental problems of epistemology is to say when the evidence in an agent’s possession justifies the beliefs she holds. In this paper and its prequel, we defend the Bayesian solution to this problem by appealing to the following fundamental norm: Accuracy An epistemic agent ought to minimize the inaccuracy of her partial beliefs. In the prequel, we made this norm mathematically precise; in this paper, we derive its consequences. We show that the two core tenets of Bayesianism (...)
  • Self-location is no problem for conditionalization. Darren Bradley - 2011 - Synthese 182 (3):393-411.
    How do temporal and eternal beliefs interact? I argue that acquiring a temporal belief should have no effect on eternal beliefs for an important range of cases. Thus, I oppose the popular view that new norms of belief change must be introduced for cases where the only change is the passing of time. I defend this position from the purported counter-examples of the Prisoner and Sleeping Beauty. I distinguish two importantly different ways in which temporal beliefs can be acquired and (...)
  • Non-bayesian foundations for statistical estimation, prediction, and the ravens example. Malcolm R. Forster - 1994 - Erkenntnis 40 (3):357-376.
    The paper provides a formal proof that efficient estimates of parameters, which vary as little as possible when measurements are repeated, may be expected to provide more accurate predictions. The definition of predictive accuracy is motivated by the work of Akaike (1973). Surprisingly, the same explanation provides a novel solution for a well known problem for standard theories of scientific confirmation — the Ravens Paradox. This is significant in light of the fact that standard Bayesian analyses of the paradox (...)
  • Varieties of Bayesianism. Jonathan Weisberg - 2011
    Handbook of the History of Logic, vol. 10, eds. Dov Gabbay, Stephan Hartmann, and John Woods, forthcoming.
  • Theories of probability. Colin Howson - 1995 - British Journal for the Philosophy of Science 46 (1):1-32.
    My title is intended to recall Terence Fine's excellent survey, Theories of Probability [1973]. I shall consider some developments that have occurred in the intervening years, and try to place some of the theories he discussed in what is now a slightly longer perspective. Completeness is not something one can reasonably hope to achieve in a journal article, and any selection is bound to reflect a view of what is salient. In a subject as prone to dispute as this, there (...)
  • Justifying conditionalization: Conditionalization maximizes expected epistemic utility. Hilary Greaves & David Wallace - 2006 - Mind 115 (459):607-632.
    According to Bayesian epistemology, the epistemically rational agent updates her beliefs by conditionalization: that is, her posterior subjective probability after taking account of evidence X, p_new, is to be set equal to her prior conditional probability p_old(·|X). Bayesians can be challenged to provide a justification for their claim that conditionalization is recommended by rationality—whence the normative force of the injunction to conditionalize? There are several existing justifications for conditionalization, but none directly addresses the idea that conditionalization will be epistemically rational (...)
  • Externalism and exploitability. Nilanjan Das - 2020 - Philosophy and Phenomenological Research 104 (1):101-128.
    According to Bayesian orthodoxy, an agent should update---or at least should plan to update---her credences by conditionalization. Some have defended this claim by means of a diachronic Dutch book argument. They say: an agent who does not plan to update her credences by conditionalization is vulnerable (by her own lights) to a diachronic Dutch book, i.e., a sequence of bets which, when accepted, guarantee loss of utility. Here, I show that this argument is in tension with evidence externalism, i.e., the (...)
  • Weighted averaging, Jeffrey conditioning and invariance. Denis Bonnay & Mikaël Cozic - 2018 - Theory and Decision 85 (1):21-39.
    Jeffrey conditioning tells an agent how to update her priors so as to grant a given probability to a particular event. Weighted averaging tells an agent how to update her priors on the basis of testimonial evidence, by changing to a weighted arithmetic mean of her priors and another agent’s priors. We show that, in their respective settings, these two seemingly so different updating rules are axiomatized by essentially the same invariance condition. As a by-product, this sheds new light on (...)
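    For orientation, the two updating rules compared in the abstract are standardly written as follows: Jeffrey conditioning on a partition {E_i} with new weights q_i, and weighted averaging with a peer's prior p' and weight λ (generic formulations; the invariance condition that axiomatizes them is the paper's contribution and is not reproduced here):

    \[
      p_{\mathrm{new}}(A) \;=\; \sum_{i} q_i \, p_{\mathrm{old}}(A \mid E_i),
      \qquad
      p_{\mathrm{new}}(A) \;=\; \lambda\, p_{\mathrm{old}}(A) + (1-\lambda)\, p'(A).
    \]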
  • Learning and Pooling, Pooling and Learning. Rush T. Stewart & Ignacio Ojea Quintana - 2018 - Erkenntnis 83 (3):1-21.
    We explore which types of probabilistic updating commute with convex IP pooling. Positive results are stated for Bayesian conditionalization, imaging, and a certain parameterization of Jeffrey conditioning. This last observation is obtained with the help of a slight generalization of a characterization of externally Bayesian pooling operators due to Wagner (2009). These results strengthen the case that pooling should go by imprecise probabilities since no precise pooling method is as versatile.
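    As a quick illustration of why commutativity of pooling and learning is a live issue (the background problem the abstract addresses, not a result from the paper): precise linear pooling generally fails to commute with Bayesian conditionalization. The toy numbers below are arbitrary assumptions.

```python
# "Pool then conditionalize" vs "conditionalize then pool" for linear pooling.
def conditionalize(p, event):
    """Bayesian conditionalization of a distribution over labelled worlds."""
    total = sum(p[w] for w in event)
    return {w: (p[w] / total if w in event else 0.0) for w in p}

def linear_pool(p1, p2, weight=0.5):
    """Weighted-average (linear) pooling of two distributions."""
    return {w: weight * p1[w] + (1 - weight) * p2[w] for w in p1}

p1 = {"w1": 0.6, "w2": 0.2, "w3": 0.2}  # first agent's prior (assumed)
p2 = {"w1": 0.1, "w2": 0.3, "w3": 0.6}  # second agent's prior (assumed)
E = {"w1", "w2"}                        # the evidence learned

pool_then_update = conditionalize(linear_pool(p1, p2), E)
update_then_pool = linear_pool(conditionalize(p1, E), conditionalize(p2, E))

print(pool_then_update)  # {'w1': 0.583..., 'w2': 0.416..., 'w3': 0.0}
print(update_then_pool)  # {'w1': 0.5, 'w2': 0.5, 'w3': 0.0} -- they disagree
```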
  • Foundations of Probability. Rachael Briggs - 2015 - Journal of Philosophical Logic 44 (6):625-640.
    The foundations of probability are viewed through the lens of the subjectivist interpretation. This article surveys conditional probability, arguments for probabilism, probability dynamics, and the evidential and subjective interpretations of probability.
  • On the Everettian epistemic problem. Hilary Greaves - 2006 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 38 (1):120-152.
    Recent work in the Everett interpretation has suggested that the problem of probability can be solved by understanding probability in terms of rationality. However, there are *two* problems relating to probability in Everett --- one practical, the other epistemic --- and the rationality-based program *directly* addresses only the practical problem. One might therefore worry that the problem of probability is only `half solved' by this approach. This paper aims to dispel that worry: a solution to the epistemic problem follows from (...)
  • The coherence argument against conditionalization. Matthias Hild - 1998 - Synthese 115 (2):229-258.
    I re-examine Coherence Arguments (Dutch Book Arguments, No Arbitrage Arguments) for diachronic constraints on Bayesian reasoning. I suggest replacing the usual game-theoretic coherence condition with a new decision-theoretic condition ('Diachronic Sure Thing Principle'). The new condition meets a large part of the standard objections against the Coherence Argument and frees it, in particular, from a commitment to additive utilities. It also facilitates the proof of the Converse Dutch Book Theorem. I first apply the improved Coherence Argument to van Fraassen's (...)
  • A problem for relative information minimizers in probability kinematics. Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  • Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
  • Non-additive degrees of belief. Rolf Haenni - 2009 - In Franz Huber & Christoph Schmidt-Petri (eds.), Degrees of belief. London: Springer. pp. 121-159.
  • Simultaneous belief updates via successive Jeffrey conditionalization. Ilho Park - 2013 - Synthese 190 (16):3511-3533.
    This paper discusses simultaneous belief updates. I argue here that modeling such belief updates using the Principle of Minimum Information can be regarded as applying Jeffrey conditionalization successively, so that, contrary to what many probabilists have thought, simultaneous belief updates can be successfully modeled by means of Jeffrey conditionalization.
  • Maximum entropy inference as a special case of conditionalization. Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  • Maximum Entropy and Probability Kinematics Constrained by Conditionals. Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in (...)
  • Aggregating incoherent agents who disagree. Richard Pettigrew - 2019 - Synthese 196 (7):2737-2776.
    In this paper, we explore how we should aggregate the degrees of belief of a group of agents to give a single coherent set of degrees of belief, when at least some of those agents might be probabilistically incoherent. There are a number of ways of aggregating degrees of belief, and there are a number of ways of fixing incoherent degrees of belief. When we have picked one of each, should we aggregate first and then fix, or fix first and (...)
  • Epistemic importance and minimal changes of belief. Peter Gärdenfors - 1984 - Australasian Journal of Philosophy 62 (2):136-157.
  • Review of Bayesian Philosophy of Science. [REVIEW] Olav Benjamin Vassend - 2023 - Erkenntnis 88 (5):2245-2249.
  • Open-Minded Orthodox Bayesianism by Epsilon-Conditionalization. Eric Raidl - 2020 - British Journal for the Philosophy of Science 71 (1):139-176.
    Orthodox Bayesianism endorses revising by conditionalization. This paper investigates the zero-raising problem, or equivalently the certainty-dropping problem of orthodox Bayesianism: previously neglected possibilities remain neglected, although the new evidence might suggest otherwise. Yet, one may want to model open-minded agents, that is, agents capable of raising previously neglected possibilities. Different reasons can be given for open-mindedness, one of which is fallibilism. The paper proposes a family of open-minded propositional revisions depending on a parameter ϵ. The basic idea is this: first (...)
  • Objective Bayesianism, Bayesian conditionalisation and voluntarism. Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)
  • On Indeterminate Updating of Credences. Leendert Huisman - 2014 - Philosophy of Science 81 (4):537-557.
    The strategy of updating credences by minimizing the relative entropy has been questioned by many authors, most strongly by means of the Judy Benjamin puzzle. I present a new analysis of Judy Benjamin–like forms of new information and defend the thesis that in general the rational posterior is indeterminate, meaning that a family of posterior credence functions rather than a single one is the rational response when that type of information becomes available. The proposed thesis extends naturally to all cases (...)
  • Bayesianism and language change. Jon Williamson - 2003 - Journal of Logic, Language and Information 12 (1):53-97.
    Bayesian probability is normally defined over a fixed language or event space. But in practice language is susceptible to change, and the question naturally arises as to how Bayesian degrees of belief should change as language changes. I argue here that this question poses a serious challenge to Bayesianism. The Bayesian may be able to meet this challenge, however, and I outline a practical method for changing degrees of belief over changes in finite propositional languages.
  • Determining Maximal Entropy Functions for Objective Bayesian Inductive Logic. Juergen Landes, Soroush Rafiee Rad & Jon Williamson - 2022 - Journal of Philosophical Logic 52 (2):555-608.
    According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a limit in (...)
  • Objective Bayesianism with predicate languages. Jon Williamson - 2008 - Synthese 163 (3):341-356.
    Objective Bayesian probability is often defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap’s continuum of inductive methods.
  • Bayesian rules of updating. Colin Howson - 1996 - Erkenntnis 45 (2-3):195-208.
    This paper discusses the Bayesian updating rules of ordinary and Jeffrey conditionalisation. Their justification has been a topic of interest for the last quarter century, and several strategies proposed. None has been accepted as conclusive, and it is argued here that this is for a good reason; for by extending the domain of the probability function to include propositions describing the agent's present and future degrees of belief one can systematically generate a class of counterexamples to the rules. Dynamic Dutch (...)
  • Bayesian conditionalization and probability kinematics. Colin Howson & Allan Franklin - 1994 - British Journal for the Philosophy of Science 45 (2):451-466.
  • Probability kinematics and representation of belief change. Zoltan Domotor - 1980 - Philosophy of Science 47 (3):384-403.
    Bayesian, Jeffrey and Field conditionals are compared and it is shown why the last two cannot be reduced to the first. Maximum relative entropy is used in two kinds of justification of the Field conditional and the dispensability of entropy principles in general is discussed.
  • The Application of Constraint Semantics to the Language of Subjective Uncertainty. Eric Swanson - 2016 - Journal of Philosophical Logic 45 (2):121-146.
    This paper develops a compositional, type-driven constraint semantic theory for a fragment of the language of subjective uncertainty. In the particular application explored here, the interpretation function of constraint semantics yields not propositions but constraints on credal states as the semantic values of declarative sentences. Constraints are richer than propositions in that constraints can straightforwardly represent assessments of the probability that the world is one way rather than another. The richness of constraints helps us model communicative acts in essentially the (...)
  • Updating, supposing, and maxent. Brian Skyrms - 1987 - Theory and Decision 22 (3):225-246.
  • Radical Pooling and Imprecise Probabilities. Ignacio Ojea Quintana - forthcoming - Erkenntnis:1-28.
    This paper focuses on radical pooling, or the question of how to aggregate credences when there is a fundamental disagreement about which is the relevant logical space for inquiry. The solution advanced is based on the notion of consensus as common ground, where agents can find it by suspending judgment on logical possibilities. This is exemplified with cases of scientific revolution. On a formal level, the proposal uses algebraic joins and imprecise probabilities; which is shown to be compatible with the (...)