  • Carnapian truthlikeness.Gustavo Cevolani - 2016 - Logic Journal of the IGPL 24 (4):542-556.
    Theories of truthlikeness (or verisimilitude) are currently being classified according to two independent distinctions: that between ‘content’ and ‘likeness’ accounts, and that between ‘conjunctive’ and ‘disjunctive’ ones. In this article, I present and discuss a new definition of truthlikeness, which employs Carnap’s notion of the content elements entailed by a theory or proposition, and is then labelled ‘Carnapian’. After studying in detail the properties and shortcomings of this definition, I argue that it occupies a unique position in the landscape of (...)
  • Keep Changing Your Beliefs, Aiming for the Truth.Alexandru Baltag & Sonja Smets - 2011 - Erkenntnis 75 (2):255-270.
    We investigate the process of truth-seeking by iterated belief revision with higher-level doxastic information. We elaborate further on the main results in Baltag and Smets (Proceedings of TARK, 2009a; Proceedings of WOLLIC’09, LNAI 5514, 2009b), applying them to the issue of convergence to truth. We study the conditions under which the belief revision induced by a series of truthful iterated upgrades eventually stabilizes on true beliefs. We give two different conditions ensuring that beliefs converge to “full” (...)
  • (1 other version)Predictive accuracy as an achievable goal of science.Malcolm R. Forster - 2002 - Proceedings of the Philosophy of Science Association 2002 (3):S124-S134.
    What has science actually achieved? A theory of achievement should define what has been achieved, describe the means or methods used in science, and explain how such methods lead to such achievements. Predictive accuracy is one truth‐related achievement of science, and there is an explanation of why common scientific practices tend to increase predictive accuracy. Akaike’s explanation for the success of AIC is limited to interpolative predictive accuracy. But therein lies the strength of the general framework, for it also provides (...)
  • Getting Accurate about Knowledge.Sam Carter & Simon Goldstein - 2022 - Mind 132 (525):158-191.
    There is a large literature exploring how accuracy constrains rational degrees of belief. This paper turns to the unexplored question of how accuracy constrains knowledge. We begin by introducing a simple hypothesis: increases in the accuracy of an agent’s evidence never lead to decreases in what the agent knows. We explore various precise formulations of this principle, consider arguments in its favour, and explain how it interacts with different conceptions of evidence and accuracy. As we show, the principle has some (...)
  • Levin and Ghins on the “no miracle” argument and naturalism.Mario Alai - 2012 - European Journal for Philosophy of Science 2 (1):85-110.
    On the basis of Levin’s claim that truth is not a scientific explanatory factor, Michel Ghins argues that the “no miracle” argument (NMA) is not scientific, therefore scientific realism is not a scientific hypothesis, and naturalism is wrong. I argue that there are genuine senses of ‘scientific’ and ‘explanation’ in which truth can yield scientific explanations. Hence, the NMA can be considered scientific in the sense that it hinges on a scientific explanation, it follows a typically scientific inferential pattern (IBE), (...)
  • Translation Invariance and Miller’s Weather Example.J. B. Paris & A. Vencovská - 2019 - Journal of Logic, Language and Information 28 (4):489-514.
    In his 1974 paper “Popper’s qualitative theory of verisimilitude”, published in the British Journal for the Philosophy of Science, David Miller gave his so-called ‘Weather Example’ to argue that the Hamming distance between constituents is flawed as a measure of proximity to truth, since the former is not, unlike the latter, translation invariant. In the present paper we generalise David Miller’s Weather Example in both the unary and polyadic cases, characterising precisely which permutations of constituents/atoms can be effected by (...)
  • (1 other version)A partial consequence account of truthlikeness.Gustavo Cevolani & Roberto Festa - 2018 - Synthese:1-20.
    Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the less false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial entailment in defining (...)
  • Scoring in context.Igor Douven - 2020 - Synthese 197 (4):1565-1580.
    A number of authors have recently put forward arguments pro or contra various rules for scoring probability estimates. In doing so, they have skipped over a potentially important consideration in making such assessments, to wit, that the hypotheses whose probabilities are estimated can approximate the truth to different degrees. Once this is recognized, it becomes apparent that the question of how to assess probability estimates depends heavily on context.
  • Approaching Truth by Resolving Questions.Jakob Süskind - forthcoming - British Journal for the Philosophy of Science.
  • Approaching probabilistic truths: introduction to the Topical Collection.Ilkka Niiniluoto, Gustavo Cevolani & Theo Kuipers - 2022 - Synthese 200 (2):1-8.
    After Karl Popper’s original work, several approaches were developed to provide a sound explication of the notion of verisimilitude. With few exceptions, these contributions have assumed that the truth to be approximated is deterministic. This collection of ten papers addresses the more general problem of approaching probabilistic truths. They include attempts to find appropriate measures for the closeness to probabilistic truth and to evaluate claims about such distances on the basis of empirical evidence. The papers employ multiple analytical approaches, and (...)
  • Truthlikeness for probabilistic laws.Alfonso García-Lapeña - 2021 - Synthese 199 (3-4):9359-9389.
    Truthlikeness is a property of a theory or a proposition that represents its closeness to the truth. We start by summarizing Niiniluoto’s proposal of truthlikeness for deterministic laws, which defines truthlikeness as a function of accuracy, and García-Lapeña’s expanded version, which defines truthlikeness for DL as a function of two factors, accuracy and nomicity. Then, we move to develop an appropriate definition of truthlikeness for probabilistic laws based on Niiniluoto’s suggestion to use the Kullback–Leibler divergence to define the distance between (...)
  • Escaping the Fundamental Dichotomy of Scientific Realism.Shahin Kaveh - 2023 - British Journal for the Philosophy of Science 74 (4):999-1025.
    The central motivation behind the scientific realism debate is explaining the impressive success of scientific theories. The debate has been dominated by two rival types of explanations: the first relies on some sort of static, referentially transparent relationship between the theory and the unobservable world, such as truthlikeness, representation, or structural similarity; the second relies on no robust relationship between the theory and unobservable reality at all, and instead draws on predictive similarity and the stringent methodology of science to explain (...)
  • (1 other version)Truthlikeness: old and new debates.Ilkka Niiniluoto - 2020 - Synthese 197 (4):1581-1599.
    The notion of truthlikeness or verisimilitude has been a topic of intensive discussion ever since the definition proposed by Karl Popper was refuted in 1974. This paper gives an analysis of old and new debates about this notion. There is a fairly large agreement about the truthlikeness ordering of conjunctive theories, but the main rival approaches differ especially about false disjunctive theories. Continuing the debate between Niiniluoto’s min-sum measure and Schurz’s relevant consequence measure, the paper also gives a critical assessment (...)
  • Tracking probabilistic truths: a logic for statistical learning.Alexandru Baltag, Soroush Rafiee Rad & Sonja Smets - 2021 - Synthese 199 (3-4):9041-9087.
    We propose a new model for forming and revising beliefs about unknown probabilities. To go beyond what is known with certainty and represent the agent’s beliefs about probability, we consider a plausibility map, associating to each possible distribution a plausibility ranking. Beliefs are defined as in Belief Revision Theory, in terms of truth in the most plausible worlds. We consider two forms of conditioning or belief update, corresponding to the acquisition of two types of information: learning observable evidence obtained by (...)
  • Predicting the Past from Minimal Traces: Episodic Memory and its Distinction from Imagination and Preservation.Markus Werning - 2020 - Review of Philosophy and Psychology 11 (2):301-333.
    The paper develops an account of minimal traces devoid of representational content and exploits an analogy to a predictive processing framework of perception. As perception can be regarded as a prediction of the present on the basis of sparse sensory inputs without any representational content, episodic memory can be conceived of as a “prediction of the past” on the basis of a minimal trace, i.e., an informationally sparse, merely causal link to a previous experience. The resulting notion of episodic memory (...)
  • Verisimilitude and belief change for nomic conjunctive theories.Gustavo Cevolani, Roberto Festa & Theo A. F. Kuipers - 2013 - Synthese 190 (16):3307-3324.
    In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify (...)
  • Truthlikeness for Quantitative Deterministic Laws.Alfonso García-Lapeña - 2023 - British Journal for the Philosophy of Science 74 (3):649-679.
    Truthlikeness is a property of a theory or a proposition that represents its closeness to the truth. According to Niiniluoto, truthlikeness for quantitative deterministic laws can be defined by the Minkowski metric. I present some counterexamples to the definition and argue that it fails because it considers truthlikeness for quantitative deterministic laws to be just a function of accuracy, but an accurate law can be wrong about the actual ‘structure’ or ‘behaviour’ of the system it intends to describe. I develop (...)
  • New Semantics for Bayesian Inference: The Interpretive Problem and Its Solutions.Olav Benjamin Vassend - 2019 - Philosophy of Science 86 (4):696-718.
    Scientists often study hypotheses that they know to be false. This creates an interpretive problem for Bayesians because the probability assigned to a hypothesis is typically interpreted as the probability that the hypothesis is true. I argue that solving the interpretive problem requires coming up with a new semantics for Bayesian inference. I present and contrast two new semantic frameworks, and I argue that both of them support the claim that there is pervasive pragmatic encroachment on whether a given Bayesian (...)
  • Truth approximation, belief merging, and peer disagreement.Gustavo Cevolani - 2014 - Synthese 191 (11):2383-2401.
    In this paper, we investigate the problem of truth approximation via belief merging, i.e., we ask whether, and under what conditions, a group of inquirers merging together their beliefs makes progress toward the truth about the underlying domain. We answer this question by proving some formal results on how belief merging operators perform with respect to the task of truth approximation, construed as increasing verisimilitude or truthlikeness. Our results shed new light on the issue of how rational (dis)agreement affects the (...)
  • Approaching Truth in Conceptual Spaces.Gustavo Cevolani - 2020 - Erkenntnis 85 (6):1485-1500.
    Knowledge representation is a central issue in a number of areas, but few attempts are usually made to bridge different approaches across different fields. As a contribution in this direction, in this paper I focus on one such approach, the theory of conceptual spaces developed within cognitive science, and explore its potential applications in the fields of philosophy of science and formal epistemology. My case-study is provided by the theory of truthlikeness, construed as closeness to “the whole truth” about a (...)
  • Realists Waiting for Godot? The Verisimilitudinarian and the Cumulative Approach to Scientific Progress.Andrea Roselli - 2020 - Erkenntnis 85 (5):1071-1084.
    After a brief presentation of the Verisimilitudinarian approach to scientific progress, I argue that the notion of estimated verisimilitude is too weak for the purposes of scientific realism. Despite the realist-correspondist intuition that inspires the model—the idea that our theories get closer and closer to ‘the real way the world is’—, Bayesian estimations of truthlikeness are not objective enough to sustain a realist position. The main argument of the paper is that, since estimated verisimilitude is not connected to actual verisimilitude, (...)
  • Verisimilitude and Content.Ken Gemes - 2007 - Synthese 154 (2):293-306.
    Popper’s original definition of verisimilitude in terms of comparisons of truth content and falsity content has known counter-examples. More complicated approaches have met with mixed success. This paper uses a new account of logical content to develop a definition of verisimilitude that is close to Popper’s original account. It is claimed that Popper’s mistake was to couch his account of truth and falsity content in terms of true and false consequences. Comparison to a similar approach by Schurz and Weingartner shows (...)
  • Truth approximation via abductive belief change.Gustavo Cevolani - 2013 - Logic Journal of the IGPL 21 (6):999-1016.
    We investigate the logical and conceptual connections between abductive reasoning construed as a process of belief change, on the one hand, and truth approximation, construed as increasing (estimated) verisimilitude, on the other. We introduce the notion of ‘(verisimilitude-guided) abductive belief change’ and discuss under what conditions abductively changing our theories or beliefs does lead them closer to the truth, and hence tracks truth approximation conceived as the main aim of inquiry. The consequences of our analysis for some recent discussions concerning (...)
  • Complexity and verisimilitude: Realism for ecology. [REVIEW]Gregory M. Mikkelson - 2001 - Biology and Philosophy 16 (4):533-546.
    When data are limited, simple models of complex ecological systems tend to wind up closer to the truth than more complex models of the same systems. This greater proximity to the truth, or verisimilitude, leads to greater predictive success. When more data are available, the advantage of simplicity decreases, and more complex models may gain the upper hand. In ecology, holistic models are usually simpler than reductionistic models. Thus, when data are limited, holistic models have an advantage over reductionistic models, (...)
  • (2 other versions)A Verisimilitude Framework for Inductive Inference, with an Application to Phylogenetics.Olav B. Vassend - 2018 - British Journal for the Philosophy of Science 71 (4):1359-1383.
    Bayesianism and likelihoodism are two of the most important frameworks philosophers of science use to analyse scientific methodology. However, both frameworks face a serious objection: much scientific inquiry takes place in highly idealized frameworks where all the hypotheses are known to be false. Yet, both Bayesianism and likelihoodism seem to be based on the assumption that the goal of scientific inquiry is always truth rather than closeness to the truth. Here, I argue in favour of a verisimilitude framework for inductive (...)
  • (1 other version)Misrepresentation in Context.Woosuk Park - 2014 - Foundations of Science 19 (4):363-374.
    We can witness the recent surge of interest in the interaction between cognitive science, philosophy of science, and aesthetics on the problem of representation. This naturally leads us to rethinking the achievements of Goodman’s monumental book Languages of Art. For, there is no doubt that no one else contributed more than Goodman to throw a light on the cognitive function of art. Ironically, it could be also Goodman who has been the stumbling block for a unified theory of representation. In (...)
  • Verisimilitude and Belief Change for Conjunctive Theories.Gustavo Cevolani, Vincenzo Crupi & Roberto Festa - 2011 - Erkenntnis 75 (2):183-202.
    Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from those two lines of epistemological research do yield convergences relative to a specified kind of theories, here labeled “conjunctive”. In this domain, a set of plausible conditions are identified which (...)
  • Bootstrap Confirmation Made Quantitative.Igor Douven & Wouter Meijs - 2006 - Synthese 149 (1):97-132.
    Glymour’s theory of bootstrap confirmation is a purely qualitative account of confirmation; it allows us to say that the evidence confirms a given theory, but not that it confirms the theory to a certain degree. The present paper extends Glymour’s theory to a quantitative account and investigates the resulting theory in some detail. It also considers the question how bootstrap confirmation relates to justification.
  • Revising Beliefs Towards the Truth.Ilkka Niiniluoto - 2011 - Erkenntnis 75 (2):165-181.
    Belief revision (BR) and truthlikeness (TL) emerged independently as two research programmes in formal methodology in the 1970s. A natural way of connecting BR and TL is to ask under what conditions the revision of a belief system by new input information leads the system towards the truth. It turns out that, for the AGM model of belief revision, the only safe case is the expansion of true beliefs by true input, but this is not very interesting or realistic as (...)
  • Verisimilitude, cross classification and prediction logic. Approaching the statistical truth by falsified qualitative theories.Roberto Festa - 2007 - Mind and Society 6 (1):91-114.
    In this paper it is argued that qualitative theories (Q-theories) can be used to describe the statistical structure of cross classified populations and that the notion of verisimilitude provides an appropriate tool for measuring the statistical adequacy of Q-theories. First of all, a short outline of the post-Popperian approaches to verisimilitude and of the related verisimilitudinarian non-falsificationist methodologies (VNF-methodologies) is given. Secondly, the notion of Q-theory is explicated, and the qualitative verisimilitude of Q-theories is defined. Afterwards, appropriate measures for the (...)
  • The Verisimilitudinarian approach to ‘the Truth’.Andrea Roselli - 2017 - Perspectives 7 (1):32-39.
    The Verisimilitudinarian approach to scientific progress (VS, for short) is traditionally considered a realist-correspondist model to explain the proximity of our best scientific theories to the way things really are in the world out there (‘the Truth’, with a capital ‘T’). However, VS is based on notions, such as ‘estimated verisimilitude’ or ‘approximate truth’, that dilute the model into a functionalist-like theory. My thesis, then, is that VS tries to incorporate notions, such as ‘progress’, in a pre-constituted metaphysical conception of (...)
  • A Davidsonian argument against incommensurability.Igor Douven & Henk W. De Regt - 2002 - International Studies in the Philosophy of Science 16 (2):157-169.
    The writings of Kuhn and Feyerabend on incommensurability challenged the idea that science progresses towards the truth. Davidson famously criticized the notion of incommensurability, arguing that it is incoherent. Davidson's argument was in turn criticized by Kuhn and others. This article argues that, although at least some of the objections raised against Davidson's argument are formally correct, they do it very little harm. What remains of the argument once the objections have been taken account of is still quite damaging to (...)
  • Approaching probabilistic laws.Ilkka Niiniluoto - 2021 - Synthese 199 (3-4):10499-10519.
    In the general problem of verisimilitude, we try to define the distance of a statement from a target, which is an informative truth about some domain of investigation. For example, the target can be a state description, a structure description, or a constituent of a first-order language. In the problem of legisimilitude, the target is a deterministic or universal law, which can be expressed by a nomic constituent or a quantitative function involving the operators of physical necessity and possibility. The (...)
  • Scoring, truthlikeness, and value.Igor Douven - 2021 - Synthese 199 (3-4):8281-8298.
    There is an ongoing debate about which rule we ought to use for scoring probability estimates. Much of this debate has been premised on scoring-rule monism, according to which there is exactly one best scoring rule. In previous work, I have argued against this position. The argument given there was based on purely a priori considerations, notably the intuition that scoring rules should be sensitive to truthlikeness relations if, and only if, such relations are present among whichever hypotheses are at (...)
  • The Structure of Idealization in Biological Theories: The Case of the Wright-Fisher Model. [REVIEW]Xavier Donato Rodríguez & Alfonso Arroyo Santos - 2012 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 43 (1):11-27.
    In this paper we present a new framework of idealization in biology. We characterize idealizations as a network of counterfactual and hypothetical conditionals that can exhibit different “degrees of contingency”. We use this idea to say that, in departing more or less from the actual world, idealizations can serve numerous epistemic, methodological or heuristic purposes within scientific research. We defend that, in part, this structure explains why idealizations, despite being deformations of reality, are so successful in scientific practice. For illustrative (...)
  • Verisimilitude and the scientific strategy of economic theory.Jesús P. Zamora Bonilla - 1999 - Journal of Economic Methodology 6 (3):331-350.
  • The old evidence problem and the inference to the best explanation.Cristina Sagrafena - 2023 - European Journal for Philosophy of Science 13 (1):1-18.
    The Problem of Old Evidence (POE) states that Bayesian confirmation theory cannot explain why a theory H can be confirmed by a piece of evidence E already known. Different dimensions of POE have been highlighted. Here, I consider the dynamic and static dimension. In the former, we want to explain how the discovery that H accounts for E confirms H. In the latter, we want to understand why E is and will be a reason to prefer H over its competitors. (...)
  • Introduction and Overview.Theo Kuipers & Gerhard Schurz - 2011 - Erkenntnis 75 (2):151-163.
    Introduction and overview to the special issue. Erkenntnis 75 (2): 151-163. DOI 10.1007/s10670-011-9288-9. Authors: Theo Kuipers (Faculty of Philosophy, University of Groningen, The Netherlands) and Gerhard Schurz (Department of Philosophy, University of Duesseldorf, Germany).
  • Science: the rules of the game.Jesús Zamora-Bonilla - 2010 - Logic Journal of the IGPL 18 (2):294-307.
    Popper’s suggestion of taking methodological norms as conventions is examined from the point of view of game theory. The game of research is interpreted as a game of persuasion, in the sense that every scientist tries to advance claims, and that her winning the game consists in her colleagues accepting some of those claims as the conclusions of some arguments. Methodological norms are seen as elements in a contract established amongst researchers, that says what inferential moves are legitimate or compulsory (...)