  • Demonstrative induction, old and new evidence and the accuracy of the electrostatic inverse square law. Ronald Laymon - 1994 - Synthese 99 (1): 23-58.
    Maxwell claimed that the electrostatic inverse square law could be deduced from Cavendish's spherical condenser experiment. This is true only if the accuracy claims made by Cavendish and Maxwell are ignored, for both used the inverse square law as a premise in their analyses of experimental accuracy. By so doing, they assumed the very law the accuracy of which the Cavendish experiment was supposed to test. This paper attempts to make rational sense of this apparently circular procedure and to relate (...)
  • Approaching Truth by Resolving Questions. Jakob Süskind - forthcoming - British Journal for the Philosophy of Science.
  • Idealización: concepción estructuralista y generalización modelo-teórica [Idealization: structuralist conception and model-theoretic generalization]. Xavier de Donato Rodríguez & Marek Polanski - 2015 - Metatheoria – Revista de Filosofía e Historia de la Ciencia 5: 45-55.
  • (1 other version) Truthlikeness: old and new debates. Ilkka Niiniluoto - 2020 - Synthese 197 (4): 1581-1599.
    The notion of truthlikeness or verisimilitude has been a topic of intensive discussion ever since the definition proposed by Karl Popper was refuted in 1974. This paper gives an analysis of old and new debates about this notion. There is a fairly large agreement about the truthlikeness ordering of conjunctive theories, but the main rival approaches differ especially about false disjunctive theories. Continuing the debate between Niiniluoto’s min-sum measure and Schurz’s relevant consequence measure, the paper also gives a critical assessment (...)
  • Diagnostics in computational organic chemistry. Grant Fisher - 2016 - Foundations of Chemistry 18 (3): 241-262.
    Focusing on computational studies of pericyclic reactions from the late twentieth century into the twenty-first century, this paper argues that computational diagnostics is a key methodological development that characterizes the management and coordination of plural approximation methods in computational organic chemistry. Predictive divergence between semi-empirical and ab initio approximation methods in the study of pericyclic reactions has issued in epistemic dissent. This has resulted in the use of diagnostics to unpack computational greyboxes in order to critically assess the effect of (...)
  • (1 other version) Survey article. Verisimilitude: the third period. Ilkka Niiniluoto - 1998 - British Journal for the Philosophy of Science 49 (1): 1-29.
    The modern history of verisimilitude can be divided into three periods. The first began in 1960, when Karl Popper proposed his qualitative definition of what it is for one theory to be more truthlike than another theory, and lasted until 1974, when David Miller and Pavel Tichý published their refutation of Popper's definition. The second period started immediately with the attempt to explicate truthlikeness by means of relations of similarity or resemblance between states of affairs (or their linguistic representations); the (...)
  • How (not) to think about idealisation and ceteris paribus-laws. Robert Kowalenko - 2009 - Synthese 167 (1): 183-201.
    "Semantic dispositionalism" is the theory that a speaker's meaning something by a given linguistic symbol is determined by her dispositions to use the symbol in a certain way. According to an objection by Kripke, further elaborated in Kusch (2005: 156–163), semantic dispositionalism involves ceteris paribus-clauses and idealisations, such as unbounded memory, that deviate from standard scientific methodology. I argue that Kusch misrepresents both ceteris paribus-laws and idealisation, neither of which factually "approximate" the behaviour of agents or the course of events, (...)
  • Experimentation and the legitimacy of idealization. Ronald Laymon - 1995 - Philosophical Studies 77 (2-3): 353-375.
  • (1 other version) Computer Simulations, Idealizations and Approximations. Ronald Laymon - 1990 - PSA Proceedings of the Biennial Meeting of the Philosophy of Science Association 1990 (2): 519-534.
    It’s uncontroversial that notions of idealization and approximation are central to understanding computer simulations and their rationale. So, for example, one common form of computer simulation is to abandon a realistic approach that is computationally non-tractable for a more idealized but computationally tractable approach. Many simulations of systems of interacting members can be understood this way. In such simulations, realistic descriptions of individual members are replaced with less realistic descriptions which have the virtue of making interactions computationally tractable. Such simulations (...)
  • Consciousness and inference to the best explanation: Compiling empirical evidence supporting the access-phenomenal distinction and the overflow hypothesis. Asger Kirkeby-Hinrup & Peter Fazekas - 2021 - Consciousness and Cognition 94 (C): 103173.
    A tacit assumption in the field of consciousness studies is that the more empirical evidence a theory can explain, the better it fares when weighed against competitors. If one wants to take seriously the potential for empirical evidence to move forward debates in consciousness studies, there is a need to gather, organize, validate, and compare evidence. We present an inference to the best explanation (IBE) process on the basis of empirical support that is applicable in debates between competing theories of (...)
  • Fictionalism, Semantics, and Ontology. Gordon Michael Purves - 2018 - Perspectives on Science 26 (1): 52-75.
    In a previous article, I argued that some recent philosophical work on the use of fictions in science is, while illuminating about some aspects of current scientific practice, unduly limited to cases of well-established fictions. In other words, this earlier work contended, a philosophical account of scientific fictions can do more than merely describe scientific practices, but can aid in the resolution of disputes about the proper interpretation of scientific theories and the epistemic status of some scientific models. In the (...)
  • (1 other version) Truthlikeness: old and new debates. Ilkka Niiniluoto - 1990 - Synthese 84 (1): 139-152.
  • Approximation, idealization, and laws of nature. Chuang Liu - 1999 - Synthese 118 (2): 229-256.
    Traditional theories construe approximate truth or truthlikeness as a measure of closeness to facts, singular facts, and idealization as an act of either assuming zero or otherwise very small differences from facts or imagining ideal conditions under which scientific laws are either approximately true or will be so when the conditions are relaxed. I first explain the serious but not insurmountable difficulties for the theories of approximation, and then argue that more serious and perhaps insurmountable difficulties for the theory of (...)
  • Diagnostics and the 'deconstruction' of models. Grant Fisher - unknown
    This paper argues that a significant focus in computational organic chemistry, alongside the construction and deployment of models, is the “deconstruction” of computational models. This practice has arisen in response to difficulties and controversies resulting from the use of plural methods and computational models to study organic reaction mechanisms. Diagnostic controllability is the capacity of cognitive agents to gain epistemic access to grey-boxed computational models, to identify and explain the impact of specific idealizations on results, and to demonstrate the applicability (...)
  • Idealization within a Structuralist Perspective. Xavier de Donato Rodríguez - 2011 - Metatheoria – Revista de Filosofía e Historia de la Ciencia 1: 65-90.
  • Explaining with Models: The Role of Idealizations. Julie Jebeile & Ashley Graham Kennedy - 2015 - International Studies in the Philosophy of Science 29 (4): 383-392.
    Because they contain idealizations, scientific models are often considered to be misrepresentations of their target systems. An important question is therefore how models can explain the behaviours of these systems. Most of the answers to this question are representationalist in nature. Proponents of this view are generally committed to the claim that models are explanatory if they represent their target systems to some degree of accuracy; in other words, they try to determine the conditions under which idealizations can be made (...)
  • Models of Science: Fictions or Idealizations? Yemima Ben-Menahem - 1988 - Science in Context 2 (1): 163-175.
    Idealizations and approximations are an indispensable tool for the scientist. This paper argues that idealizations and approximations are equally indispensable for the philosopher of science. In particular, it is shown that the deductive model of scientific theories is an idealization in precisely the same sense that frictionless motion is an idealization in mechanics. By its very nature, an idealization cannot be criticized as not being absolutely true to the facts, for it need not be. Thus, the usual type of (...)
  • Interpolative and extrapolative reasoning in propositional theories using qualitative knowledge about conceptual spaces. Steven Schockaert & Henri Prade - 2013 - Artificial Intelligence 202 (C): 86-131.
  • Finding truth in fictions: identifying non-fictions in imaginary cracks. Gordon Michael Purves - 2013 - Synthese 190 (2): 235-251.
    I critically examine some recent work on the philosophy of scientific fictions, focusing on the work of Winsberg. By considering two case studies in fracture mechanics, the strip yield model and the imaginary crack method, I argue that his reliance upon the social norms associated with an element of a model forces him to remain silent whenever those norms fail to clearly match the characteristic of fictions or non-fictions. In its place, I propose a normative epistemology of fictions which clarifies (...)
  • Applying idealized scientific theories to engineering. Ronald Laymon - 1989 - Synthese 81 (3): 353-371.
    The problem for the scientist created by using idealizations is to determine whether failures to achieve experimental fit are attributable to experimental error, falsity of theory, or of idealization. Even in the rare case when experimental fit within experimental error is achieved, the scientist must determine whether this is so because of a true theory and fortuitously canceling idealizations, or due to a fortuitous combination of false theory and false idealizations. For the engineer, the problem seems rather different. Experiment for (...)