Despite widespread evidence that fictional models play an explanatory role in science, resistance remains to the idea that fictions can explain. A central source of this resistance is a particular view about what explanations are, namely, the ontic conception of explanation. According to the ontic conception, explanations just are the concrete entities in the world. I argue this conception is ultimately incoherent and that even a weaker version of the ontic conception fails. Fictional models can succeed in offering genuine explanations by correctly capturing relevant patterns of counterfactual dependence and licensing correct inferences. Using the example of Newtonian force explanations of the tides, I show how, even in science, fiction can be a vehicle for truth.
In 2012, the Geological Time Scale, which sets the temporal framework for studying the timing and tempo of all major geological, biological, and climatic events in Earth’s history, had one-quarter of its boundaries moved in a widespread revision of radiometric dates. The philosophy of metrology helps us understand this episode, and it, in turn, elucidates the notions of calibration, coherence, and consilience. I argue that coherence testing is a distinct activity preceding calibration and consilience, and I highlight the value of discordant evidence and the trade-offs scientists face in calibration. The iterative nature of calibration, moreover, raises the problem of legacy data.
The ontic conception of explanation, according to which explanations are "full-bodied things in the world," is fundamentally misguided. I argue instead for what I call the eikonic conception, according to which explanations are the product of an epistemic activity involving representations of the phenomena to be explained. What is explained in the first instance is a particular conceptualization of the explanandum phenomenon, contextualized within a given research program or explanatory project. I conclude that this eikonic conception has a number of benefits, including making better sense of scientific practice and allowing for the full range of normative constraints on explanation.
Despite an enormous philosophical literature on models in science, surprisingly little has been written about data models and how they are constructed. In this paper, I examine the case of how paleodiversity data models are constructed from the fossil data. In particular, I show how paleontologists are using various model-based techniques to correct the data. Drawing on this research, I argue for the following related theses: First, the 'purity' of a data model is not a measure of its epistemic reliability. Instead, it is the fidelity of the data that matters. Second, the fidelity of a data model in capturing the signal of interest is a matter of degree. Third, the fidelity of a data model can be improved 'vicariously', such as through the use of post hoc model-based correction techniques. And, fourth, data models, like theoretical models, should be assessed as adequate (or inadequate) for particular purposes.
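For readers curious what such post hoc model-based correction techniques can look like in practice, the following is a minimal, hypothetical sketch of one widely used family in the paleodiversity literature: sampling standardization via classical rarefaction, which estimates how many taxa would be observed at a common sampling intensity. The taxa and counts are invented for illustration, and this is not presented as the paper's own method.

```python
# Toy sketch (not the paper's method) of sampling standardization via
# classical rarefaction: estimate taxonomic richness at a shared sample size.
import random

def rarefied_richness(occurrences, n, trials=1000, seed=0):
    """Mean number of distinct taxa in random subsamples of size n."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        subsample = rng.sample(occurrences, n)
        total += len(set(subsample))
    return total / trials

# Hypothetical occurrence lists for two time intervals with unequal sampling
interval_a = ["t1"] * 30 + ["t2"] * 10 + ["t3"] * 5 + ["t4"] * 5  # 50 finds
interval_b = ["t1"] * 8 + ["t2"] * 6 + ["t5"] * 4 + ["t6"] * 2    # 20 finds

n = 20  # standardize both intervals to 20 occurrences before comparing
print("Interval A rarefied richness:", rarefied_richness(interval_a, n))
print("Interval B rarefied richness:", rarefied_richness(interval_b, n))
```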
Detailed examinations of scientific practice have revealed that the use of idealized models in the sciences is pervasive. These models play a central role not only in the investigation and prediction of phenomena, but also in their received scientific explanations. This has led philosophers of science to begin revising the traditional philosophical accounts of scientific explanation in order to make sense of this practice. These new model-based accounts of scientific explanation, however, raise a number of key questions: Can the fictions and falsehoods inherent in the modeling practice do real explanatory work? Do some highly abstract and mathematical models exhibit a noncausal form of scientific explanation? How can one distinguish an exploratory "how-possibly" model explanation from a genuine "how-actually" model explanation? Do modelers face tradeoffs such that a model that is optimized for yielding explanatory insight, for example, might fail to be the most predictively accurate, and vice versa? This chapter explores the various answers that have been given to these questions.
The geosciences include a wide spectrum of disciplines ranging from paleontology to climate science, and involve studies of a vast range of spatial and temporal scales, from the deep-time history of microbial life to the future of a system no less immense and complex than the entire Earth. Modeling is thus a central and indispensable tool across the geosciences. Here, we review both the history and current state of model-based inquiry in the geosciences. Research in these fields makes use of a wide variety of models, such as conceptual, physical, and numerical models, and more specifically cellular automata, artificial neural networks, agent-based models, coupled models, and hierarchical models. We note the increasing demands to incorporate biological and social systems into geoscience modeling, challenging the traditional boundaries of these fields. Understanding and articulating the many different sources of scientific uncertainty – and finding tools and methods to address them – has been at the forefront of most research in geoscience modeling. We discuss not only structural model uncertainties, parameter uncertainties, and solution uncertainties, but also the diverse sources of uncertainty arising from the complex nature of geoscience systems themselves. Without an examination of the geosciences, our philosophies of science and our understanding of the nature of model-based science are incomplete.
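As a toy illustration of one of the model types listed above, and not drawn from the review itself, the following sketch implements a one-dimensional sandpile cellular automaton in the spirit of Bak, Tang, and Wiesenfeld, with an invented grid size and driving rule. It shows the general shape of rule-based numerical models in geomorphology: local update rules, open boundaries, and emergent large-scale behavior.

```python
# Hypothetical 1-D sandpile cellular automaton (illustrative only).
# Open boundaries: grains toppled off either end leave the system.
import random

def relax(heights, threshold=2):
    """Topple any column holding more than `threshold` grains, sending one
    grain to each neighbor (or off the edge), until the grid is stable."""
    unstable = True
    while unstable:
        unstable = False
        for i, h in enumerate(heights):
            if h > threshold:
                heights[i] -= 2
                if i > 0:
                    heights[i - 1] += 1
                if i + 1 < len(heights):
                    heights[i + 1] += 1
                unstable = True
    return heights

grid = [0] * 20
for _ in range(200):                      # drive: drop grains one at a time
    grid[random.randrange(len(grid))] += 1
    relax(grid)                           # response: avalanches of toppling
print(grid)
```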
In the spirit of explanatory pluralism, this chapter argues that causal and noncausal explanations of a phenomenon are compatible, each being useful for bringing out different sorts of insights. After reviewing a model-based account of scientific explanation, which can accommodate causal and noncausal explanations alike, an important core conception of noncausal explanation is identified. This noncausal form of model-based explanation is illustrated using the example of how Earth scientists in a subfield known as aeolian geomorphology are explaining the formation of regularly spaced sand ripples. The chapter concludes that even when it comes to everyday "medium-sized dry goods" such as sand ripples, where there is a complete causal story to be told, one can find examples of noncausal scientific explanations.
Model-data symbiosis is the view that there is an interdependent and mutually beneficial relationship between data and models, whereby models are not only data-laden, but data are also model-laden, or model-filtered. In this paper I elaborate and defend the second, more controversial, component of the symbiosis view. In particular, I construct a preliminary taxonomy of the different ways in which theoretical and simulation models are used in the production of data sets. These include data conversion, data correction, data interpolation, data scaling, data fusion, data assimilation, and synthetic data. Each is defined and briefly illustrated with an example from the geosciences. I argue that model-filtered data are typically more accurate and reliable than the so-called raw data, and hence beneficially serve the epistemic aims of science. By illuminating the methods by which raw data are turned into scientifically useful data sets, this taxonomy provides a foundation for developing a more adequate philosophy of data.
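To make one entry of the taxonomy concrete, here is a minimal, hypothetical sketch of data interpolation, in which a modeling assumption (piecewise-linear variation with depth) turns sparse raw measurements into a gridded, model-filtered data set. The depths and temperatures are invented; the paper's own illustrations come from the geosciences.

```python
# Hypothetical sketch of data interpolation as a model-filtering step:
# the interpolation model "fills in" values that were never directly observed.
import numpy as np

# Sparse raw measurements: depth (m) vs. temperature (deg C)
depth_raw = np.array([0.0, 50.0, 200.0, 500.0])
temp_raw = np.array([18.2, 15.1, 9.4, 5.0])

# Regular grid required by downstream analyses
depth_grid = np.arange(0.0, 501.0, 25.0)

# Piecewise-linear interpolation onto the grid
temp_grid = np.interp(depth_grid, depth_raw, temp_raw)

for d, t in zip(depth_grid, temp_grid):
    print(f"{d:6.1f} m  {t:5.2f} C")
```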
The fact that the same equations or mathematical models reappear in the descriptions of what are otherwise disparate physical systems can be seen as yet another manifestation of Wigner's “unreasonable effectiveness of mathematics.” James Clerk Maxwell famously exploited such formal similarities in what he called the “method of physical analogy.” Both Maxwell and Hermann von Helmholtz appealed to the physical analogies between electromagnetism and hydrodynamics in their development of these theories. I argue that a closer historical examination of the different ways in which Maxwell and Helmholtz each deployed this analogy gives further insight into debates about the representational and explanatory power of mathematical models.
We critically engage two traditional views of scientific data and outline a novel philosophical view that we call the pragmatic-representational (PR) view of data. On the PR view, data are representations that are the product of a process of inquiry, and they should be evaluated in terms of their adequacy or fitness for particular purposes. Some important implications of the PR view for data assessment, related to misrepresentation, context-sensitivity, and complementary use, are highlighted. The PR view provides insight into the common but little-discussed practices of iteratively reusing and repurposing data, which result in many datasets’ having a phylogeny—an origin and complex evolutionary history—that is relevant to their evaluation and future use. We relate these insights to the open-data and data-rescue movements, and highlight several future avenues of research that build on the PR view of data.
Traditionally, Ψ is used to stand in for both the mathematical wavefunction (the representation) and the quantum state (the thing in the world). This elision has been elevated to a metaphysical thesis by advocates of the view known as wavefunction realism. My aim in this paper is to challenge the hegemony of the wavefunction by calling attention to a little-known formulation of quantum theory that does not make use of the wavefunction in representing the quantum state. This approach, called Lagrangian quantum hydrodynamics (LQH), is not an approximation scheme, but rather a full alternative formulation of quantum theory. I argue that a careful consideration of alternative formalisms is an essential part of any realist project that attempts to read the ontology of a theory off of the mathematical formalism. In particular, I show that LQH undercuts the central presumption of wavefunction realism and falsifies the claim that one must represent the many-body quantum state as living in a 3n-dimensional configuration space. I conclude by briefly sketching three different realist approaches one could take toward LQH, and argue that both models of the quantum state should be admitted. When exploring quantum realism, regaining sight of the proverbial forest of quantum representations beyond the Ψ is just the first step.
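As background for readers unfamiliar with hydrodynamic formulations, and not as a reconstruction of the paper's own LQH formalism, the standard Madelung decomposition shows how a wavefunction can be traded for fluid-like fields: a density and a velocity field governed by continuity and quantum Hamilton-Jacobi equations.

```latex
% Standard Madelung decomposition (textbook background, not the paper's
% derivation). Writing the wavefunction in polar form,
\Psi(\mathbf{x},t) = \sqrt{\rho(\mathbf{x},t)}\, e^{\,i S(\mathbf{x},t)/\hbar},
% the Schrodinger equation splits into a continuity equation
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0,
\qquad \mathbf{v} = \frac{\nabla S}{m},
% and a quantum Hamilton-Jacobi equation with quantum potential Q:
\frac{\partial S}{\partial t} + \frac{|\nabla S|^{2}}{2m} + V + Q = 0,
\qquad Q = -\frac{\hbar^{2}}{2m}\, \frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}}.
```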
At the intersection of taxonomy and nomenclature lies the scientific practice of typification. This practice occurs in biology with the use of holotypes (type specimens), in geology with the use of stratotypes, and in metrology with the use of measurement prototypes. In this paper I develop the first general definition of a scientific type and outline a new philosophical theory of types inspired by Pierre Duhem. I use this general framework to resolve the necessity-contingency debate about type specimens in philosophy of biology, to advance the debate over the myth of the absolute accuracy of standards in metrology, and to address the definition-correlation debate in geology. I conclude that just as there has been a productive synergy between philosophical accounts of natural kinds and scientific taxonomic practices, so too there is much to be gained from developing a deeper understanding of the practices and philosophy of scientific types.