After pinpointing a conceptual confusion (TCC), a Reality preconception (TRP1), and a fallacious dichotomy (TFD), the famous EPR/EPRB argument for correlated ‘particles’ is studied in the light of the Ontic Probability Interpretation (TOPI) of Quantum Theory (QT). Another Reality preconception (TRP2) is identified, showing that EPR used and ignored QT predictions in a single paralogism. Employing TFD and TRP2, EPR unveiled a contradiction veiled in its premises. By removing nonlocality from QT’s Ontology by fiat, EPR preordained its incompleteness. The Petitio Principii fallacy was at work from the outset. Einstein surmised the solution to his incompleteness/nonlocality dilemma in 1949, but never abandoned his philosophical stance. It is concluded that there are no definitions of Reality: we have to accept that Reality may not conform to our prejudices and, if an otherwise successful theory predicts what we do not believe in, no gedankenexperiment will help because our biases may slither through. Only actual experiments could assist in solving Einstein’s dilemma, as proven in the last 50 years. Notwithstanding, EPR is one of the most influential papers in history and has sparked immense conceptual and technological progress. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical advances.
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
This paper was written to study the order of medical advances throughout history. It investigates changing human beliefs concerning the causes of diseases, how modern surgery developed, improved methods of diagnosis, and the use of medical statistics. Human beliefs about the causes of disease followed a logical progression from supernatural causes, such as the wrath of the Gods, to natural causes, involving imbalances within the human body. The invention of the microscope led to the discovery of microorganisms, which were eventually identified as the cause of infectious diseases. Identification of the particular microorganism causing a disease led to immunization against the disease. Modern surgery developed only after the ending of the taboo against human dissection, the discovery of modern anesthesia, and the recognition of the need for antiseptic practices. Modern diagnostic practices began with the discovery of x-rays and the invention of medical scanners. Improved mathematics, especially in probability theory, led to statistical studies, which in turn gave a much greater ability to identify the causes of disease and to evaluate the effectiveness of treatments. These discoveries all occurred in a necessary and inevitable order, with the easiest discoveries being made first and the harder discoveries being made later. The order of discovery determined the course of the history of medicine and is an example of how social and cultural history has to follow a particular course determined by the structure of the world around us.
A polemical account of Australian philosophy up to 2003, emphasising its unique aspects (such as commitment to realism) and the connections between philosophers' views and their lives. Topics include early idealism, the dominance of John Anderson in Sydney, the Orr case, Catholic scholasticism, Melbourne Wittgensteinianism, philosophy of science, the Sydney disturbances of the 1970s, Francofeminism, environmental philosophy, the philosophy of law and Mabo, ethics and Peter Singer. Realist theories especially praised are David Armstrong's on universals, David Stove's on logical probability and the ethical realism of Rai Gaita and Catholic philosophers. In addition to strict philosophy, the book treats non-religious moral traditions to train virtue, such as Freemasonry, civics education and the Greek and Roman classics.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
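For readers who want the calculation behind Popper's observation, here is a standard derivation of the excess (not quoted from Popper; it assumes P(A) > 0 and reads ‘If A, then B’ as the material conditional):

\[ P(A \supset B) \;=\; 1 - P(A \wedge \neg B) \;=\; 1 - P(A)\bigl(1 - P(B \mid A)\bigr), \]
\[ P(A \supset B) - P(B \mid A) \;=\; \bigl(1 - P(A)\bigr)\bigl(1 - P(B \mid A)\bigr) \;\ge\; 0, \]

with equality only when P(A) = 1 or P(B | A) = 1; in every other case the material conditional is strictly more probable than the conditional probability it is sometimes confused with.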
John Maynard Keynes’s A Treatise on Probability is the seminal text for the logical interpretation of probability. According to his analysis, probabilities are evidential relations between a hypothesis and some evidence, just like the relations of deductive logic. While some philosophers had suggested similar ideas prior to Keynes, it was not until his Treatise that the logical interpretation of probability was advocated in a clear, systematic and rigorous way. I trace Keynes’s influence in the philosophy of probability through a heterogeneous sample of thinkers who adopted his interpretation. This sample consists of Frederick C. Benenson, Roy Harrod, Donald C. Williams, Henry E. Kyburg and David Stove. The ideas of Keynes prove to be adaptable to their diverse theories of probability. My discussion indicates both the robustness of Keynes’s probability theory and the importance of its influence on the philosophers whom I describe. I also discuss the Problem of the Priors. I argue that none of those I discuss have obviously improved on Keynes’s theory with respect to this issue.
"Capital is moved as much and as little by the degradation and final depopulation of the human race, as by the probable fall of the earth into the sun. Apres moi le deluge! is the watchword of every capitalist and of every capitalist nation" (Marx, CAPITAL Vol 1, 380-381).
Evolutionary theory (ET) is teeming with probabilities. Probabilities exist at all levels: the level of mutation, the level of microevolution, and the level of macroevolution. This uncontroversial claim raises a number of contentious issues. For example, is the evolutionary process (as opposed to the theory) indeterministic, or is it deterministic? Philosophers of biology have taken different sides on this issue. Millstein (1997) has argued that we are not currently able to answer this question, and that even scientific realists ought to remain agnostic concerning the determinism or indeterminism of evolutionary processes. If this argument is correct, it suggests that, whatever we take probabilities in ET to be, they must be consistent with either determinism or indeterminism. This raises some interesting philosophical questions: How should we understand the probabilities used in ET? In other words, what is meant by saying that a certain evolutionary change is more or less probable? Which interpretation of probability is the most appropriate for ET? I argue that the probabilities used in ET are objective in a realist sense, if not in an indeterministic sense. Furthermore, there are a number of interpretations of probability that are objective and would be consistent with ET under determinism or indeterminism. However, I argue that evolutionary probabilities are best understood as propensities of population-level kinds.
According to the antirealist argument known as the pessimistic induction, the history of science is a graveyard of dead scientific theories and abandoned theoretical posits. Support for this pessimistic picture of the history of science usually comes from a few case histories, such as the demise of the phlogiston theory and the abandonment of caloric as the substance of heat. In this article, I wish to take a new approach to examining the ‘history of science as a graveyard of theories’ picture. Using JSTOR Data for Research and Springer Exemplar, I present new lines of evidence that are at odds with this pessimistic picture of the history of science. When rigorously tested against the historical record of science, I submit, the pessimistic picture of the history of science as a graveyard of dead theories and abandoned posits may turn out to be no more than a philosophers’ myth.
The book explores the nature, underlying causes, and the information processing mechanism of serendipity. It proposes that natural or social survival demands drive serendipity, and serendipity is conditional on the environment and the mindset, on both individual and collective levels. From Darwin’s evolution theory to Sun Tzu’s war tactics, major innovations throughout human history are unified by this key concept. In the rapidly changing world, information is abundant but rather chaotic. The adaptive power of serendipity allows people to notice treasures within this wild sea, but only those who understand how it works can do so. To increase the probability of encountering and attaining serendipity, one should employ the mindsponge mechanism and the 3D process of creativity, for without these frameworks, serendipity is truly an elusive target. The book also discusses methods to build environments and cultures rich in navigational and useful information to maximize the chance of finding and capitalizing on serendipity. As a skill, serendipity has a resemblance to how kingfishers observe and hunt their prey.
In finite probability theory, events are subsets S⊆U of the outcome set. Subsets can be represented by 1-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or `measurements' of all these events can be determined in a manner exactly like in quantum mechanics (QM) using density matrices. Moreover the transformation of the density matrices induced by the experiments or `measurements' is the Lüders mixture operation as in QM. And finally by moving the machinery into the n-dimensional vector space over ℤ₂, different basis sets become different outcome sets. That `non-commutative' extension of finite probability theory yields the pedagogical model of quantum mechanics over ℤ₂ that can model many characteristic non-classical results of QM.
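A minimal numerical sketch of the density-matrix bookkeeping described above; the outcome set, the particular event, and the use of numpy are illustrative assumptions, not details taken from the paper:

import numpy as np

# Outcome set U = {0, 1, 2, 3} with equiprobable outcomes (illustrative choice).
n = 4
s = np.zeros(n)
s[[0, 1]] = 1.0
s /= np.linalg.norm(s)            # classical event S = {0, 1} as a normalized column vector
rho = np.outer(s, s)              # density matrix representing the event S

# "Measuring" in the standard basis: the probability of outcome i is the diagonal
# entry rho[i, i], which reproduces the classical conditional probability 1/|S|.
print(np.diag(rho))               # [0.5 0.5 0.  0. ]

# Lüders mixture: projecting onto the standard basis keeps the diagonal and deletes
# the off-diagonal coherences, the post-experiment transformation the abstract mentions.
projectors = [np.outer(np.eye(n)[i], np.eye(n)[i]) for i in range(n)]
rho_after = sum(P @ rho @ P for P in projectors)
print(np.allclose(rho_after, np.diag(np.diag(rho))))   # True

A superposition event is then simply a unit vector whose nonzero amplitudes are not all equal, and the same diagonal rule assigns it outcome probabilities.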
Wittgenstein did not write very much on the topic of probability. The little we have comes from a few short pages of the Tractatus, some 'remarks' from the 1930s, and the informal conversations which went on during that decade with the Vienna Circle. Nevertheless, Wittgenstein's views were highly influential in the later development of the logical theory of probability. This paper will attempt to clarify and defend Wittgenstein's conception of probability against some oft-cited criticisms that stem from a misunderstanding of his views. Max Black, for instance, criticises Wittgenstein for formulating a theory of probability that is capable of being used only against the backdrop of the ideal language of the Tractatus. I argue that on the contrary, by appealing to the 'hypothetical laws of nature', Wittgenstein is able to make sense of probability statements involving propositions that have not been completely analysed. G.H. von Wright criticises Wittgenstein's characterisation of these very hypothetical laws. He argues that by introducing them Wittgenstein makes what is distinctive about his theory superfluous, for the hypothetical laws are directly inspired by statistical observations and hence these observations indirectly determine the mechanism by which the logical theory of probability operates. I argue that this is not the case at all, and that while statistical observations play a part in the formation of the hypothetical laws, these observations are only necessary, but not sufficient conditions for the introduction of these hypotheses.
Epistemic closure under known implication is the principle that knowledge of "p" and knowledge of "p implies q", together, imply knowledge of "q". This principle is intuitive, yet several putative counterexamples have been formulated against it. This paper addresses the question, why is epistemic closure both intuitive and prone to counterexamples? In particular, the paper examines whether probability theory can offer an answer to this question based on four strategies. The first probability-based strategy rests on the accumulation of risks. The problem with this strategy is that risk accumulation cannot accommodate certain counterexamples to epistemic closure. The second strategy is based on the idea of evidential support, that is, a piece of evidence supports a proposition whenever it increases the probability of the proposition. This strategy makes progress and can accommodate certain putative counterexamples to closure. However, this strategy also gives rise to a number of counterintuitive results. Finally, there are two broadly probabilistic strategies, one based on the idea of resilient probability and the other on the idea of assumptions that are taken for granted. These strategies are promising but are prone to some of the shortcomings of the second strategy. All in all, I conclude that each strategy fails. Probability theory, then, is unlikely to offer the account we need.
Decision theory has at its core a set of mathematical theorems that connect rational preferences to functions with certain structural properties. The components of these theorems, as well as their bearing on questions surrounding rationality, can be interpreted in a variety of ways. Philosophy’s current interest in decision theory represents a convergence of two very different lines of thought, one concerned with the question of how one ought to act, and the other concerned with the question of what action consists in and what it reveals about the actor’s mental states. As a result, the theory has come to have two different uses in philosophy, which we might call the normative use and the interpretive use. It also has a related use that is largely within the domain of psychology, the descriptive use. This essay examines the historical development of decision theory and its uses; the relationship between the norm of decision theory and the notion of rationality; and the interdependence of the uses of decision theory.
An essay review of Richard Johns' "A Theory of Physical Probability" (University of Toronto Press, 2002). Forthcoming in Studies in History and Philosophy of Science.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as into epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
The value of knowledge can vary in that knowledge of important facts is more valuable than knowledge of trivialities. This variation in the value of knowledge is mirrored by a variation in evidential standards. Matters of greater importance require greater evidential support. But all knowledge, however trivial, needs to be evidentially certain. So on one hand we have a variable evidential standard that depends on the value of the knowledge, and on the other, we have the invariant standard of evidential certainty. This paradox in the concept of knowledge runs deep in the history of philosophy. We approach this paradox by proposing a bet settlement theory of knowledge. Degrees of belief can be measured by the expected value of a bet divided by stake size, with the highest degree of belief being probability 1, or certainty. Evidence sufficient to settle the bet makes the expectation equal to the stake size and therefore has evidential probability 1. This gives us the invariant evidential certainty standard for knowledge. The value of knowledge relative to a bet is given by the stake size. We propose that evidential probability can vary with stake size, so that evidential certainty at low stakes does not entail evidential certainty at high stakes. This solves the paradox by allowing that certainty is necessary for knowledge at any stakes, but that the evidential standards for knowledge vary according to what is at stake. We give a Stake Size Variation Principle that calculates evidential probability from the value of evidence and the stakes. Stake size variant degrees of belief are probabilistically coherent and explain a greater range of preferences than orthodox expected utility theory, namely the Ellsberg and Allais preferences. The resulting theory of knowledge gives an empirically adequate, rationally grounded, unified account of evidence, value and probability.
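A toy instance of the bet-settlement measure just described (the dollar figures are illustrative and not taken from the paper): a bet that pays the stake when p is true and nothing otherwise, with stake $100 and expected value $80, yields

\[ \text{degree of belief in } p \;=\; \frac{\text{expected value of the bet}}{\text{stake size}} \;=\; \frac{80}{100} \;=\; 0.8 . \]

Evidence that settles the bet raises the expected value to the full $100, giving evidential probability 100/100 = 1, the certainty standard; on the stake-size-variation proposal, the same evidence need not do this when the stake is $100,000, which is how certainty at low stakes can fail to carry over to high stakes.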
This proposal serves to enhance scientific and technological literacy, by promoting STEM (Science, Technology, Engineering, and Mathematics) education with particular reference to contemporary physics. The study is presented in the form of a repertoire, and it gives the reader a glimpse of the conceptual structure and development of quantum theory along a rational line of thought, whose understanding might be the key to introducing young generations of students to physics.
Neo-Darwinism, through the combination of natural selection and genetics, has made possible an explanation of adaptive phenomena that claims to be devoid of metaphysical presuppositions. What Bergson already deplored and what we explore in this paper is the implicit finalism of such evolutionary explanations, which turn living beings into closed and static systems rather than understanding biological evolution as a process characterized by its interactions and temporal openness. Without denying the heuristic efficiency of the explanation resting upon natural selection, we analyze what it leaves out and what remains to be explored: the unpredictability of the evolutionary process. We will therefore study the role of contingency in evolution, as Stephen J. Gould proposed, but we will also consider the causality specific to the living world, which cannot be reduced to a simple algorithm, as proposed by Daniel Dennett among others, but is really a creative causation, or dialectical spiral.
Preliminary version of “Towards a History of Speech Act Theory”, in A. Burkhardt (ed.), Speech Acts, Meanings and Intentions. Critical Approaches to the Philosophy of John R. Searle, Berlin/New York: de Gruyter, 1990, 29–61.
That uses of language not only can, but even normally do, have the character of actions was a fact largely unrealised by those engaged in the study of language before the present century, at least in the sense that there was lacking any attempt to come to terms systematically with the action-theoretic peculiarities of language use. Where the action-character of linguistic phenomena was acknowledged, it was normally regarded as a peripheral matter, relating to derivative or nonstandard aspects of language which could afford to be ignored.
Ignited by Einstein and Bohr a century ago, the philosophical struggle about Reality is yet unfinished, with no signs of a swift resolution. Despite vast technological progress fueled by the iconic EPR paper (EPR), the intricate link between ontic and epistemic aspects of Quantum Theory (QT) has greatly hindered our grip on Reality and further progress in physical theory. Fallacies concealed by tortuous logical negations made EPR comprehension much harder than it could have been had Einstein written it himself in German. It is plagued with preconceptions about what a physical property is, the 'Uncertainty Principle', and the Principle of Locality. Numerous interpretations of QT vis-à-vis Reality exist and are keenly disputed. This is the first of a series of articles arguing for a physical interpretation called ‘The Ontic Probability Interpretation’ (TOPI). A gradual explanation of TOPI is given intertwined with a meticulous logico-philosophical scrutiny of EPR. Part I focuses on the meaning of Einstein’s ‘Incompleteness’ claim. A conceptual confusion, a preconception about Reality, and a flawed dichotomy are shown to be severe obstacles for the EPR argument to succeed. Part II analyzes Einstein’s ‘Incompleteness/Nonlocality Dilemma’. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical progress.
Most of us are either philosophically naïve scientists or scientifically naïve philosophers, so we misjudged Schrödinger’s “very burlesque” portrait of Quantum Theory (QT) as a profound conundrum. The clear signs of a strawman argument were ignored. The Ontic Probability Interpretation (TOPI) is a metatheory: a theory about the meaning of QT. Ironically, equating Reality with Actuality cannot explain actual data, justifying the century-long philosophical struggle. The actual is real but not everything real is actual. The ontic character of the Probable has been elusive for so long because it cannot be grasped directly from experiment; it can only be inferred from physical setups that do not morph it into the Actual. Born’s Rule and the quantum formalism for the microworld are intuitively surmised from instances in our macroworld. The posited reality of the quanton’s probable states and properties is probed and proved. After almost a century, TOPI aims at setting the record straight: the so-called ‘Basis’ and ‘Measurement’ problems are ill-advised. About the first, all bases are legitimate regardless of state and milieu. As for the second, its premise is false: there is no need for a physical ‘collapse’ process that would convert many states into a single state. Under TOPI, a more sensible variant of the ‘measurement problem’ can be reformulated in non-anthropic terms as a real problem. Yet, as such, it is not part of QT per se and will be tackled in future papers. As for the mythical cat, the ontic state of a radioactive nucleus is not pure, so its evolution is not governed by Schrödinger’s equation, let alone the rest of his “hellish machine”. Einstein was right: “The Lord is subtle but not malicious”. However, ‘The Lord’ turned out to be much subtler than what Einstein and Schrödinger could have ever accepted. Future articles will reveal how other ‘paradoxes of QT’ are fully explained under TOPI, showing its soundness and potential for nurturing further theoretical/technological advance.
In this text the ancient philosophical question of determinism (“Does every event have a cause?”) will be re-examined. In the philosophy of science and physics communities the orthodox position states that the physical world is indeterministic: quantum events would have no causes but happen by irreducible chance. Arguably the clearest theorem that leads to this conclusion is Bell’s theorem. The commonly accepted ‘solution’ to the theorem is ‘indeterminism’, in agreement with the Copenhagen interpretation. Here it is recalled that indeterminism is not really a physical but rather a philosophical hypothesis, and that it has counterintuitive and far-reaching implications. At the same time another solution to Bell’s theorem exists, often termed ‘superdeterminism’ or ‘total determinism’. Superdeterminism appears to be a philosophical position that is centuries and probably millennia old: it is for instance Spinoza’s determinism. If Bell’s theorem has both indeterministic and deterministic solutions, choosing between determinism and indeterminism is a philosophical question, not a matter of physical experimentation, as is widely believed. If it is impossible to use physics for deciding between both positions, it is legitimate to ask which philosophical theories are of help. Here it is argued that probability theory, more precisely the interpretation of probability, is instrumental for advancing the debate. It appears that the hypothesis of determinism allows us to answer a series of precise questions from probability theory, while indeterminism remains silent on these questions. From this point of view determinism appears to be the more reasonable assumption, after all.
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or "toy" model of quantum mechanics over sets (QM/sets). There are two parts. The notion of an "event" is reinterpreted from being an epistemological state of indefiniteness to being an objective state of indefiniteness. And the mathematical framework of finite probability theory is recast as the quantum probability calculus for QM/sets. The point is not to clarify finite probability theory but to elucidate quantum mechanics itself by seeing some of its quantum features in a classical setting.
I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
Karl Popper (1902-1994) was one of the most influential philosophers of science of the 20th century. He made significant contributions to debates concerning general scientific methodology and theory choice, the demarcation of science from non-science, the nature of probability and quantum mechanics, and the methodology of the social sciences. His work is notable for its wide influence within the philosophy of science, within science itself, and within a broader social context. Popper’s early work attempts to solve the problem of demarcation and offer a clear criterion that distinguishes scientific theories from metaphysical or mythological claims. Popper’s falsificationist methodology holds that scientific theories are characterized by entailing predictions that future observations might reveal to be false. When theories are falsified by such observations, scientists can respond by revising the theory, by rejecting the theory in favor of a rival, or by maintaining the theory as is and changing an auxiliary hypothesis. In any of these cases, however, the process must aim at the production of new, falsifiable predictions. While Popper recognizes that scientists can and do hold onto theories in the face of failed predictions when there are no predictively superior rivals to turn to, he holds that scientific practice is characterized by its continual effort to test theories against experience and make revisions based on the outcomes of these tests. By contrast, theories that are permanently immunized from falsification by the introduction of untestable ad hoc hypotheses can no longer be classified as scientific. Among other things, Popper argues that his falsificationist proposal allows for a solution of the problem of induction, since inductive reasoning plays no role in his account of theory choice. Along with his general proposals regarding falsification and scientific methodology, Popper is notable for his work on probability and quantum mechanics and on the methodology of the social sciences. Popper defends a propensity theory of probability, according to which probabilities are interpreted as objective, mind-independent properties of experimental setups. Popper then uses this theory to provide a realist interpretation of quantum mechanics, though its applicability goes beyond this specific case. With respect to the social sciences, Popper argued against the historicist attempt to formulate universal laws covering the whole of human history and instead argued in favor of methodological individualism and situational logic.
We discuss the relationship between logic, geometry and probability theory in the light of a novel approach to quantum probabilities which generalizes the method developed by R. T. Cox to the quantum logical approach to physical theories.
In the following we will investigate whether von Mises’ frequency interpretation of probability can be modified to make it philosophically acceptable. We will reject certain elements of von Mises’ theory, but retain others. In the interpretation we propose we do not use von Mises’ often criticized ‘infinite collectives’ but we retain two essential claims of his interpretation, stating that probability can only be defined for events that can be repeated in similar conditions, and that exhibit frequency stabilization. The central idea of the present article is that the mentioned ‘conditions’ should be well-defined and ‘partitioned’. More precisely, we will divide probabilistic systems into object, initializing, and probing subsystems, and show that such partitioning allows us to solve problems. Moreover we will argue that a key idea of the Copenhagen interpretation of quantum mechanics (the determinant role of the observing system) can be seen as deriving from an analytic definition of probability as frequency. Thus a secondary aim of the article is to illustrate the virtues of analytic definition of concepts, consisting of making explicit what is implicit.
We call something a paradox if it strikes us as peculiar in a certain way, if it strikes us as something that is not simply nonsense, and yet it poses some difficulty in seeing how it could make sense. When we examine paradoxes more closely, we find that for some the peculiarity is relieved and for others it intensifies. Some are peculiar because they jar with how we expect things to go, but the jarring is to do with imprecision and misunderstandings in our thought, failures to appreciate the breadth of possibility consistent with our beliefs. Other paradoxes, however, pose deep problems. Closer examination does not explain them away. Instead, they challenge the coherence of certain conceptual resources and hence challenge the significance of beliefs which deploy those resources. I shall call the former kind weak paradoxes and the latter, strong paradoxes. Whether a particular paradox is weak or strong is sometimes a matter of controversy; sometimes it has been realised that what was thought strong is in fact weak, and vice versa. But the distinction between the two kinds is generally thought to be worth drawing. In this chapter, I shall cover both weak and strong probabilistic paradoxes.
This book presents not only the mathematical concept of probability, but also its philosophical aspects, the relativity of probability and its applications, and even the psychology of probability. All explanations are made in a comprehensible manner and are supported with suggestive examples from nature and daily life, and even with challenging math paradoxes.
The ultimate cause of much historical, social and cultural change is the gradual accumulation of human knowledge of the environment. Human beings use the materials in their environment to meet their needs and increased human knowledge of the environment enables human needs to be met in a more efficient manner. The human environment has a particular structure so that human knowledge of the environment is acquired in a particular order. The simplest knowledge is acquired first and more complex knowledge is acquired later. The order of discovery determines the course of human social and cultural history as knowledge of new and more efficient means of meeting human needs results in new technology, which results in the development of new social and ideological systems. This means human social and cultural history has to follow a particular course, a course that is determined by the structure of the human environment. Given that a certain level of knowledge will result in a particular type of society, it is possible to ascertain the types of societies that were inevitable in human history. The course of history is not random and can be rationally and scientifically understood.
My aims in this essay are two. First (§§1-4), I want to get clear on the very idea of a theory of the history of philosophy, the idea of an overarching account of the evolution of philosophical reflection since the inception of written philosophy. And secondly (§§5-8), I want to actually sketch such a global theory of the history of philosophy, which I call the two-streams theory.
PHILOSOPHY OF SCIENCE, vol. 52, number 1, pp. 44-63. R.M. Nugayev, Kazan State University, USSR. THE HISTORY OF QUANTUM THEORY AS A DECISIVE ARGUMENT FAVORING EINSTEIN OVER LORENTZ. Abstract. Einstein’s papers on relativity, quantum theory and statistical mechanics were all part of a single research programme; the aim was to unify mechanics and electrodynamics. It was this broader programme, which eventually split into relativistic physics and quantum mechanics, that superseded Lorentz’s theory. The argument of this paper is partly historical and partly methodological. A notion of “crossbred objects”, theoretical objects with contradictory properties which are part of the domain of application of two different research programmes, is developed that explains the dynamics of revolutionary theory change.
In this paper, I will reread the history of molecular genetics from a psychoanalytical angle, analysing it as a case history. Building on the developmental theories of Freud and his followers, I will distinguish four stages, namely: (1) oedipal childhood, notably the epoch of model building (1943–1953); (2) the latency period, with a focus on the development of basic skills (1953–1989); (3) adolescence, exemplified by the Human Genome Project, with its fierce conflicts, great expectations and grandiose claims (1989–2003); and (4) adulthood (2003–present), during which revolutionary research areas such as molecular biology and genomics have achieved a certain level of normalcy and have evolved into a normal science. I will indicate how a psychoanalytical assessment conducted in this manner may help us to interpret and address some of the key normative issues that have been raised with regard to molecular genetics over the years, such as ‘relevance’, ‘responsible innovation’ and ‘promise management’.
The theory of lower previsions is designed around the principles of coherence and sure-loss avoidance, and thus steers clear of all the updating anomalies highlighted in Gong and Meng's "Judicious Judgment Meets Unsettling Updating: Dilation, Sure Loss, and Simpson's Paradox" except dilation. In fact, the traditional problem with the theory of imprecise probability is that coherent inference is too complicated rather than unsettling. Progress has been made simplifying coherent inference by demoting sets of probabilities from fundamental building blocks to secondary representations that are derived or discarded as needed.
Bolzano set out his theory of probability in 15 points in § 14 of the second part of his Religionswissenschaft and in 20 points in § 161 of the second volume of his Wissenschaftslehre. (I refer to the Religionswissenschaft as 'RW II' and to the Wissenschaftslehre as 'WL II'.) In RW II (cf. p. 37) his theory of probability is embedded in his discussion "Über die Natur der historischen Erkenntniß, besonders in Hinsicht auf Wunder" ("On the nature of historical knowledge, especially with regard to miracles"), and the theorems he assembles there serve the explicit purpose of providing mathematical tools with which to counter doctrines according to which reports of miracles can have no credibility. In WL II (cf. p. 171) Bolzano presents by and large the same theorems as in RW II, but now develops the theory of probability within his doctrine of propositions in themselves. In doing so he certainly takes his bearings from the theorems of the then-current "writings on the probability calculus" (cf. WL II, p. 190), but corrects them where it seems necessary to him (cf. WL II, pp. 187–191), and thus in essence reformulates the elementary part of the probability theory of his time within his logical theory of propositions in themselves. I do not intend a historical study of Bolzano's theory of probability here, although it may be of interest to work out where Bolzano agrees with which probability theorists of his time and where he does not, in particular which weaknesses of Bolzano's theory were weaknesses of all the probability theories of that era. An important systematic study of Bolzano's theory of probability would consist, as begun in outline by Berg (1962, pp. 148-149), in an exact reconstruction of his theory within a consistent logical system of propositions in themselves. In what follows I will attempt something far more modest, yet possibly quite fruitful: to translate the theorems of Bolzano's theory of probability into the language of a present-day probability theory and to derive the translated theorems there, as far as this is possible. In a second step, not undertaken here, one could then examine to what extent those theses that survive this derivation test fulfil the purpose Bolzano originally intended for them: to serve as mathematical tools for his arguments against the view that reports of miracles cannot be credible.
Many of the words investigated by philosophers have their inception in ordinary or extra-philosophical language. Yet the idea of substance is basically a philosophical term of art. Its employments in ordinary language tend to derive, often in a twisted way, from its philosophical usage. Despite this, the idea of substance differs among philosophers, depending on the school of thought in which it is expressed. There is an ordinary concept in play when philosophers discuss "substance", and this is seen in the concept of object, or thing, when this is contrasted with properties, attributes or events. There is also a difference in view in the sense that while realists would develop a materialistic theory of substance, idealists would develop a metaphysical theory of substance. The problem surrounding substance spans the history of philosophy. The queries have often been: what is substance made of, and can there be substance without its attributes? This paper sets out to expose the historical problems surrounding substance. It criticizes the thinking which presupposes that there could be a substance without its attributes, or substance existing alone. The paper adopts the principles of complementary ontology, which state that for anything to exist, it must serve as a missing connection to reality. This suggests that everything interconnects with everything else and that substance cannot exist in isolation.
The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I will first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. In the following, I will assess the history of informational thermophysics during the second half of the twentieth century. Firstly, I analyse the intellectual factors that gave rise to this current in the late forties (i.e., popularization of Shannon's theory, interest in a naturalized epistemology of science, etc.), then study its consolidation in the Brillouinian and Jaynesian programs, and finally examine how Carnap (1977) and his disciples tried to criticize this tendency within the scientific community. Then, I evaluate how informational physics became a predominant intellectual current in the scientific community in the nineties, made possible by the convergence of Jaynesianism and Brillouinism in proposals such as those of Tribus and McIrvine (1971) or Bekenstein (1973) and by the application of algorithmic information theory to the thermophysical domain. As a sign of its radicality at this historical stage, I explore the main proposals to include information as part of our physical reality, such as Wheeler’s (1990), Stonier’s (1990) or Landauer’s (1991), detailing the main philosophical arguments (e.g., Timpson, 2013; Lombardi et al. 2016a) against those inflationary attitudes towards information. Following this historical assessment, I systematically analyze whether the descriptive exploitation of informational concepts has historically contributed to providing us with knowledge of thermophysical reality via (i) explaining thermal processes such as equilibrium approximation, (ii) advantageously predicting thermal phenomena, or (iii) enabling understanding of thermal properties such as thermodynamic entropy. I argue that these epistemic shortcomings would make it impossible to draw ontological conclusions in a justified way about the physical nature of information. In conclusion, I will argue that the historical exploitation of informational concepts has not contributed significantly to the epistemic progress of thermophysics. This would lead to characterizing informational proposals as 'degenerate science' (à la Lakatos 1978a) regarding classical thermostatistical physics, or as theoretically underdeveloped regarding the study of the cognitive dynamics of scientists in this physical domain.
Biological evolution and technological innovation, while differing in many respects, also share common features. In particular, implementation of a new technology in the market is analogous to the spreading of a new genetic trait in a population. Technological innovation may occur either through the accumulation of quantitative changes, as in the development of the ocean clipper, or it may be initiated by a new combination of features or subsystems, as in the case of steamships. Other examples of the latter type are electric networks that combine the generation, distribution, and use of electricity, and containerized transportation that combines standardized containers, logistics, and ships. Biological evolution proceeds, phenotypically, in many small steps, but at the genetic level novel features may arise not only through the accumulation of many small, common mutational changes, but also when distinct, relatively rare genetic changes are followed by many further mutations. In particular, capabilities of biologically modern man may have been initiated, perhaps some 150 000 years ago, by one or few accidental but distinct combinations of modules and subroutines of gene regulation which are involved in the generation of the neural network in the cerebral cortex. It is even conceivable that it was one primary genetic event that initiated the evolution of biologically modern man, introducing some novel but subtle feature of connectivity into the cerebral cortex which allowed for meta-levels of abstraction and upgraded modes of information processing. This may have set the stage for the evolution of integrated but diverse higher capabilities such as structured language, symbolic thought, strategic thought, and cognition based empathy.
This chapter provides a brief overview of the history of behavioral neurology, dividing it roughly into six eras. In the ancient and classical eras, emphasis is placed on two transitions: firstly, from descriptions of head trauma and attempted neurosurgical treatments to the exploratory dissections during the Hellenistic period and the replacement of cardiocentrism; and secondly, to the more systematic investigations of Galenus and the rise of pneumatic ventricular theory. In the medieval through post-Renaissance eras, the scholastic consolidation of knowledge and the role of compendia are emphasized, along with the use of new methods from within a mechanistic framework. With the discovery of electrical conductance and the rise of experimentalism, we frame the modern era as a period of intense debate over localization, decomposition, and other mechanistic principles, marked by rapid discovery about the brain. The chapter ends with a discussion of the contemporary era, focusing on the establishment of behavioral neurology research on aphasia, apraxia, and neuropsychiatric conditions.
In this article I trace some of the main tenets of the struggle between nominalism and realism as identified by John Deely in his Four ages of understanding. The aim is to assess Deely’s claim that the Age of Modernity was nominalist and that the coming age, the Age of Postmodernism — which he portrays as a renaissance of the late middle ages and as starting with Peirce — is realist. After a general overview of how Peirce interpreted the nominalist-realist controversy, Deely gives special attention to Thomas Aquinas’s On being and essence and the realism it entails. A subsequent discussion of the Modern Period shows that the issue of nominalism and realism is very much tied up with different conceptions of the intellect. Deely credits the theory of evolution with bringing us a conception of the intellect that is closer to that of the Middle Ages and that opens the way for a truly realistic “fourth age” of the understanding.
I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or “Dutch book”) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.
In “Conceptual Differences Between Children and Adults,” Susan Carey discusses phlogiston theory in order to defend the view that there can be non-translatability between scientific languages. I present an objection to her defence.
Using “brute reason” I will show why there can be only one valid interpretation of probability. The valid interpretation turns out to be a further refinement of Popper’s Propensity interpretation of probability. Via some famous probability puzzles and new thought experiments I will show how all other interpretations of probability fail, in particular the Bayesian interpretations, while these puzzles do not present any difficulties for the interpretation proposed here. In addition, the new interpretation casts doubt on some concepts often taken as basic and unproblematic, like rationality, utility and expectation. This in turn has implications for decision theory, economic theory and the philosophy of physics.
The major competing statistical paradigms share a common remarkable but unremarked thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.
Recent perspectival interpretations of Kant suggest a way of relating his epistemology to empirical science that makes it plausible to regard Einstein’s theory of relativity as having a Kantian grounding. This first of two articles exploring this topic focuses on how the foregoing hypothesis accounts for various resonances between Kant’s philosophy and Einstein’s science. The great attention young Einstein paid to Kant in his early intellectual development demonstrates the plausibility of this hypothesis, while certain features of Einstein’s cultural-political context account for his reluctance to acknowledge Kant’s influence, even though contemporary philosophers who regarded themselves as Kantians urged him to do so. The sequel argues that this Kantian grounding probably had a formative influence not only on Einstein’s discovery of the theory of relativity and his view of the nature of science, but also on his quasi-mystical, religious disposition.
When do probability distribution functions (PDFs) about future climate misrepresent uncertainty? How can we recognise when such misrepresentation occurs and thus avoid it in reasoning about or communicating our uncertainty? And when we should not use a PDF, what should we do instead? In this paper we address these three questions. We start by providing a classification of types of uncertainty and using this classification to illustrate when PDFs misrepresent our uncertainty in a way that may adversely affect decisions. We then discuss when it is reasonable and appropriate to use a PDF to reason about or communicate uncertainty about climate. We consider two perspectives on this issue. On one, which we argue is preferable, available theory and evidence in climate science basically excludes using PDFs to represent our uncertainty. On the other, PDFs can legitimately be provided when resting on appropriate expert judgement and recognition of associated risks. Once we have specified the border between appropriate and inappropriate uses of PDFs, we explore alternatives to their use. We briefly describe two formal alternatives, namely imprecise probabilities and possibilistic distribution functions, as well as informal possibilistic alternatives. We suggest that the possibilistic alternatives are preferable.
The claim that Galileo Galilei transformed the spyglass into an astronomical instrument has never been disputed and is considered a historical fact. However, the question of what procedure Galileo followed is moot, for he did not disclose his research method. On the traditional view, Galileo was guided by experience, more precisely, systematized experience, which was current among northern Italian artisans and men of science. In other words, it was a trial-and-error procedure—no theory was involved. A scientific analysis of the optical properties of Galileo’s first improved spyglass shows that his procedure could not have been an informed extension of the traditional optics of spectacles. We argue that most likely Galileo realized that the objective and the eyepiece form a system and proceeded accordingly.