Critically growing problems in the organisation and content of fundamental science are analysed, with examples from physics and emerging interdisciplinary fields. The origin of these problems is identified, and a new science structure (in both organisation and content) is proposed as a unified solution.
We investigate a basic probabilistic dynamic semantics for a fragment containing conditionals, probability operators, modals, and attitude verbs, with the aim of shedding light on the prospects for adding probabilistic structure to models of the conversational common ground.
Famous results by David Lewis show that plausible-sounding constraints on the probabilities of conditionals or evaluative claims lead to unacceptable results, by standard probabilistic reasoning. Existing presentations of these results rely on stronger assumptions than they really need. When we strip these arguments down to a minimal core, we can see both how certain replies miss the mark, and also how to devise parallel arguments for other domains, including epistemic “might,” probability claims, claims about comparative value, and so on. A popular reply to Lewis's results is to claim that conditional claims, or claims about subjective value, lack truth conditions. For this strategy to have a chance of success, it needs to give up basic structural principles about how epistemic states can be updated—in a way that is strikingly parallel to the commitments of the project of dynamic semantics.
Many biological processes and objects can be described by fractals. The paper uses a new type of object – blinking fractals – not covered by traditional theories of the dynamics of self-similar processes. It is shown that both traditional and blinking fractals can be successfully studied by a recent approach that allows one to work numerically with infinite and infinitesimal numbers. It is shown that blinking fractals can be applied to model complex growth processes of biological systems, including their seasonal changes. The new approach allows one to give various quantitative characteristics of the resulting blinking-fractal models of biological systems.
In recent years, a number of theorists have claimed that beliefs about probability are transparent: to believe probably p is simply to have a high credence that p. In this paper, I prove a variety of triviality results for theses like this. I show that such claims are inconsistent with the thesis that probabilistic modal sentences have propositions, or sets of worlds, as their meanings. Then I consider the extent to which a dynamic semantics for probabilistic modals can capture theses connecting belief, certainty, credence, and probability. I show that although a dynamic semantics for probabilistic modals does allow one to validate such theses, it can do so only at a cost: I prove that such theses can be valid only if probabilistic modals do not satisfy the axioms of the probability calculus.
Three teacher-researchers discuss bipedal walking in humans and interpret the long-term dynamics of this phenomenon in terms of complexity and predictability, two concepts that can be quantified using fractals and the mathematical tools developed to study them.
According to the traditional Bayesian view of credence, its structure is that of precise probability, its objects are descriptive propositions about the empirical world, and its dynamics are given by conditionalization. Each of the three essays that make up this thesis deals with a different variation on this traditional picture. The first variation replaces precise probability with sets of probabilities. The resulting imprecise Bayesianism is sometimes motivated on the grounds that our beliefs should not be more precise than the evidence calls for. One known problem for this evidentially motivated imprecise view is that in certain cases, our imprecise credence in a particular proposition will remain the same no matter how much evidence we receive. In the first essay I argue that the problem is much more general than has been appreciated so far, and that it’s difficult to avoid without compromising the initial evidentialist motivation. The second variation replaces descriptive claims with moral claims as the objects of credence. I consider three standard arguments for probabilism with respect to descriptive uncertainty—representation theorem arguments, Dutch book arguments, and accuracy arguments—in order to examine whether such arguments can also be used to establish probabilism with respect to moral uncertainty. In the second essay, I argue that by and large they can, with some caveats. First, I don’t examine whether these arguments can be given sound non-cognitivist readings, and any conclusions therefore only hold conditional on cognitivism. Second, decision-theoretic representation theorems are found to be less convincing in the moral case, because there they implausibly commit us to thinking that intertheoretic comparisons of value are always possible. Third and finally, certain considerations may lead one to think that imprecise probabilism provides a more plausible model of moral epistemology.
The third variation considers whether, in addition to conditionalization, agents may also change their minds by becoming aware of propositions they had not previously entertained, and therefore not previously assigned any probability. More specifically, I argue that if we wish to make room for reflective equilibrium in a probabilistic moral epistemology, we must allow for awareness growth. In the third essay, I sketch the outline of such a Bayesian account of reflective equilibrium. Given that this account gives a central place to awareness growth, and that the rationality constraints on belief change by awareness growth are much weaker than those on belief change by conditionalization, it follows that the rationality constraints on the credences of agents who are seeking reflective equilibrium are correspondingly weaker.
Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question whether an entangled state can be transformed into another by means of local operations and classical communication. Under two operational requirements, we prove a general version of the Lo-Popescu theorem, which lies at the foundations of the theory of pure-state entanglement. We then consider a resource theory of purity where free operations are random reversible transformations, modelling the scenario where an agent has limited control over the dynamics of a closed system. Our key result is a duality between the resource theory of entanglement and the resource theory of purity, valid for every physical theory where all processes arise from pure states and reversible interactions at the fundamental level. As an application of the main result, we establish a one-to-one correspondence between entropies and measures of pure bipartite entanglement and exploit it to define entanglement measures in the general probabilistic framework. In addition, we show a duality between the task of information erasure and the task of entanglement generation, whereby the existence of entropy sinks (systems that can absorb arbitrary amounts of information) becomes equivalent to the existence of entanglement sources (correlated systems from which arbitrary amounts of entanglement can be extracted).
The nature and topology of time remain an open question in philosophy; both tensed and tenseless concepts of time appear to have merit. A concept of time that includes both kinds of time evolution of physical systems in quantum mechanics subsumes the properties of both notions. The linear dynamics defines the universe probabilistically throughout space-time, and can be seen as the definition of a block universe. The collapse dynamics is the time evolution of the linear dynamics, and is thus of a different logical type to the linear dynamics. These two kinds of time evolution are, respectively, tensed and tenseless. Ascribing tensed semantics to the collapse dynamics is problematic in the light of special relativity, but this difficulty does not apply to a relational quantum mechanics. In this context, while the linear dynamics is the time evolution of the universe objectively, the collapse dynamics is the time evolution of the universe subjectively, applying solely in the functional frame of reference of the observer.
This article introduces Hegel's Eurocentric philosophy of dialectics in the 19th century and its transformation into Kelly's planetary paradigm at the turn of the 20th–21st century. The new theory develops Hegel's thesis–antithesis–synthesis into identity–difference–new-identity, a triad applicable to the whole of human history, including the planetary era. The new triad generalizes Hegel's mechanistic view of nature by suggesting a dominant worldview characterized by a series of tightening and converging dynamic fractal cycles.
Ranking theory is a formal epistemology, developed over some 600 pages of Spohn's recent book The Laws of Belief, that aims to provide a normative account of the dynamics of belief as an alternative to current probabilistic approaches. It has long been known in the AI community, but it has not yet found application in experimental psychology. The purpose of this paper is to derive clear, quantitative predictions by exploiting a parallel between ranking theory and a statistical model called logistic regression. This approach is illustrated by developing a model of the conditional inference task using Spohn's ranking-theoretic approach to conditionals.
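The parallel the abstract appeals to can be sketched in a toy model. This is a hypothetical illustration, not the paper's actual model: it treats ranking-theoretic belief strength in a conditional (the difference between the negative ranks of not-C and C given A) as the linear predictor of a logistic model for the probability of endorsing the corresponding inference. The function names and the coefficients `alpha` and `beta` are assumptions made for the example.

```python
import math

def logistic(x):
    """Standard logistic function, the link function of logistic regression."""
    return 1.0 / (1.0 + math.exp(-x))

def endorsement_probability(rank_not_c_given_a, rank_c_given_a,
                            alpha=0.0, beta=1.0):
    # In ranking theory, C is believed given A when rank(not-C | A) > 0.
    # Treat the signed strength rank(not-C|A) - rank(C|A) as the linear
    # predictor of a logistic model for endorsing "if A then C" inferences.
    strength = rank_not_c_given_a - rank_c_given_a
    return logistic(alpha + beta * strength)

# A firmly believed conditional yields high predicted endorsement:
print(round(endorsement_probability(3, 0), 3))  # 0.953
# A firmly disbelieved one yields low predicted endorsement:
print(round(endorsement_probability(0, 3), 3))  # 0.047
```

The point of the parallel is that ranks enter the model the way log-odds do in ordinary logistic regression, which is what licenses quantitative predictions from a purely qualitative belief theory.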
Merging of opinions results underwrite Bayesian rejoinders to complaints about the subjective nature of personal probability. Such results establish that sufficiently similar priors achieve consensus in the long run when fed the same increasing stream of evidence. Initial subjectivity, the line goes, is of mere transient significance, giving way to intersubjective agreement eventually. Here, we establish a merging result for sets of probability measures that are updated by Jeffrey conditioning. This generalizes a number of different merging results in the literature. We also show that such sets converge to a shared, maximally informed opinion. Convergence to a maximally informed opinion is a (weak) Jeffrey conditioning analogue of Bayesian “convergence to the truth” for conditional probabilities. Finally, we demonstrate the philosophical significance of our study by detailing applications to the topics of dynamic coherence, imprecise probabilities, and probabilistic opinion pooling.
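Jeffrey conditioning, the update rule at the center of these results, reweights a prior across the cells of an evidence partition while preserving the conditional probabilities within each cell. A minimal sketch (the worlds, partition, and numbers are invented for illustration):

```python
def jeffrey_update(prior, partition, new_cell_probs):
    """Jeffrey conditionalization: move each partition cell to its new
    probability while keeping conditional probabilities within cells fixed."""
    posterior = {}
    for cell, q in zip(partition, new_cell_probs):
        old_q = sum(prior[w] for w in cell)  # prior mass of this cell
        for w in cell:
            posterior[w] = q * prior[w] / old_q
    return posterior

# Toy example: three worlds; uncertain evidence shifts {a, b} vs {c}.
prior = {"a": 0.25, "b": 0.25, "c": 0.5}
posterior = jeffrey_update(prior, [{"a", "b"}, {"c"}], [0.8, 0.2])
print(posterior)  # a and b rise to 0.4 each; c falls to 0.2
```

Note that P(a | {a, b}) is 0.5 both before and after the update; this rigidity of within-cell conditionals is what the merging result exploits, and ordinary conditionalization is the special case where one cell gets probability 1.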
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory, and fractal geometry.
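Of the probability-based notions the chapter surveys, the Shannon entropy of information theory is the easiest to state concretely: H(p) = −Σᵢ pᵢ log₂ pᵢ. A minimal sketch (the distributions are arbitrary examples, not drawn from the chapter):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(p) = -sum p_i log p_i, with 0*log(0) taken as 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: maximal for two outcomes
print(shannon_entropy([0.25] * 4))    # 2.0 bits: uniform over four outcomes
print(shannon_entropy([1.0]) == 0.0)  # True: a certain outcome carries none
```

The thermodynamic entropy, by contrast, is defined via heat and temperature with no probabilities in sight, which is one source of the "complicated picture" the chapter untangles.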
Many recent theories of epistemic discourse exploit an informational notion of consequence, i.e. a notion that defines entailment as preservation of support by an information state. This paper investigates how informational consequence fits with probabilistic reasoning. I raise two problems. First, all informational inferences that are not also classical inferences are, intuitively, probabilistically invalid. Second, all these inferences can be exploited, in a systematic way, to generate triviality results. The informational theorist is left with two options, both of them (...) radical: they can either deny that epistemic modal claims have probability at all, or they can move to a nonstandard probability theory. (shrink)
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate – the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic/thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
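The "conditional uniqueness" idea can be made concrete in a toy model: given a subspace (standing in for the Past Hypothesis subspace), the uniquely associated density matrix is the normalized projection onto it, ρ = P / tr(P). The three-dimensional space and basis below are invented for illustration; the real Past Hypothesis subspace is astronomically larger.

```python
def density_matrix_from_subspace(basis):
    """Return rho = P / tr(P) for the projection P onto the span of the
    given orthonormal real vectors (a toy stand-in for the PH subspace)."""
    d = len(basis[0])  # dimension of the ambient space
    k = len(basis)     # dimension of the subspace; tr(P) = k
    return [[sum(v[i] * v[j] for v in basis) / k for j in range(d)]
            for i in range(d)]

# A two-dimensional subspace of a three-dimensional toy Hilbert space:
rho = density_matrix_from_subspace([[1, 0, 0], [0, 1, 0]])
print(rho)  # unit trace, maximally mixed over the 2-d subspace
```

The point of the construction is that nothing beyond the subspace itself is needed: unlike picking a "typical" wave function from the macrostate, no extra statistical postulate enters.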
Bayesianism is our leading theory of uncertainty. Epistemology is defined as the theory of knowledge. So “Bayesian Epistemology” may sound like an oxymoron. Bayesianism, after all, studies the properties and dynamics of degrees of belief, understood to be probabilities. Traditional epistemology, on the other hand, places the singularly non-probabilistic notion of knowledge at centre stage, and to the extent that it traffics in belief, that notion does not come in degrees. So how can there be a Bayesian epistemology?
This doctoral dissertation investigates the notion of physical necessity. Specifically, it studies whether it is possible to account for non-accidental regularities without the standard assumption of a pre-existing set of governing laws. It thus sides with so-called deflationist accounts of laws of nature, such as the Humean or the antirealist. The specific aim is to complement such accounts by providing a missing explanation of the appearance of physical necessity. To provide this explanation, I appeal to fields that have not so far been brought to bear on discussions of the metaphysics of laws: complex systems theory and the foundations of statistical mechanics. The explanation proposed is inspired by how complex systems theory has elucidated the way patterns emerge, and by the probabilistic explanations of the second law of thermodynamics. More specifically, this thesis studies how certain constraints that make no direct reference to the dynamics can be a sufficient condition for obtaining, in the long run and with high probability, stable regular behavior. I hope to show how certain metaphysical accounts of laws might benefit from the insights achieved in these other fields. According to the proposal studied in this thesis, some regularities are non-accidental not in virtue of an underlying physical necessity: the non-accidental character of certain regular behavior is due only to its overwhelming stability. From this point of view, the goal becomes to explain the stability of temporal patterns without assuming a set of pre-existing guiding laws. It is argued that this stability can be the result of a process of convergence to simpler and stable regularities from a more complex lower level. If this project is successful, there would be no need to postulate a (mysterious) intermediate category between logical necessity and pure contingency.
Similarly, there would be no need to postulate a (mysterious) set of pre-existing governing laws. Part I of the thesis motivates part II, mostly by arguing why further explanation of the notions of physical necessity and governing laws should be welcomed (chapter 1), and by studying the plausibility of a lawless fundamental level (chapters 2 and 3). Part II then develops the explanation of the formation of simpler and stable behavior from a more complex underlying level.
Spontaneous collapse theories of quantum mechanics turn the usual Schrödinger equation into a stochastic dynamical law. In this paper, I focus on the GRW theory. Two philosophical issues can be raised about GRW: (i) the ontology of the theory, in particular the nature of the wave function and its role within the theory, and (ii) the interpretation of the objective probabilities involved in the dynamics of the theory. In recent years, it has been claimed that we can take advantage of dispositional properties in order to develop an ontology for GRW theory, and also in order to ground the objective probabilities it postulates. However, in this paper I argue that the dispositional interpretations discussed in the literature so far are either flawed or – at best – incomplete. If we want to endorse a dispositional interpretation of GRW theory, we need an extended account that specifies the precise nature of those properties and makes clear how they can correctly ground all the probabilities postulated by the theory. Thus, after introducing several different kinds of probabilistic dispositions, I try to fill the gap in the literature by proposing a novel and complete dispositional account of GRW, based on what I call spontaneous weighted multi-track propensities. I claim that such an account can satisfy both of our desiderata.
This paper looks at three ways of addressing probabilism’s implausible requirement of logical omniscience. The first and most common strategy says it’s okay to require an ideally rational person to be logically omniscient. I argue that this view is indefensible on any interpretation of ‘ideally rational’. The second strategy says probabilism should be formulated not in terms of logically possible worlds but in terms of doxastically possible worlds, ways you think the world might be. I argue that, on the interpretation of this approach that lifts the requirement of certainty in all logical truths, the view becomes vacuous, issuing no requirements on rational believers at all. Finally, I develop and endorse a new solution to the problem. This view proposes dynamic norms for reasoning with credences. The solution is based on an old proposal of Ian Hacking’s that says you’re required to be sensitive to logical facts only when you know they are logical facts.
The purpose of this paper is to open for investigation a range of phenomena familiar from dynamical systems or chaos theory which appear in a simple fuzzy logic with the introduction of self-reference. Within that logic, self-referential sentences exhibit properties of fixed point attractors, fixed point repellers, and full chaos on the [0, 1] interval. Strange attractors and fractals appear in two dimensions in the graphing of pairs of mutually referential sentences and appear in three dimensions in the graphing of mutually referential triples.
Existing definitions of relevance relations are essentially ambiguous outside the binary case. Hence definitions of probabilistic causality based on relevance relations, as well as probability values based on maximal specificity conditions and homogeneous reference classes, are also not uniquely specified. A 'neutral state' account of explanation is provided to avoid the problem, based on the author's earlier account of aleatory explanations. Further reasons in support of this model are given, focusing on the dynamics of explanation. It is shown that truth in explanation need not entail maximal specificity and that probabilistic explanations should not contain a specification of probability values.
Objective. Conceptualization of the definition of space as a semantic unit of language consciousness.

Materials & Methods. A structural-ontological approach is used in the work, the methodology of which has been tested and applied in the analysis of the subject matter of psychology, psycholinguistics and other social sciences, as well as in interdisciplinary studies of complex systems. Mathematical representations of space as a set of parallel series of events (Alexandrov) were used as the initial theoretical basis of the structural-ontological analysis. Here an event is understood in the sense adopted in computer science: a change in object properties registered by the observer.

Results. The negative nature of space realizes itself in the subject-object structure, the interaction of whose components is characterized by change – a key property of the system under study. The observer's registration of changes is accompanied by spatial focusing (situational concretization of the field of changes) and by the relating of its results to the field of potentially distinguishable changes (subjective knowledge about the «changing world»). This correlation performs the function of space identification, in the sense of recognizing its properties and their subjective significance, depending on the features of the observer's motivational sphere. As a result, the actual affective dynamics of the observer is corrected, which structures the current perception of space according to the principle of the semantic fractal. Fractalization is the formation of a subjective perception of space that establishes semantic accordance between the situational field of changes, on the one hand, and the worldview and motivational characteristics of the observer, on the other.

Conclusions. The structural-ontological analysis of the system formed by the interaction of the perceptual function of the psyche and the semantic field of language made it possible to conceptualize space as a field of changes potentially distinguishable by the observer, structurally organized according to the principle of the semantic fractal. The compositional features of the fractalization process consist in the fact that the semantic fractal of space corresponds to the difference between the situational field of changes and the field of potentially distinguishable changes, adjusted by the current configuration of the observer's value-needs hierarchy and reduced by his actual affective dynamics.
In the third Whitehead Psychology Nexus Studies, we discussed the dual-aspect-dual-mode proto-experience–subjective experience (PE-SE) framework of consciousness based on neuroscience, its implications for war, suffering, peace, and happiness, the process of sublimation for optimizing them, and the conversion of the negative aspects of seven groups of the self-protective energy system into their positive aspects, from both Western and Eastern perspectives. In this article, we summarize the development since then as follows. We rigorously investigated the classical and quantum matching and selection processes for precisely experiencing a specific SE in a specific neural network. We unpacked the quantum view of superposition related to the superposition-based hypothesis H1 of our framework in terms of subquantum dual-aspect primal entities and addressed the related explanatory gaps. We developed alternative hypotheses for our framework, namely, the superposition-then-integration-emergence based H2, the integration-emergence based H3, the intelligent-mechanism based H4, and the vacuum/aether based H5. We concluded that our framework with H1 is the most optimal one because it has the fewest problems. We found over 40 different but overlapping meanings attributed to the term 'consciousness' and suggested that authors specify which aspect of consciousness they refer to when using this term, to minimize confusion. We proposed definitions of consciousness, qualia, mind, and awareness. We investigated the necessary ingredients for access consciousness: wakefulness, re-entry, attention, working memory, and so on. We discussed Nâgârjuna's philosophy of dependent co-origination with respect to our PE-SE framework. We linked dynamic systems theory and fractal catalytic theory with standard representation theory using our framework.
We introduce the PE-SE aspects of consciousness in theoretical classical and quantum physics, including loop quantum gravity and string theory. We proposed that the SE of the subject or 'self' in the self-related neural network is tuned, during the samadhi state, to the self-related SEs/PEs superposed in innumerable other entities via matching and selection processes. This leads to bliss, ecstasy, or an exceptionally high degree of climax in the samadhi state. We conclude that, so far, the dual-aspect-dual-mode PE-SE framework with hypothesis H1 is the most optimal framework for explaining our conventional reality because it has the fewest problems. Keywords: Evolution of consciousness; Internal representation; Sensorimotor interaction; Dual-aspect model; Subjective experience; Proto-experiences; Explanatory gap; Mind-brain problem; Purusha; Prakriti; Eastern and Western perspectives; Yoga; Sublimation process; Whitehead; Process and Reality; Occasions of experience; Superposition; Subquantum dual-aspect primal entities; Superposition-then-integration-emergence; Integration-emergence; Intelligent mechanism; Vacuum/Aether; Qualia; Mind; Awareness; Nâgârjuna; Classical and quantum physics; Loop quantum gravity; String theory.
The chapter explains why evolutionary genetics – a mathematical body of theory developed since the 1910s – eventually got to deal with culture: the frequency dynamics of genes like "the lactase gene" in populations cannot be correctly modeled without including social transmission. While the body of theory requires specific justifications, for example meticulous legitimation of describing culture in terms of traits, it is an immensely valuable scientific instrument, not only for its modeling power but also for the amount of work that has been necessary to build, maintain, and expand it. A brief history of evolutionary genetics is told to demonstrate this patrimony, and to emphasize the importance and accumulation of statistical knowledge therein. The probabilistic nature of genotypes, phenogenotypes, and population phenomena is also touched upon. Although evolutionary genetics is actually composed of distinct and partially independent traditions, its most important mathematical object is the Mendelian space, and evolutionary genetics is mostly the daring study of trajectories of alleles in a population that explores that space. The 'body' is scientific wealth that can be invested in studying every situation that turns out to be suitable for modeling as a Mendelian population, a modified Mendelian population, or a population of continuously varying individuals with an underlying Mendelian basis. Mathematical tinkering and justification are two halves of the mutual adjustment between the body of theory and the new domain of culture. Some works in the current literature overstate justification, misrepresenting the relationship between body of theory and domain, and hindering interdisciplinary dialogue.
In present times, science has undergone a drastic change due to the critical examination of its methods of acquiring scientific knowledge, and it has become more and more contiguous with philosophy. Relativity theory and quantum mechanics have revolutionized the concepts of classical physics in their analysis of matter, creating not only a new mathematical symbolism but also a revision of a large number of its basic concepts. Relativity has shown that all material objects and processes exist in the integral form of space-time, of which the relations of space and time are different but inseparable aspects. Its modification of our classical concepts of mass, length, force, the law of addition of velocities, and the principle of simultaneity, along with a new interpretation of the laws of conservation of energy, momentum, and angular momentum, are of a more universal nature. The theory of relativity has demonstrated for the first time the inner necessity of the idea of dialectical contradiction in the theoretical development of the concepts of physics. Quantum mechanics has continued the revolution which began in physics with the advent of the theory of relativity. With the development of quantum mechanics, the notion of a strict continuity in the spectrum of values of physical quantities is no longer valid, the classical concept of trajectory is rejected, and the principle of classical determinism is questioned. It has shown that the basic laws of nature are not dynamic but statistical, and that the probabilistic form of causality is the fundamental form, while classical determinism is just its limiting case. In this article, an attempt is made to present a comprehensive picture of the impact of relativity theory and quantum mechanics on a large number of concepts of philosophy. It is shown that these theories have called for a drastic revision of the seminal kernels of the traditional philosophy of science.
In the light of the results obtained during the last two decades in analysis of signals by time series, it has become evident that the tools of non linear dynamics have their elective role of application in biological, and, in particular, in neuro-physiological and psycho-physiological studies. The basic concept in non linear analysis of experimental time series is that one of recurrence whose conceptual counterpart is represented from variedness and variability that are the foundations of complexity in dynamic processes. Thus, (...) the recurrence plots and the Recurrence Quantification Analysis (RQA) are discussed. It is shown that RQA represents the most general and correct methodology in investigation of experimental time series. By it we arrive to inspect the inner structure of the time series connected to the signals under investigation. Linked to RQA we prospect also the method CZF, recently introduced by us. It is able to account for a true estimation of variability of signals in time as well as in frequency domain. And, consequently, it may be used in conjunction with classical Fourier analysis, accounting however that it is inappropriate in analysis of non linear and non stationary experimental time series. The use of CZF method in fractal analysis is also considered in addition to standard index as Hurst exponent. A large field of possible applications in neurological as well as in psycho-physiological studies is given. Also, there are given examples of other and (possibly linked) applications as example the analysis of beat-to-beat fluctuations of human heartbeat intervals that is sovereign in psycho-physiological studies. We give applications on some different planes to evidence the particular sensitivity of such methods. We reach the objective to show that the previously exposed methods are also able to predict in advance the advent of ventricular tachycardia and/or of ventricular fibrillation. The RQA analysis gives good results. 
The CZF method gives excellent results, showing that it is able to provide very significant predictive indices. We also apply these methods to the investigation of state anxiety, and propose in detail a quantum-like model of this phenomenological state of the mind.
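The recurrence idea at the core of RQA can be sketched minimally as follows. This is an illustrative toy, not the CZF method; the embedding dimension, delay and radius are arbitrary assumptions, and the recurrence rate computed here is only the simplest of the RQA indices.

```python
import math
import random

def embed(series, dim=2, delay=1):
    """Time-delay embedding of a scalar series into dim-dimensional points."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + j * delay] for j in range(dim)) for i in range(n)]

def recurrence_rate(series, dim=2, delay=1, radius=0.5):
    """Fraction of embedded point pairs closer than `radius`:
    the simplest quantitative index derived from a recurrence plot."""
    pts = embed(series, dim, delay)
    hits = sum(1 for p in pts for q in pts if math.dist(p, q) < radius)
    return hits / len(pts) ** 2

# A periodic signal revisits its own trajectory far more often than noise.
periodic = [math.sin(0.25 * k) for k in range(200)]
rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(200)]
rr_periodic = recurrence_rate(periodic)
rr_noise = recurrence_rate(noise)
```

The contrast between the two recurrence rates illustrates why recurrence-based indices can separate structured physiological signals from stochastic background activity.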
This paper develops a new structural psychology, and therein proposes a specific model for the scientific study of consciousness. The presented model uses Earth's geologic history of mass-extinction and recovery (evolutionary dynamics) to determine humanity's adaptive response (conscious and non-conscious traits). It argues that humanity adaptively mirrors Earth's basic evolutionary dynamics, in a "mythologizing of natural adversity" as the foundation for all human knowledge, a process that continues well into the modern era. The intellectual lineage used to develop this model includes:
• Evolutionary biology offers a context for this study, answering Chalmers' "hard question";
• Paleoanthropology defines the circumstance of human emergence from Gaia;
• Environmental forces on neurophysiology derive an ambiguous but instructive narrative logic (mythic sensibility);
• Psychology tracks humanity's shift from animal-self to modern creative-self, using the work of Hegel > Freud > Jung > Rank > Joseph Campbell > Arnold Mindell as a new structural psychology;
• Fractal geometry offers a holographic design for modeling consciousness;
• Memetics presents a tool for measuring conscious traits, in a variation of the Hall-Tonna values inventory;
• Finally, Structured Opportunistic Thinking, a hybrid of NTL's T-group and Pierce's Power Equity Group Theory, suggests a developmental methodology.
This work presents a "general hypothesizing model" of human consciousness, in attempting a science of consciousness.
The explanatory role of natural selection is one of the long-term debates in evolutionary biology. Nevertheless, the consensus has been slippery because of conceptual confusions and the absence of a unified, formal causal model that integrates different explanatory scopes of natural selection. In this study we attempt to examine two questions: (i) What can the theory of natural selection explain? and (ii) Is there a causal or explanatory model that integrates all natural selection explananda? For the first question, we argue that five explananda have been assigned to the theory of natural selection and that four of them may actually be considered explananda of natural selection. For the second question, we claim that a probabilistic conception of causality and the statistical relevance concept of explanation are both good models for understanding the explanatory role of natural selection. We review the biological and philosophical disputes about the explanatory role of natural selection and formalize some explananda in probabilistic terms using classical results from population genetics. Most of these explananda have been discussed in philosophical terms, but some of them have been mixed up and confused. We analyze and set the limits of these problems.
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. While the systematicity and productivity of language provide a strong argument in favor of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. Additionally, the principle of semantic compositionality is underspecified, which means that notions of both "strong" and "weak" compositionality appear in the literature. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. In addition, we suggest that the distinction between these is contextually sensitive. Compositionality is equated with a joint probability distribution modeling how the constituent concepts in the combination are interpreted. Marginal selectivity is introduced as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities. Non-compositionality is equated with a failure of marginal selectivity, or violation of either system of inequalities in the presence of marginal selectivity. This means that the conceptual combination cannot be modeled in a joint probability distribution, the variables of which correspond to how the constituent concepts are being interpreted. The formal analysis methods are demonstrated by applying them to an empirical illustration of twenty-four non-lexicalised conceptual combinations.
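The CHSH test invoked here can be sketched as follows. This is a simplified illustration, not the authors' full framework: it checks only one sign combination of the CHSH family and presupposes that marginal selectivity has already been verified; the correlation values are illustrative assumptions.

```python
def chsh_statistic(e_ab, e_ab2, e_a2b, e_a2b2):
    """One CHSH combination of four pairwise correlations E(x, y) in [-1, 1]."""
    return abs(e_ab + e_ab2 + e_a2b - e_a2b2)

def compositional(e_ab, e_ab2, e_a2b, e_a2b2, bound=2.0):
    """A joint probability distribution over the four +/-1-valued variables
    can exist only if the CHSH statistic respects the classical bound."""
    return chsh_statistic(e_ab, e_ab2, e_a2b, e_a2b2) <= bound

# Perfectly classical correlations respect the bound (S = 2)...
classical = compositional(1.0, 1.0, 1.0, 1.0)
# ...while Tsirelson-type correlations violate it (S = 2 * sqrt(2)).
s = 2 ** -0.5
nonclassical = compositional(s, s, s, -s)
```

In the article's terms, a violation (given marginal selectivity) marks the combination as non-compositional: no joint distribution over the interpretation variables reproduces the observed correlations.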
In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield probability distributions with moderately high entropy levels. In the present article, we present data charting the performance of the four systems when reasoning in environments of various entropy levels. The results illustrate variations in the performance of the respective reasoning systems that derive from the entropy of the environment, and allow for a more inclusive assessment of the reliability and robustness of the four systems.
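The entropy notion here is standard Shannon entropy. A minimal sketch of how one might generate random environments of differing entropy levels (the sampling scheme below is an illustrative assumption, not the authors' procedure):

```python
import math
import random

def shannon_entropy(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def random_distribution(k, rng, concentration=1.0):
    """Normalise random weights into a distribution over k outcomes; raising
    the weights to a power skews the result toward lower-entropy profiles."""
    w = [rng.random() ** concentration for _ in range(k)]
    total = sum(w)
    return [x / total for x in w]

rng = random.Random(1)
flats = [shannon_entropy(random_distribution(8, rng, 1.0)) for _ in range(200)]
peaks = [shannon_entropy(random_distribution(8, rng, 8.0)) for _ in range(200)]
mean_flat = sum(flats) / len(flats)
mean_peak = sum(peaks) / len(peaks)
```

Tuning the concentration parameter gives the kind of low- versus high-entropy test environments against which the reliability of the inference systems can be charted.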
The problem with reductionism in biology is not the reduction, but the implicit attitude of determinism that usually accompanies it. Methodological reductionism is supported by deterministic beliefs, but making such a connection is problematic when it is based on an idea of determinism as fixed predictability. Conflating determinism with predictability gives rise to inaccurate models that overlook the dynamic complexity of our world, as well as ignore our epistemic limitations when we try to model it. Furthermore, the assumption of a strictly deterministic framework is unnecessarily hindering to biology. By removing the dogma of determinism, biological methods, including reductive methods, can be expanded to include stochastic models and probabilistic interpretations. Thus, the dogma of reductionism can be saved once its ties with determinism are severed. In this paper, I analyze two problems that have faced molecular biology for the last 50 years—protein folding and cancer. Both cases demonstrate the long influence of reductionism and determinism on molecular biology, as well as how abandoning determinism has opened the door to more probabilistic and unconstrained reductive methods in biology.
This work describes DEGARI (Dynamic Emotion Generator And ReclassIfier), an explainable system for emotion attribution and recommendation relying on a recently introduced probabilistic commonsense reasoning framework.
The epistemic modal auxiliaries 'must' and 'might' are vehicles for expressing the force with which a proposition follows from some body of evidence or information. Standard approaches model these operators using quantificational modal logic, but probabilistic approaches are becoming increasingly influential. According to a traditional view, 'must' is a maximally strong epistemic operator and 'might' is a bare possibility one. A competing account---popular amongst proponents of a probabilistic turn---says that, given a body of evidence, 'must p' entails that Pr(p) is high but non-maximal and 'might p' that Pr(p) is significantly greater than 0. Drawing on several observations concerning the behavior of 'must', 'might' and similar epistemic operators in evidential contexts, deductive inferences, downplaying and retractions scenarios, and expressions of epistemic tension, I argue that those two influential accounts have systematic descriptive shortcomings. To better make sense of their complex behavior, I propose instead a broadly Kratzerian account according to which 'must p' entails that Pr(p) = 1 and 'might p' that Pr(p) > 0, given a body of evidence and a set of normality assumptions about the world. From this perspective, 'must' and 'might' are vehicles for expressing a common mode of reasoning whereby we draw inferences from specific bits of evidence against a rich set of background assumptions---some of which we represent as defeasible---which capture our general expectations about the world. I will show that the predictions of this Kratzerian account can be substantially refined once it is combined with a specific yet independently motivated 'grammatical' approach to the computation of scalar implicatures. Finally, I discuss some implications of these results for more general discussions concerning the empirical and theoretical motivation to adopt a probabilistic semantic framework.
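The proposed semantics can be rendered as a small probability-over-worlds model: 'must p' requires Pr(p) = 1 and 'might p' requires Pr(p) > 0, but only after restricting attention to worlds compatible with the (defeasible) normality assumptions. The sketch below is an illustrative toy, and the example scenario and all names are assumptions, not the paper's own formalism.

```python
def restrict(worlds, normality_assumptions):
    """Discard worlds that violate any (defeasible) normality assumption."""
    return [(w, pr) for (w, pr) in worlds
            if all(ok(w) for ok in normality_assumptions)]

def probability(worlds, proposition):
    """Probability of a proposition, renormalised over the remaining worlds."""
    total = sum(pr for _, pr in worlds)
    return sum(pr for w, pr in worlds if proposition(w)) / total

def must(worlds, assumptions, p):
    """'must p': Pr(p) = 1 given evidence plus normality assumptions."""
    return probability(restrict(worlds, assumptions), p) == 1.0

def might(worlds, assumptions, p):
    """'might p': Pr(p) > 0 given evidence plus normality assumptions."""
    return probability(restrict(worlds, assumptions), p) > 0.0

# The evidence leaves open a far-fetched world where the mail was stolen.
worlds = [({"mail_arrived": True, "stolen": False}, 0.95),
          ({"mail_arrived": False, "stolen": True}, 0.05)]
normal = [lambda w: not w["stolen"]]
arrived = lambda w: w["mail_arrived"]
```

On this rendering, 'The mail must have arrived' comes out true relative to the normality assumptions even though the unrestricted probability of arrival is only 0.95, capturing how 'must' claims remain maximally strong yet defeasible.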
The idea that mind and body are distinct entities that interact is often claimed to be incompatible with physics. The aim of this paper is to disprove this claim. To this end, we construct a broad mathematical framework that describes theories with mind-body interaction (MBI) as an extension of current physical theories. We employ histories theory, i.e., a formulation of physical theories in which a physical system is described in terms of (i) a set of propositions about possible evolutions of the system and (ii) a probability assignment to such propositions. The notion of dynamics is incorporated into the probability rule. As this formulation emphasises logical and probabilistic concepts, it is ontologically neutral. It can be used to describe mental `degrees of freedom' in addition to physical ones. This results in a mathematical framework for psycho-physical interaction (ΨΦI formalism). Interestingly, a class of ΨΦI theories turns out to be compatible with energy conservation.
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
Suppose several individuals (e.g., experts on a panel) each assign probabilities to some events. How can these individual probability assignments be aggregated into a single collective probability assignment? This article reviews several proposed solutions to this problem. We focus on three salient proposals: linear pooling (the weighted or unweighted linear averaging of probabilities), geometric pooling (the weighted or unweighted geometric averaging of probabilities), and multiplicative pooling (where probabilities are multiplied rather than averaged). We present axiomatic characterisations of each class of pooling functions (most of them classic, but one new) and argue that linear pooling can be justified procedurally, but not epistemically, while the other two pooling methods can be justified epistemically. The choice between them, in turn, depends on whether the individuals' probability assignments are based on shared information or on private information. We conclude by mentioning a number of other pooling methods.
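For a single binary event, the three pooling families can be sketched as follows. The renormalisation over {p, 1-p} is a simplification for the binary case; the general definitions apply eventwise to full probability assignments, so this is an illustrative reduction rather than the article's official formalism.

```python
import math

def linear_pool(probs, weights):
    """Weighted arithmetic average of the individual probabilities."""
    return sum(w * p for w, p in zip(weights, probs))

def geometric_pool(probs, weights):
    """Weighted geometric average, renormalised over {p, not-p}."""
    yes = math.prod(p ** w for p, w in zip(probs, weights))
    no = math.prod((1 - p) ** w for p, w in zip(probs, weights))
    return yes / (yes + no)

def multiplicative_pool(probs):
    """Probabilities multiplied and renormalised: independent private
    evidence pointing the same way reinforces the pooled verdict."""
    yes = math.prod(probs)
    no = math.prod(1 - p for p in probs)
    return yes / (yes + no)
```

The behavioural contrast is visible already in toy cases: linear and geometric pooling of two agents who both assign 0.8 return 0.8, whereas multiplicative pooling returns about 0.94, reflecting the thought that two independent private bodies of evidence jointly support the event more strongly than either alone.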
Justification logics are constructive analogues of modal logics. They are often used as epistemic logics, particularly as models of evidentialist justification. However, in this role, justification logics are defective insofar as they represent justification with a necessity-like operator, whereas actual evidentialist justification is usually probabilistic. This paper first examines and rejects extant candidates for solving this problem: Milnikel’s Logic of Uncertain Justifications, Ghari’s Hájek–Pavelka-Style Justification Logics and a version of probabilistic justification logic developed by Kokkinis et al. It then proposes a new solution to the problem in the form of a justification logic that incorporates the essential features of both a fuzzy logic and a probabilistic logic.
In the quest for a physical theory of everything, from macroscopic large-body matter to microscopic elementary particles, the strange and weird concepts springing from the discovery of quantum physics, together with irreconcilable positions and inconvenient facts, have complicated physics all the way from Newtonian physics to quantum science; the question is: how do we close the gap? Indeed, there are scientific and mathematical fireworks when quantum uncertainties and entanglements cannot be explained with classical physics. The Copenhagen interpretation is an expression of a few wise men on quantum physics, largely formulated from 1925 to 1927, namely by Niels Bohr and Werner Heisenberg. From this point on, there is a divergence of quantum science into the realms of indeterminacy, complementarity and entanglement, principles also expounded in Yijing, an ancient Chinese body of knowledge constructed on symbols, with a vintage of at least three millennia, whose broken and unbroken lines form a stacked six-line structure called the hexagram. It is premised on the probability development of the hexagram in a space-time continuum. The discovery of the quantization of action meant that quantum physics could not convincingly be explained by the principles of classical physics. This paper will trace the great departure from classical physics into the realm of probabilistic realities. The probabilistic nature and reality interpretation had a significant influence on Bohr's line of thought. Apparently, Bohr realized that speaking of disturbance seemed to indicate that atomic objects were classical particles with definite inherent kinematic and dynamic properties (Hanson, 1959). Disturbances, energy excitation and entanglements are processual evolutionary phases in Yijing.
This paper will explore the similarities between quantum physics and the methodological ways in which Yijing is used to interpret observable realities involving interactions which are uncontrollable and probabilistic, and which form an inseparable unity due to entanglement and superposition. Transgressing disciplinary boundaries by discussing Yijing, which originated in the Western Zhou period (1000-750 BC) and, over the period of the Warring States and the early imperial period (500-200 BC), was compiled, transcribed and transformed into a cosmological text with philosophical commentaries known as the "Ten Wings", closely associated with Confucius (551-479 BC), alongside the Copenhagen Interpretation (1925-1927) of a few wise men including Niels Bohr and Werner Heisenberg, would seem like a subversive undertaking. It is subversive because the interpretations from Yijing are based on wisdom accumulated over thousands of years in ancient China, while the quantum concepts were discovered only recently, and because the undertaking seems to violate the sanctuaries of accepted ways of looking at Yijing principles, classical physics and quantum science, given the fortified boundaries that have been erected between Yijing and the sciences. Subversive as this paper may be, it is an attempt to re-cast an ancient framework in which indeterminism, complementarity, non-linearity, entanglement, superposition and the probability interpretation are seen in today's quantum realities.
In this paper I expound an argument which seems to establish that probabilism and special relativity are incompatible. I examine the argument critically, and consider its implications for interpretative problems of quantum theory, and for theoretical physics as a whole.
I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or “Dutch book”) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.
Nick Shea’s Representation in Cognitive Science commits him to representations in perceptual processing that are about probabilities. This commentary concerns how to adjudicate between this view and an alternative that locates the probabilities rather in the representational states’ associated “attitudes”. As background and motivation, evidence for probabilistic representations in perceptual processing is adduced, and it is shown how, on either conception, one can address a specific challenge Ned Block has raised to this evidence.
This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and may indeed serve to correct, the informal understanding and applications of the RSA criteria concerning their conceptual dependence, their function as update-thresholds, and their status as obligatory rather than permissive norms, but also show how these formal and informal normative approaches can in fact align.
Formal systems are standardly envisaged in terms of a grammar specifying well-formed formulae together with a set of axioms and rules. Derivations are ordered lists of formulae each of which is either an axiom or is generated from earlier items on the list by means of the rules of the system; the theorems of a formal system are simply those formulae for which there are derivations. Here we outline a set of alternative and explicitly visual ways of envisaging and analyzing at least simple formal systems using fractal patterns of infinite depth. Progressively deeper dimensions of such a fractal can be used to map increasingly complex wffs or increasingly complex 'value spaces', with tautologies, contradictions, and various forms of contingency coded in terms of color. This and related approaches, it turns out, offer not only visually immediate and geometrically intriguing representations of formal systems as a whole but also promising formal links (1) between standard systems and classical patterns in fractal geometry, (2) between quite different kinds of value spaces in classical and infinite-valued logics, and (3) between cellular automata and logic. It is hoped that pattern analysis of this kind may open possibilities for a geometrical approach to further questions within logic and metalogic. (shrink)
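One simple way to realise the colour-coding idea can be sketched as follows: classify each wff by its truth table, and read the table's bit string as an address in a binary-subdivided unit interval, so that formulas over more variables occupy deeper subdivisions. This mapping is an illustrative assumption for exposition, not the authors' own fractal construction.

```python
from itertools import product

def truth_table(formula, variables):
    """Evaluate `formula` (a function of a valuation dict) on all valuations."""
    return [formula(dict(zip(variables, vals)))
            for vals in product([False, True], repeat=len(variables))]

def classify(formula, variables):
    """The colour code: tautology, contradiction, or contingent."""
    table = truth_table(formula, variables)
    if all(table):
        return "tautology"
    if not any(table):
        return "contradiction"
    return "contingent"

def address(formula, variables):
    """Map the truth table's bit string to a point in [0, 1):
    each extra variable doubles the depth of binary subdivision."""
    bits = truth_table(formula, variables)
    return sum(int(b) * 2.0 ** -(i + 1) for i, b in enumerate(bits))

excluded_middle = lambda w: w["p"] or not w["p"]
conjunction = lambda w: w["p"] and w["q"]
```

Colouring each address by its classification yields exactly the kind of value-space picture described above: tautologies and contradictions sit at the extremes of each subdivision level, with the forms of contingency arrayed between them.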
According to a standard assumption in epistemology, if one only partially believes that p, then one cannot thereby have knowledge that p. For example, if one only partially believes that it is raining outside, one cannot know that it is raining outside; and if one only partially believes that it is likely that it will rain outside, one cannot know that it is likely that it will rain outside. Many epistemologists will agree that epistemic agents are capable of partial beliefs in addition to full beliefs and that partial beliefs can be epistemically assessed along some dimensions. However, it has been generally assumed that such doxastic attitudes cannot possibly amount to knowledge. In Probabilistic Knowledge, Moss challenges this standard assumption and provides a formidable defense of the claim that probabilistic beliefs—a class of doxastic attitudes including credences and degrees of beliefs—can amount to knowledge too. Call this the probabilistic knowledge claim. Throughout the book, Moss goes to great lengths to show that probabilistic knowledge can be fruitfully applied to a variety of debates in epistemology and beyond. My goal in this essay is to explore a further application for probabilistic knowledge. I want to look at the role of probabilistic knowledge within a “knowledge-centered” psychology—a kind of psychology that assigns knowledge a central stage in explanations of intentional behavior. My suggestion is that Moss’s notion of probabilistic knowledge considerably helps further both a knowledge-centered psychology and a broadly intellectualist picture of action and know-how that naturally goes along with it. At the same time, though, it raises some interesting issues about the notion of explanation afforded by the resulting psychology.
We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence or testimony providing causal information. Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by “judgment and decision making” researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case.
We often have some reason to do actions insofar as they promote outcomes or states of affairs, such as the satisfaction of a desire. But what is it to promote an outcome? I defend a new version of 'probabilism about promotion'. According to Minimal Probabilistic Promotion, we promote some outcome when we make that outcome more likely than it would have been if we had done something else. This makes promotion easy and reasons cheap.
Are special relativity and probabilism compatible? Dieks argues that they are. But the possible universe he specifies, designed to exemplify both probabilism and special relativity, either incorporates a universal "now" (and is thus incompatible with special relativity), or amounts to a many world universe (which I have discussed, and rejected as too ad hoc to be taken seriously), or fails to have any one definite overall Minkowskian-type space-time structure (and thus differs drastically from special relativity as ordinarily understood). Probabilism and special relativity appear to be incompatible after all. What is at issue is not whether "the flow of time" can be reconciled with special relativity, but rather whether explicitly probabilistic versions of quantum theory should be rejected because of incompatibility with special relativity.
The aim of the paper is to develop general criteria of argumentative validity and adequacy for probabilistic arguments on the basis of the epistemological approach to argumentation. In this approach, as in most other approaches to argumentation, probabilistic arguments have been neglected somewhat. Nonetheless, criteria for several special types of probabilistic arguments have been developed, in particular by Richard Feldman and Christoph Lumer. In the first part (sects. 2-5) the epistemological basis of probabilistic arguments is discussed. With regard to the philosophical interpretation of probabilities a new subjectivist, epistemic interpretation is proposed, which identifies probabilities with tendencies of evidence (sect. 2). After drawing the conclusions of this interpretation with respect to the syntactic features of the probability concept, e.g. one variable referring to the data base (sect. 3), the justification of basic probabilities (priors) by judgements of relative frequency (sect. 4) and the justification of derivative probabilities by means of the probability calculus are explained (sect. 5). The core of the paper is the definition of '(argumentatively) valid derivative probabilistic arguments', which provides exact conditions for epistemically good probabilistic arguments, together with conditions for the adequate use of such arguments for the aim of rationally convincing an addressee (sect. 6). Finally, some measures for improving the applicability of probabilistic reasoning are proposed (sect. 7).
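The step from basic to derivative probabilities via the probability calculus can be illustrated with the law of total probability and Bayes' theorem. This is a generic sketch of the calculus, not the paper's own formalism; the numerical prior below is an illustrative assumption standing in for a frequency-justified basic probability.

```python
def total_probability(prior_h, likelihood, likelihood_alt):
    """P(e) = P(e|h) * P(h) + P(e|not-h) * P(not-h)."""
    return prior_h * likelihood + (1 - prior_h) * likelihood_alt

def posterior(prior_h, likelihood, likelihood_alt):
    """Bayes' theorem: a derivative probability P(h|e) computed from
    a basic prior and two likelihoods via the probability calculus."""
    return prior_h * likelihood / total_probability(prior_h, likelihood,
                                                    likelihood_alt)

# A 1% prior, justified by relative frequency, updated on evidence that is
# nine times likelier under the hypothesis than under its negation.
p = posterior(0.01, 0.9, 0.1)
```

Even strongly discriminating evidence lifts the 1% prior only to about 8.3%, which is precisely the sort of derivative probability an argumentatively valid probabilistic argument would have to track.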
The traditional possible-worlds model of belief describes agents as ‘logically omniscient’ in the sense that they believe all logical consequences of what they believe, including all logical truths. This is widely considered a problem if we want to reason about the epistemic lives of non-ideal agents who—much like ordinary human beings—are logically competent, but not logically omniscient. A popular strategy for avoiding logical omniscience centers around the use of impossible worlds: worlds that, in one way or another, violate the laws of logic. In this paper, we argue that existing impossible-worlds models of belief fail to describe agents who are both logically non-omniscient and logically competent. To model such agents, we argue, we need to ‘dynamize’ the impossible-worlds framework in a way that allows us to capture not only what agents believe, but also what they are able to infer from what they believe. In light of this diagnosis, we go on to develop the formal details of a dynamic impossible-worlds framework, and show that it successfully models agents who are both logically non-omniscient and logically competent.
This paper examines a promising probabilistic theory of singular causation developed by David Lewis. I argue that Lewis' theory must be made more sophisticated to deal with certain counterexamples involving pre-emption. These counterexamples appear to show that in the usual case singular causation requires an unbroken causal process to link cause with effect. I propose a new probabilistic account of singular causation, within the framework developed by Lewis, which captures this intuition.