There are different kinds of uncertainty. I outline some of the ways that uncertainty enters science, focusing on uncertainty in climate science and weather prediction. I then show how sophisticated modelling techniques help us cope with some of these sources of error, and how we maintain confidence in the face of error.
Current environmental problems and technological risks pose a challenge to the institutional arrangement of the value spheres of Science, Politics and Morality. Distinguished authors from different European countries and America provide a cross-disciplinary perspective on the problems of political decision making under conditions of scientific uncertainty. Cases from biotechnology and the environmental sciences are discussed. The papers collected for this volume address the following themes: (i) controversies about risks and political decision making; (ii) concepts of science for policy; (iii) the use of social science in the policy making process; (iv) ethical problems with developments in science and technology; (v) public and state interests in the development and control of technology.
Uncertainty is recognized as a key issue in water resources research, amongst other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g. uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective on the role of uncertainty when interpreting a claim – what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. 1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. 2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. 3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. 4) Provocative recommendations promote adjustments towards a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%) – or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.
While the foundations of climate science and ethics are well established, fine-grained climate predictions, as well as policy decisions, are beset with uncertainties. This chapter maps climate uncertainties and classifies them as to their ground, extent and location. A typology of uncertainty is presented, centered along the axes of scientific and moral uncertainty. This typology is illustrated with paradigmatic examples of uncertainty in climate science, climate ethics and climate economics. Subsequently, the chapter discusses the IPCC's preferred way of representing uncertainties and evaluates its strengths and weaknesses from a risk management perspective. Three general strategies for decision-makers to cope with climate uncertainty are outlined, the usefulness of which largely depends on whether or not decision-makers find themselves in a context of deep uncertainty. The chapter concludes by offering two recommendations to ease the work of policymakers faced with the various uncertainties ingrained in climate discourse.
In the last few decades the role played by models and modeling activities has become a central topic in the scientific enterprise. In particular, it has been highlighted both that the development of models constitutes a crucial step for understanding the world and that the developed models operate as mediators between theories and the world. This perspective is exploited here to address the issue of whether error-based and uncertainty-based modeling of measurement are incompatible, and thus alternatives to one another, as is sometimes claimed nowadays. The crucial problem is whether assuming this standpoint implies definitively renouncing any role for truth and related concepts, particularly accuracy, in measurement. It is argued here that the well-known objections against true values in measurement, which would lead to rejecting the concept of accuracy as non-operational, or to maintaining it as only qualitative, derive from an unclear distinction between three distinct processes: the metrological characterization of measuring systems, their calibration, and finally measurement. Under the hypotheses that (1) the concept of true value is related to the model of a measurement process, (2) the concept of uncertainty is related to the connection between such a model and the world, and (3) accuracy is a property of measuring systems (and not of measurement results) and uncertainty is a property of measurement results (and not of measuring systems), not only the compatibility but in fact the conjoint need of error-based and uncertainty-based modeling emerges.
Ever since the hard problem of consciousness (Chalmers, 1996, 1995) first entered the scene in the debate over consciousness, many have taken it to show the limitations of a scientific or naturalist explanation of consciousness. The hard problem is the problem of explaining why there is any experience associated with certain physical processes – that is, why there is anything it is like to undergo such physical processes. The character of one's experience doesn't seem to be entailed by physical processes, and so an explanation which can overcome such a worry must (1) explain how physical processes give rise to experience (explain the entailment), (2) give an explanation which doesn't rely on such physical processes, or (3) show why the hard problem is misguided in some sense. Recently, a rather ambitious and novel theory of consciousness has entered the scene – Integrated Information Theory (IIT) of Consciousness (Oizumi et al., 2014; Tononi, 2008; Tononi et al., 2016) – which proposes that consciousness is the result of a specific type of information processing, what those developing the theory call integrated information. The central aim of this dissertation is to philosophically investigate IIT and see whether it can overcome the hard problem and related worries. I then aim to use this philosophical investigation to answer a set of related questions which guide this dissertation: Is it possible to give an information-theoretic explanation of consciousness? What would the nature of such an explanation be, and would it result in a novel metaphysics of consciousness? In this dissertation, I begin in chapter one by setting up the hard problem and related arguments against the backdrop of IIT (Mindt, 2017). I show that given a certain understanding of structural and dynamical properties, IIT fails to overcome the hard problem of consciousness.
I go on in chapter two to argue that a deflationary account of causation is the best view for IIT to overcome the causal exclusion problem (Baxendale and Mindt, 2018). In chapter three, I explain IIT's account of how the qualitative character of our experience arises (qualia) and what view of intentionality (the directedness of our mental states) IIT advocates. I then move on in chapter four to show why the hard problem mischaracterizes structural and dynamical properties and misses important nuances that may shed light on giving a naturalized explanation of consciousness. In the last and fifth chapter, I outline a sketch of a novel metaphysics of consciousness that takes the conjunction of Neutral Monism and Information-Theoretic Structural Realism to give what I call Information-Theoretic Neutral-Structuralism.
Is the shape of the Earth really a globe? Read closely, the author of this voluminous paperback (first published in hardcover in 2015), the historian David Wootton, does not take for granted the fact that the Earth is round or spherical. However, this does not mean that he is a relativist. And it is interesting to consider why he regards science as progress, against any relativist view of the history of science.

On the whole, the book is an extraordinary contribution to the studies of the history of early modern science and philosophy. Through its seventeen chapters, with rich notes and illustrations, Wootton elaborates on epistemological problems in observing the heavens and the earth, or how openly and clearly people at the time constructed scientific knowledge. For instance, Wootton convincingly re-interprets Butterfield's and Kuhn's conceptions of 'scientific revolution' in early modern Europe, and re-examines the idea of 'eureka', 'discovery', or 'invention' (p. 67) as a necessary precondition for science. For he positively defends the view that the scientific revolution 'has been so astonishingly successful' (p. 571), that the unique fact of science is 'progress' in its history (p. 513). In this progressive sense, he is philosophically committed neither to Kuhn's (and Rorty's) subdued relativism, on which science evolves within a set of its own terms, nor to the strong relativism on which progress in science is illusory and thus falsifiable. Also, in some longer notes, he explicates why the relativism of a number of historians is undermined (pp. 580–92).
In many situations, people are unsure in their moral judgements. In much recent philosophical literature, this kind of moral doubt has been analysed in terms of uncertainty in one's moral beliefs. Non-cognitivists, however, argue that moral judgements express a kind of conative attitude, more akin to a desire than a belief. This paper presents a scientifically informed reconciliation of non-cognitivism and moral doubt. The central claim is that attitudinal ambivalence—the degree to which one holds conflicting attitudes towards the same object—can play the role of moral doubt for non-cognitivists. I will demonstrate that ambivalence has all of the features that we would expect it to have in order to play the role of moral doubt. It is gradable, can vary through time, covaries with strength of motivation, and is suitably distinct from the other features of our moral judgements. As well as providing a defence of non-cognitivism, this insight poses a new challenge for the view: deciding how to act under moral ambivalence.
The research has three objectives: 1) to study the concept of Heisenberg's uncertainty principle, 2) to study the concepts of reality and knowledge in Buddhist philosophy, and 3) to analyze Heisenberg's uncertainty principle from a Buddhist philosophical perspective. This is documentary research. The research found that Heisenberg's uncertainty principle arose from thought experiments about physical reality at scales smaller than the atom, where at present no theory of physics can clearly explain the observed properties. The principle is used to predict pairs of physical attributes, for instance position and momentum. The accuracy of position, however, cannot be precisely yielded in advance by such a principle; in other words, it is impossible to measure the position and momentum of a quantum particle precisely at the same time. In the study of the ultimate reality of matter in Buddhist philosophy, it was found that corporeality in nature is conditioned by cause and effect and thereby falls under the three common characteristics: 1) impermanence, 2) the state of suffering, and 3) the state of non-substantiality. In studying Heisenberg's uncertainty principle from the Buddhist philosophical perspective, the research found that the processes of acquiring knowledge in Heisenberg and in Buddhist philosophy parallel one another, in the sense that such knowledge is methodologically acquired through experience, rationality, and intuition. Both see Reality in the same manner: physical reality is viewed by Heisenberg as something that undergoes a changing state of wave and particle at all times, and matter is seen by Buddhist philosophy as impermanent, changing depending upon the factors involved, and thereby not-self.
However, they differ in one respect: on the one hand, Buddhist philosophy utilizes knowledge of matter to develop morality and ethics, by which the cessation of suffering can be achieved; on the other hand, Heisenberg applies knowledge of physical objects to quantum technology, to provide physical comfort in daily life. As for applying the knowledge gained from this research, Buddhist philosophy can make use of Heisenberg's uncertainty principle to help explain the three common characteristics through the scientific method, while science can utilize knowledge gained from Buddhist philosophy to expand the framework of scientific knowledge, which is aimed at studying only physical objects, so as to be able to study mental objects as well, through the integration of the three epistemological methods.
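For reference, the position–momentum trade-off invoked in the abstract above has a standard quantitative form, the Kennard bound; the formula is an editorial addition, not part of the original abstract:

```latex
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
```

Here $\sigma_x$ and $\sigma_p$ are the standard deviations of position and momentum measurements and $\hbar$ is the reduced Planck constant. The bound is the precise sense in which position and momentum cannot both be predicted with arbitrary accuracy at the same time.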
Against the background of continuing inadequacy in global efforts to address climate change, and apparent social and political inertia, ever greater interest is being generated in the idea that geoengineering may offer some solution to this problem. I do not take a position, here, on whether or not geoengineering could ever be morally justifiable. My goal in this paper is more modest – but also has broader implications. I aim to show that even if some form of geoengineering might be ethically acceptable in certain specific circumstances, lab-based research into such techniques could nevertheless have morally problematic consequences. I support this claim by explaining that our current state of uncertainty regarding how the impacts of geoengineering interventions could be geographically distributed may help to promote international agreement on fair rules for the governance of geoengineering. In these circumstances of scientific uncertainty, international actors also face uncertainty regarding who the winners and losers could be with respect to potential rules of geoengineering governance, thereby obstructing the pursuit of self-interest in the selection of such rules. Instead of a research-first approach, then, we have reason to take a governance-first approach – ensuring that fair international institutions to regulate geoengineering activities are established before further research is conducted into how the costs and benefits of such interventions could be distributed.
A collective understanding that traces a debate between 'what is science?' and 'what is science about?' leads back to the notion of scientific knowledge. The debate concerns a pursuit of science that can hardly escape the dogma of pseudo-science. Scientific conjectures invoke science as an intellectual activity informed by experience and by repeated observation of objects that look independent of any idealist view (belief in the consensus of mind-dependent reality). The realist machinery is employed in an empiricist exposition of objective phenomena by synchronizing a general method to make observational predictions that cover all the phenomena of a particular entity without exception. The formation of science encloses several epistemological purviews and a succession of conjectures and refutations which a newer theorem can reinstate. My attempt is to advocate a holistic plea for scientific conjectures that outruns the restricted regulation of experience or testable hypotheses, so as to render valid a chain of logical reasoning (deductive or inductive) over basic scientific statements. The milieu of scientific intensification integrates speculation that lends efficiency towards a new experimental dimension where reality is neither itself objective nor observer-relative; in fact the observed phenomenon unfolds in the constructive progression of the preferred methods of falsifiability and uncertainty.
As soon as you believe an imagination to be nonfictional, this imagination becomes your ontological theory of the reality. Your ontological theory (of the reality) can describe a system as the reality. However, actually this system is only a theory/conceptual-space/imagination/visual-imagery of yours, not the actual reality (i.e., the thing-in-itself). An ontological theory (of the reality) actually only describes your (subjective/mental) imagination/visual-imagery/conceptual-space. An ontological theory of the reality, is being described as a situation model (SM). There is no way to prove/disprove that there is only one reality, or there are two realities (i.e., “subjective reality” and “objective reality”). So, every ontology talk/theory/imagination about the two realities is only a talk/theory/imagination – we will never know whether it is true or not. The conventionally-called “physical/objective reality/world” around my conventionally-called “physical/objective body” is actually a geometric mathematical model (being generated/mathematically-modeled by my brain) – it's actually a subset/component/part/element of my brain’s mind/consciousness/manifest-image. Our cosmos is an autonomous objective parallel computing automaton (aka state machine) which evolves by itself automatically/unintentionally – wave-particle duality and Heisenberg’s uncertainty principle can be explained under this SM of my brain. Each elementary particle (as a building block of our cosmos) is an autonomous mathematical entity itself (i.e., a thing in itself). Our cosmos has the same nature as a Game of Life system – both are autonomous objective parallel-computing automata. Cosmos (as a state machine) is indistinguishable from a digital simulation – my consciousness (as something nonphysical) is not cosmos (as a state machine).
If we are happy to accept randomness/stochasticity, then it is obviously possible that all other worlds in the many-worlds interpretation (MWI) actually do not exist (objectively). As one metaphysical option, we can treat all other worlds as subjective only (even if they are actually objective). Under the context of this metaphysical option, we are only living in one world (i.e., the world we are currently living in; this world) – we are not living in many worlds at the same time in parallel. Under the context of this metaphysical option, there is only one possible future. The relationship among any number of elementary particles is governed/described by the Schrödinger equation. If (in theory) the Schrödinger equation can’t be used to reliably forecast whether I will go to McDonald’s for dinner in this world (based on the current state of all elementary particles of the cosmos), then what can the Schrödinger equation do? The conventionally-called “space” does not exist objectively. “Time” and “matter” are not physical. Consciousness is the subjective-form (aka quale) of the mathematical models (of the objective cosmos) which are intracorporeally/subjectively used by the control logic of a Turing machine’s program fatedly. A Turing machine’s mind/consciousness/manifest-image or deliberate decisions/choices should not be able to actually/objectively change/control/drive the (autonomous or fated) worldline of any elementary particle within this world (i.e., the world we are currently living in, under the context of MWI). Besides the Schrödinger equation (or another mathematical equation/function which is yet to be discovered) which is a valid/correct/factual causality of our cosmos/state-machine, every other causality (of our cosmos/state-machine) is either invalid/incorrect/counterfactual or can be proved by deductive inference based on the Schrödinger equation (or the aforementioned yet-to-be-discovered mathematical equation/function) only. Closed causality entails no causality.
Consciousness plays no causal role (“epiphenomenalism”), or in other words, any cognitive/behavioural activity can in principle be carried out without consciousness (“conscious inessentialism”). If the “loop quantum gravity” theory is correct, then time/space does not actually/objectively exist in the objective-evolution of the objective cosmos, or in other words, we should not use the subjective/mental concept of “time”, “state” or “space” to describe/imagine the objective-evolution of our cosmos.
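For reference, the Schrödinger equation invoked above as the cosmos's "valid causality" is, in its time-dependent form (an editorial reference, not part of the original text):

```latex
i\hbar \, \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) \;=\; \hat{H} \, \Psi(\mathbf{r}, t)
```

Here $\Psi$ is the wave function of the system and $\hat{H}$ its Hamiltonian operator. The equation deterministically governs the evolution of the quantum state, which is the sense in which the author treats it as the only factual causal law of the cosmos-as-state-machine.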
In the following, I will discuss the current social reaction to the ecological crisis and the ways in which society reacts to technological risks, which can be understood primarily as a reaction to scientific and moral or ethical uncertainty. In the first section, I will clarify what is meant by scientific and moral or ethical uncertainty. In the second section, I will contrast Max Weber's differentiation of science, law (Recht) and morality in the modern world with the process of de-differentiation of these value spheres, a trend which can be observed in the present-day context of the ecological crisis and technological risks. We shall see that social contradictions emerge in the functional relationships between these value spheres, and that such contradictions go hand in hand with these value spheres or contexts of discourse either losing their original function or becoming transformed. Science forfeits its role as a functional authority and becomes a strategic resource for politics. Law becomes a basic constituent of an amoral form of negotiation, which can no longer be properly grasped in terms of legal categories. Morality is transformed into fear, and economics yields unprofitable practices. In the third section, I will attempt to open up the moral and ethical dimension of how to deal with uncertainty with the help of discourse theory (Apel, 1988; Habermas, 1996), as well as outline a possible solution.
While geoengineering may counteract negative effects of anthropogenic climate change, it is clear that most geoengineering options could also have some harmful effects. Moreover, it is predicted that the benefits and harms of geoengineering will be distributed unevenly across different parts of the world and to future generations, which raises serious questions of justice. It has been suggested that a compensation scheme to redress geoengineering harms is needed for geoengineering to be ethically and politically acceptable. Discussions of compensation for geoengineering harms, however, sometimes presume that geoengineering presents new and unique challenges to compensation that cannot be readily accommodated by existing compensation practices. The most explicit formulation of this view was recently presented by Toby Svoboda and Peter J. Irvine, who argued that two forms of uncertainty in geoengineering – namely, ethical uncertainty and scientific uncertainty – make it immensely difficult to devise an ethically and politically satisfactory compensation scheme for geoengineering harms.

In this paper, we argue against the view that geoengineering presents new and unique challenges relating to compensation. More specifically, we show that placing these challenges within the broader context of anthropogenic climate change reveals them to be less serious and less specific to geoengineering than some appear to believe.
The UK has been 'following the science' in response to the COVID-19 pandemic, in line with the national framework for the use of scientific advice in the assessment of risk. We argue that the way in which it does so is unsatisfactory in two important respects. First, pandemic policy making is not based on a comprehensive assessment of policy impacts. Second, the focus on reasonable worst-case scenarios as a way of managing uncertainty results in a loss of decision-relevant information and does not provide a coherent basis for policy making.
The precautionary principle in public decision making concerns situations where, following an assessment of the available scientific information, there are reasonable grounds for concern about the possibility of adverse effects on the environment or human health, but scientific uncertainty persists. In such cases, provisional risk management measures may be adopted without having to wait until the reality and seriousness of those adverse effects become fully apparent. This is the definition of the precautionary principle as operationalized under EU law. The precautionary principle is a deliberative principle. Its application involves deliberation on a range of normative dimensions which need to be taken into account while making the principle operational in the public policy context. Under EU law, any risk management measures to be adopted while implementing the precautionary principle have to be proportionate to ensure the chosen high level of protection in the European Community. This article will illustrate the established practice concerning the release of genetically modified organisms into the environment and how the principle is implemented under hard law. The article also provides an outlook on what this may imply for the relatively new case of nanotechnology and the use of the precautionary principle within the context of soft law (the use of codes of conduct).
The precautionary principle (PP) aims to anticipate and minimize potentially serious or irreversible risks under conditions of scientific uncertainty. It thus preserves the potential for future developments. It has been incorporated into many international treaties and pieces of national legislation for environmental protection and sustainable development. In this article, we outline an interpretation of the PP as a framework of orientation for a sustainable information society. Since the risks induced by future information and communication technologies (ICT) are for the most part social risks, we propose to extend the PP from mainly environmental to social subjects of protection. From an ethical point of view, the PP and sustainability share the principle of intergenerational justice, which can be used as an argument to preserve free space for the decisions of future generations. Applied to technical innovation, and to ICT issues in particular, the extended PP can serve as a framework of orientation to avoid socio-economically irreversible developments. We conclude that the PP is a useful approach for: (i) policy makers to reconcile information society and sustainability policies and (ii) ICT companies to formulate sustainability strategies.
The geosciences include a wide spectrum of disciplines ranging from paleontology to climate science, and involve studies of a vast range of spatial and temporal scales, from the deep-time history of microbial life to the future of a system no less immense and complex than the entire Earth. Modeling is thus a central and indispensable tool across the geosciences. Here, we review both the history and the current state of model-based inquiry in the geosciences. Research in these fields makes use of a wide variety of models, such as conceptual, physical, and numerical models, and more specifically cellular automata, artificial neural networks, agent-based models, coupled models, and hierarchical models. We note the increasing demands to incorporate biological and social systems into geoscience modeling, challenging the traditional boundaries of these fields. Understanding and articulating the many different sources of scientific uncertainty – and finding tools and methods to address them – has been at the forefront of most research in geoscience modeling. We discuss not only structural model uncertainties, parameter uncertainties, and solution uncertainties, but also the diverse sources of uncertainty arising from the complex nature of geoscience systems themselves. Without an examination of the geosciences, our philosophies of science and our understanding of the nature of model-based science are incomplete.
As a response to climate change, geoengineering with solar radiation management has the potential to result in unjust harm. Potentially, this injustice could be ameliorated by providing compensation to victims of SRM. However, establishing a just SRM compensation system faces severe challenges. First, there is scientific uncertainty in detecting particular harmful impacts and causally attributing them to SRM. Second, there is ethical uncertainty regarding what principles should be used to determine responsibility and eligibility for compensation, as well as determining how much compensation ought to be paid. Significant challenges loom for crafting a just SRM compensation system.
I provide a critical commentary regarding the attitude of the logician and the philosopher towards the physicist and physics. The commentary is intended to showcase how a general change in attitude towards making scientific inquiries can be beneficial for science as a whole. However, such a change can come at the cost of looking beyond the categories of the disciplines of logic, philosophy and physics. It is through self-inquiry that such a change is possible, along with the realization of the essence of the middle that is otherwise excluded by choice. The logician, who generally holds a reverential attitude towards the physicist, can then actively contribute to the betterment of physics by improving the language through which the physicist expresses his experience. The philosopher, who otherwise chooses to follow the advancement of physics and gets stuck in the trap of sophistication of language, can then be of guidance to the physicist on intellectual grounds by having the physicist’s experience himself. In course of this commentary, I provide a glimpse of how a truthful conversion of verbal statements to physico-mathematical expressions unravels the hitherto unrealized connection between Heisenberg uncertainty relation and Cauchy’s definition of derivative that is used in physics. The commentary can be an essential reading if the reader is willing to look beyond the categories of logic, philosophy and physics by being ‘nobody’.
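For reference, the two formulas that this abstract connects are standard and can be stated side by side; the connection the paper claims between them is its own contribution and is not reproduced here:

```latex
% Heisenberg uncertainty relation for position and momentum
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}

% Cauchy's (limit) definition of the derivative, as used in physics
f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```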
Medicine, like law, is a pragmatic, probabilistic activity. Both require that decisions be made on the basis of available evidence, within a limited time. In contrast to law, medicine, particularly evidence-based medicine as it is currently practiced, aspires to a scientific standard of proof, one that is more certain than the standards of proof courts apply in civil and criminal proceedings. But medicine, as Dr. William Osler put it, is an "art of probabilities," or at best, a "science of uncertainty." One can better practice medicine by using other evidentiary standards in addition to the "scientific." To employ only the scientific standard of proof is inappropriate, if not impossible; furthermore, as this review will show, its application in medicine is fraught with bias. Evidence is information. It supports or undermines a proposition, whether a hypothesis in science, a diagnosis in medicine, or a fact or point in question in a legal investigation. In medicine, physicians marshal evidence to make decisions on how to best prevent, diagnose, and treat disease, and improve health. In law, courts decide the facts and render justice. Judges and juries assess evidence to establish liability, to settle custody and medical issues, and to determine a defendant's guilt or innocence.
Non-epistemic values pervade climate modelling, as is now well documented and widely discussed in the philosophy of climate science. Recently, Parker and Winsberg have drawn attention to what can be termed "epistemic inequality": this is the risk that climate models might more accurately represent the future climates of the geographical regions prioritised by the values of the modellers. In this paper, we promote value management as a way of overcoming epistemic inequality. We argue that value management can be seriously considered as soon as the value-free ideal and inductive risk arguments commonly used to frame the discussions of value influence in climate science are replaced by alternative social accounts of objectivity. We consider objectivity in Longino's sense as well as strong objectivity in Harding's sense to be relevant options here, because they offer concrete proposals that can guide scientific practice in evaluating and designing so-called multi-model ensembles and, ultimately, improve their capacity to quantify and express uncertainty in climate projections.
In the process of scientific discovery, knowledge ampliation is pursued by means of non-deductive inferences. When ampliative reasoning is performed, probabilities cannot be assigned objectively. One of the reasons is that we face the problem of the unconceived alternatives: we are unable to explore the space of all the possible alternatives to a given hypothesis, because we do not know how this space is shaped. So, if we want to adequately account for the process of knowledge ampliation, we need to develop an account of the process of scientific discovery which is not exclusively based on probability calculus. We argue that the analytic view of the method of science advocated by Cellucci is interestingly suited to this goal, since it rests on the concept of plausibility. In this perspective, in order to account for how probabilities are in fact assigned in uncertain contexts and knowledge ampliation is really pursued, we have to take into account plausibility-based considerations.
The engineering knowledge research program is part of the larger effort to articulate a philosophy of engineering and an engineering worldview. Engineering knowledge requires a more comprehensive conceptual framework than scientific knowledge. Engineering is not 'merely' applied science. Kuhn and Popper established the limits of scientific knowledge. In parallel, the embrace of complementarity and uncertainty in the new physics undermined the scientific concept of observer-independent knowledge. The paradigm shift from the scientific framework to the broader participant engineering framework entails a problem shift. The detached scientific spectator seeks the 'facts' of 'objective' reality – out there. The participant, embodied in reality, seeks 'methods' for how to work in the world. The engineering knowledge research program is recursively enabling. Advances in engineering knowledge are involved in the unfolding of the nature of reality. Newly understood, quantum uncertainty entails that the participant is a natural inquirer. 'Practical reason' is concerned with 'how we should live' – the defining question of morality. The engineering knowledge research program is selective, seeking 'important truths', 'important knowledge' and 'important methods' that manifest value and serve the engineering agenda of 'the construction of the good.' The importance of the engineering knowledge research program is clear in the new STEM curriculum, where educators have been challenged to rethink the relation between science and engineering. A 2015 higher education initiative to integrate engineering colleges into liberal arts and sciences colleges has stalled due to the confusion and conflict between the engineering and scientific representations of knowledge.
In both scientific and popular circles it is often said that we are in the midst of a sixth mass extinction. Although the urgency of our present environmental crises is not in doubt, such claims of a present mass extinction are highly controversial scientifically. Our aims are, first, to get to the bottom of this scientific debate by shedding philosophical light on the many conceptual and methodological challenges involved in answering this scientific question, and, second, to offer new philosophical perspectives on what the value of asking this question has been — and whether that value persists today. We show that the conceptual challenges in defining ‘mass extinction’, uncertainties in past and present diversity assessments, and data incommensurabilities undermine a straightforward answer to the question of whether we are in, or entering, a sixth mass extinction today. More broadly we argue that an excessive focus on the mass extinction framing can be misleading for present conservation efforts and may lead us to miss out on the many other valuable insights that Earth’s deep time can offer in guiding our future.
Confirming Robinson’s Statement? A Lakatosian Analysis of Keynes and his Immediate Orthodoxy Jesús Muñoz Abstract Was the Keynesian message alive during the second half of the twentieth century, or was it betrayed by his followers? This article in the fields of the history of economic thought and methodology contrasts the Scientific Research Programmes (SRPs), a Lakatosian concept, of Keynes in The General Theory of Employment, Interest and Money (TGT) with those of its immediate orthodox schools: Monetarism (MS), Neoclassical Synthesis (NS), New Classical Macroeconomics – rational expectations – (RE) and General Disequilibrium (GD). The objective is to assess the immediate impact of Keynes’s vision in economics. It can be concluded from this comparison that Keynes’s bequest was alive during the period between 1950 and 1980, but that it was accepted under different names. Many economists deny this statement. However, it is hereby argued with the help of Lakatosian methodology that in both economic and philosophical terms the MS, NS, RE and GD SRPs are degenerative variants of Keynes's SRP. The Keynesian reasoning chain – a non-self-regulated system, non-neutrality of money, organicism, non-ergodicity, historical time and uncertainty – is misunderstood and hence misapplied by these deviant schools, or transformed into “bastard Keynesianism”, to quote Joan Robinson (1975[1973], 125). In other words, Keynes’s economics is different from Keynesian economics, as was first proposed by Axel Leijonhufvud (Leijonhufvud, 1968). The internal history of macroeconomics in those periods is undertaken, since it is the rational reconstruction of the meaning of a SRP. Section 1 is an introduction to basic concepts related to philosophy of science and methodology, especially Lakatosian methodology, which can be skipped by the specialized reader. 
Section 2 is an analysis of Keynes’s hard core in his SRP, also serving as an introduction to the problem in that it outlines Keynes’s thinking. Section 3 describes initial Post Keynesianism, which was faithful to the original message. Sections 4, 5, 6 and 7 outline the hard cores of Monetarism, the Neoclassical Synthesis, New Classical Macroeconomics and General Disequilibrium, respectively. Section 8 is a comprehensive analysis of the differences between the mentioned paradigms, and hence concludes. References are listed at the end of the article.
The aim of the article. The aim of the article is to identify the components of the life cycle of social and economic systems. To achieve this aim, the article describes the traits and characteristics of the system, determines the features of social and economic systems' functioning, and applies a systematic approach to the study of their life cycle. The results of the analysis. It is determined that the development of social and economic systems shows signs of cyclicity and is explained methodologically by axiomatics, rules and laws. An understanding of circular patterns was formed long ago and is now recorded by scientists monitoring the properties of the natural and artificial environment of human activity. During the study, it was found that the scientific literature offers no unified description of the life cycle elements of social and economic systems at the personal, micro, meso, macro and global levels. The paper investigates the cyclical patterns in multilevel social and economic systems for a human, employee, family, product, company, city, industry, elite, macroeconomic indicators, humanity, global processes, the global economic system and the Universe. It is noted that at the grass-roots administrative levels of the global environment of the human life activity system, the thesis about the cyclicity of development and the stages of the life cycle is considered by a wide circle of scientists and is not in doubt. At hierarchically higher management levels of the global environment of the human activity system, scientists have noticed similar patterns of cyclicity. 
Problematic and discussion questions about the cyclic development of social and economic systems are identified: the uncertainty of the source of the driving force of repeated changes; the vague distinction between systemic (internal) and off-system (external) influences on development; the lack of a unified description of the nature of development at different managerial levels; the use of different descriptive terms for the same constituent elements of repeated changes (period, stage, phase, wave, cycle); the uncertainty of the number of constituent elements of repeated changes; the absence of a mechanism for identifying the exact time of changes in the constituent elements of repeated changes; the superficial description of the out-of-system periods of the subjects' activity (the «before-system» period of creation and the «after-system» period of liquidation); attaching more importance to primary and less importance to ultimate constituent elements of the repeated changes; and the lack of a reliable mechanism for predicting and forecasting the dynamics of the constituent components of repeated changes. 
It is noted that all social and economic systems develop within repeated changes and share the same attributes of cyclicity: fluctuation of the indicators that characterize the condition of the subject; the presence of system (from creation to liquidation) and out-of-system (creation – «design» and liquidation – «disposal») periods; the subjects are created by the actions of other subjects (that is, subjects do not arise «out of nowhere», but are the result of the actions of other subjects – «constructors»); after a subject ceases functioning, its resources are used by other subjects (that is, subjects do not disappear into «nowhere», but are transformed into the resources of other subjects – «recovery»); a unidirectional life cycle from «birth» to «death» is observed in the subjects; the subjects do not have to undergo the whole life cycle (each stage might be the last); the subjects display similar changes in indicators during their activity; ascending, peak and descending fluctuations of indicators are observed in the subjects; minor fluctuations occur in the constituent elements; and the subjects have an individual character of indicator fluctuations and duration of activity. Conclusions and directions for further research. The scientific novelty of the study is the identification of the life cycle structure of social and economic systems, with definitions of: common («before-system state», «growth», «stabilization», «reduction», «after-system state») and specific («creation», «slow growth», «rapid growth», «growth stabilization», «short stabilization», «rapid reduction», «slow reduction», «liquidation») constituent elements of repeated changes; marginal borders of indicators; dynamics of the constituent elements of the life cycle; and key points where the character of the indicators' dynamics changes. 
Identification of the life cycle of social and economic systems characterizes the potential limits of their activity in time and provides the possibility of further research into aspects of cyclical development, for the formation of a methodological basis and the development of applied measures of systemic development in the areas of marketing and management innovations.
Unlike almost all other philosophers of science, Karl Popper sought to contribute to natural philosophy or cosmology – a synthesis of science and philosophy. I consider his contributions to the philosophy of science and quantum theory in this light. There is, however, a paradox. Popper’s most famous contribution – his principle of demarcation – in driving a wedge between science and metaphysics, serves to undermine the very thing he professes to love: natural philosophy. I argue that Popper’s philosophy of science is, in this respect, defective. Science cannot proceed without making highly problematic metaphysical assumptions concerning the comprehensibility and knowability of the universe. Precisely because these assumptions are problematic, rigour requires that they be subjected to sustained critical scrutiny, as an integral part of science itself. Popper’s principle of demarcation must be rejected. Metaphysics and philosophy of science become a vital part of science. Natural philosophy is reborn. A solution to the problem of what it means to say a theory is unified is proposed, a problem Popper failed to solve. In The Logic of Scientific Discovery, Popper made important contributions to the interpretation of quantum theory, especially in connection with Heisenberg's uncertainty relations. Popper's advocacy of natural philosophy has important implications for education.
In the quest for a physical theory of everything, from macroscopic large-body matter to microscopic elementary particles, with strange and weird concepts springing from the discoveries of quantum physics, irreconcilable positions and inconvenient facts have complicated physics – from Newtonian physics to quantum science – and the question is: how do we close the gap? Indeed, there are scientific and mathematical fireworks when quantum uncertainties and entanglements cannot be explained with classical physics. The Copenhagen interpretation is an expression of a few wise men on quantum physics that was largely formulated from 1925 to 1927, namely by Niels Bohr and Werner Heisenberg. From this point on, there is a divergence of quantum science into the realms of indeterminacy, complementarity and entanglement, which are principles expounded in Yijing, an ancient Chinese body of knowledge constructed on symbols, with a vintage of at least three millennia, using broken and unbroken lines to form a stacked six-line structure called the hexagram. It is premised on the probability development of the hexagram in a space-time continuum. The discovery of the quantization of action meant that classical physics could not convincingly explain quantum phenomena. This paper will trace the great departure from classical physics into the realm of probabilistic realities. The probabilistic nature and reality interpretation had a significant influence on Bohr’s line of thought. Apparently, Bohr realized that speaking of disturbance seemed to indicate that atomic objects were classical particles with definite inherent kinematic and dynamic properties (Hanson, 1959). Disturbances, energy excitation and entanglements are processual evolutionary phases in Yijing. 
This paper will explore the similarities between quantum physics and the methodological ways in which Yijing is used to interpret observable realities involving interactions which are uncontrollable and probabilistic and which form an inseparable unity due to entanglement and superposition. Transgressing disciplinary boundaries by bringing Yijing – originally from the Western Zhou period (1000-750 BC), and over a period of warring states and the early imperial period (500-200 BC) compiled, transcribed and transformed into a cosmological text with philosophical commentaries known as the “Ten Wings”, closely associated with Confucius (551-479 BC) – into conversation with the Copenhagen Interpretation (1925-1927) of the few wise men including Niels Bohr and Werner Heisenberg would seem like a subversive undertaking. Subversive, because the interpretations from Yijing are based on wisdom derived over thousands of years of ancient China, applied to recently discovered quantum concepts. The subversive undertaking does seem to violate the sanctuaries of accepted ways of looking at Yijing principles, classical physics and quantum science, because of the fortified boundaries that have been erected between Yijing and the sciences. Subversive as this paper may be, it is an attempt to re-cast an ancient framework in which indeterminism, complementarity, non-linearity, entanglement, superposition and the probability interpretation are seen in today's quantum realities.
The author's analysis of conceptual aspects of global human resource management shows the lack of unified mechanisms and forms. Thus, we state that at the beginning of the XXI century, at all management levels, the contours of the methodology of management influence on human resources are only being formed. This makes it possible to determine only the main backbone constituent elements. Due to the complexity of the process of managing people as a resource, management mechanisms are formalized only within the framework of different social and economic systems. Their formalization appears extremely difficult due to uncertainty about quantitative and qualitative changes in the global environment of human activity and financial turbulence. Therefore, the priority becomes the problem of providing targeted, safe dynamics of mankind's development, which can be achieved through civilized and humane management of effects on a certain thoroughly scientific basis.
‘Sentience’ sometimes refers to the capacity for any type of subjective experience, and sometimes to the capacity to have subjective experiences with a positive or negative valence, such as pain or pleasure. We review recent controversies regarding sentience in fish and invertebrates and consider the deep methodological challenge posed by these cases. We then present two ways of responding to the challenge. In a policy-making context, precautionary thinking can help us treat animals appropriately despite continuing uncertainty about their sentience. In a scientific context, we can draw inspiration from the science of human consciousness to disentangle conscious and unconscious perception (especially vision) in animals. Developing better ways to disentangle conscious and unconscious affect is a key priority for future research.
The work in this paper is presented in this spirit, to draw out the relatedness of Yijing to quantum physics and to express the continuity between the ancient sages and contemporary scientific thought. Yijing is abstract and philosophical, and can provide an excellent method for generating, structuring and exploring quantum fields relevant to our present level of scientific knowledge. Further, the view of reality that science emphasizes as a seamless, continuous field is the same as in Yijing, where the ‘self’ as particle is deeply integrated into the basic fabric of reality through consciousness. It is this consciousness that interacts and co-relates with that field of interconnectedness (Schöter, 2011). The wholeness of realities, in Yijing, is layers of fields interacting, changing, extending into possibilities and uncertainties.
Global climate change has been characterised as a crisis of reason, imagination and language, to mention a few. The 'everything change', as Margaret Atwood calls it, arguably also impacts how we aesthetically perceive, interpret and appreciate nature. This article looks at philosophical theories of nature appreciation against the background of global environmental change. The article examines how human-induced global climate change affects the 'scientific' approaches to nature appreciation, which base aesthetic judgment on scientific knowledge, and the competing 'non-scientific' approaches, which emphasise the role of emotions, imagination and stories in the aesthetic understanding of the environment. The author claims that both approaches are threatened by global climate change and cannot continue as usual. In particular, he explores aesthetic imagination in contemporary times, when our visions of the environment are thoroughly coloured by worry and uncertainty and there seems to be little room for the awe and wonder that have traditionally characterised the aesthetic experience of nature. Finally, he proposes that art could stimulate environmental imagining in this age of uncertainty.
Time as the key to a theory of everything has recently become a renewed topic in the scientific literature. Social constructivism applied to physics abandons the inevitable essentials of nature. It adopts uncertainty within the scope of the existential activity of scientific research. We highlight the deep role of social constructivism in the predetermined Newtonian notions of time and space in the natural sciences. Despite its incompatibility with the determinism governing Newtonian mechanics, randomness and entropy are inevitable when negative localized energy is transformed into spatially dissipative heat. In sharp contrast to the Newtonian notion, social constructivism makes room for the temporal twin of cyclic conservation and pseudo-linear structural evolution, both reconciling mechanical order and thermodynamic chaos in Leibnizian space-time. In the broad scope of the natural sciences, this triadic nature of time produces a multiple reality and gives appropriate answers to apparent physical paradoxes.
This wide-ranging discourse covers many disciplines of science and the human condition in an attempt to fully understand the manifestation of time. Time's Paradigm is, at its inception, a philosophical debate between the theories of 'Presentism' and 'The Block Model', beginning with a pronounced psychological analysis of 'free will' in an environment where the past and the future already exist. It lays the foundation for the argument that time is a cyclical, contained progression, rather than a meandering voyage into infinity, bringing into question the validity of a commensurate 'Big Bang'. The proposal then widens to encompass physics. It tackles clock rates and time dilation, acausality and the nuisance of a universal clock, and demonstrates that conscious consideration creates the present moment – time's flow – separating the solid-state past and future, whose reality is devoid of space. Arguments relating to quantum physics, including the Uncertainty Principle and a Superposition of States, lend credibility to key areas involving cognitive awareness. It is posited that defined points in time and space prohibit progress in linear models of progression. Thus paradoxes of motion can be resolved in the absence of infinities; temporal perception, it is concluded, being the result of uncertainty. Time's Paradigm takes the bold step of asking us to consider a tangible dimension of time, representing an intimate extension of our three known spatial dimensions. Chaos theory is briefly introduced, leading to the configuration of a fractal fourth dimension of time whose assumption demands only one direction of flow. Further, it asks whether our universe is expanding or contracting. It considers the simple physics of bodies contracting in a fourth dimension of time (UC), and how that marries comfortably with standard scientific models such as Special Relativity. 
The rate at which matter is contracting in the universe is illustrated by a reduction factor of 1.618..., coinciding with Fibonacci's Ratio and countering Time Dilation. Lastly, the more complex aspects of relativistic velocities are tackled, together with the conundrum of Zero Velocity and The Speed of Light being attributes of the same event in Cyclical Space-Time, and ultimately the prospect of superluminal velocities through interaction with parallel time-zones in a multi-layered block universe.
The existential insecurity of human beings has induced them to create protective spheres of symbols: myths, religions, values, belief systems, theories, etc. Rationality is one of the key factors contributing to the construction of civilisation in technical and symbolic terms. As Hankiss (2001) has emphasised, protective spheres of symbols may collapse – thus causing a profound social crisis. Social and political transformations had a tremendous impact at the end of the 20th century. As a result, management theories have been revised in order to deal with transition and uncertainty. Francis Fukuyama's (2000) approach is supportive of hierarchical organisation as the best solution when facing a 'disruption'. The notion of Homo Hierarchicus has been based on allegedly rational presumptions. This paper contributes to the discussion on hierarchy within contemporary organisations. It criticises the so-called 'natural' and 'rational' necessities justifying hierarchy. A key issue identified by the paper is the formalisation of language in claiming value-free knowledge and 'detached' observation as the basis for neutral rationality and aspired-to efficiency. This should be seriously reconsidered as hindering rather than aiding understanding of social complexity. All in all, Homo Hierarchicus appears to be a misleading rather than helpful symbolic sphere or construct.
This is the first book in a two-volume series. The present volume introduces the basics of the conceptual foundations of quantum physics. It appeared first as a series of video lectures on the online learning platform Udemy. There is probably no science as confusing as quantum theory. There is so much misleading information on the subject that for most people it is very difficult to separate science facts from pseudoscience. The goal of this book is to enable you to separate facts from fiction, with a comprehensive introduction to the scientific principles of a complex topic in which meaning and interpretation never cease to puzzle and surprise. It is an A-Z guide, neither too advanced nor oversimplified, to the weirdness and paradoxes of quantum physics, explained from first principles up to modern state-of-the-art experiments, complete with figures and graphs that illustrate the deeper meaning of the concepts, which you are unlikely to find elsewhere. It is a guide for the autodidact or philosopher of science who is looking for general knowledge about quantum physics at an intermediate level, furnishing the most rigorous account that such an exposition can provide, and which only occasionally, in a few special chapters, resorts to a mathematical level that goes no further than that of high school. It will save you a ton of time that you would otherwise have spent searching elsewhere, trying to piece together a variety of information. The author has tried to span an 'arch of knowledge' without giving in to the temptation of an excessively one-sided account of the subject. What is this strange thing called quantum physics? What is its impact on our understanding of the world? What is ‘reality’ according to quantum physics? This book addresses these and many other questions through a step-by-step journey. 
The central mystery of the double-slit experiment and wave-particle duality, the fuzzy world of Heisenberg's uncertainty principle, the weird Schrödinger's cat paradox, the 'spooky action at a distance' of quantum entanglement, the EPR paradox and much more are explained, without neglecting such main contributors as Planck, Einstein, Bohr, Feynman and others who themselves struggled to come to grips with the mysterious quantum realm. We also take a look at experiments conducted in recent decades, such as the surprising "which-way" and "quantum-erasure" experiments. Some considerations on why and how quantum physics suggests a worldview based on philosophical idealism conclude this first volume. This treatise goes, at times, into technical details that demand some effort, and therefore requires some basics of high school math (calculus, algebra, trigonometry, elementary statistics). However, the final payoff will be invaluable: your knowledge of, and grasp on, the conceptual foundations of quantum physics will be deep, wide, and outstanding. Additionally, because schools, colleges, and universities teach quantum physics using a dry, mostly technical approach which furnishes only superficial insight into its foundations, this manual is recommended for all those students, physicists or philosophers of science who would like to look beyond the merely formal aspect and delve deeper into the meaning and essence of quantum mechanics. The manual is a primer that the public deserves.
Neutrosophic sets derive from a new branch of philosophy, namely Neutrosophy. A neutrosophic set is capable of dealing with uncertain, indeterminate and inconsistent information, which makes neutrosophic approaches suitable for modeling problems in which human knowledge and human evaluation are necessary. Neutrosophic set theory was first proposed in 1998 by Florentin Smarandache, who also developed the concept of the single-valued neutrosophic set, oriented towards real-world scientific and engineering applications. Since then, the properties of neutrosophic sets and their applications have been extensively studied in books and monographs by many authors around the world, and an international journal, Neutrosophic Sets and Systems, started its journey in 2013. The neutrosophic triplet was first defined in 2016 by Florentin Smarandache and Mumtaz Ali, who also introduced neutrosophic triplet groups in the same year. For every element "x" in a neutrosophic triplet (NT) set A, there exists a neutral of "x" and an opposite of "x"; moreover, the neutral of "x" must be different from the classical neutral element, so an NT set differs from a classical set. A neutrosophic triplet of "x" is denoted by (x, neut(x), anti(x)). This first volume collects original research and applications from different perspectives covering different areas of neutrosophic studies, such as decision making, triplets, topology, and some theoretical papers. The volume contains three sections: NEUTROSOPHIC TRIPLET, DECISION MAKING, AND OTHER PAPERS.
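As a concrete illustration of the triplet condition described above (the example and function name are ours, not drawn from the volume), consider Z_10 under multiplication mod 10: for x = 2 we need a neut with 2·neut ≡ 2 (mod 10) that differs from the classical neutral 1, giving neut(2) = 6, and an anti with 2·anti ≡ 6 (mod 10). A short sketch enumerates all such triplets:

```python
def neutrosophic_triplets(x, n):
    """All triplets (x, neut, anti) in (Z_n, * mod n) with x*neut = x and
    x*anti = neut, where neut must differ from the classical neutral 1."""
    return [(x, neut, anti)
            for neut in range(n) if neut != 1 and (x * neut) % n == x
            for anti in range(n) if (x * anti) % n == neut]

print(neutrosophic_triplets(2, 10))  # → [(2, 6, 3), (2, 6, 8)]
```

Here 2·6 = 12 ≡ 2 and 2·8 = 16 ≡ 6 (mod 10), so (2, 6, 8) is a neutrosophic triplet even though 6 is not the classical multiplicative identity.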
The role of reason, and its embodiment in philosophical-scientific theorizing, is always a troubling one for religious traditions. The deep emotional needs that religion strives to satisfy seem ever linked to an attitude of acceptance, belief, or trust; yet, in its theoretical employment, reason functions as a critic as much as a creator, and in the special fields of metaphysics and epistemology its critical arrows are sometimes aimed at long-standing cherished beliefs. Understandably, the mere approach to these beliefs through organized philosophical activity, however well-intended, is viewed with suspicion by ecclesiastical authorities and the devout. The attitude towards philosophical inquiry on the part of the Islamic religious community might be thought to typify this reaction. As one of the great prophetic religions, the self-avowed image of Islam is of a tradition which already possesses the truth as set forth in the divine revelation of the Qur'an. What need is there for philosophizing on fundamental matters, e.g., the ultimate nature of reality, the foundations of morality, the modes whereby the divine is connected with the temporal? The structure of creation is already made clear, the "straight path" for living already manifest. How can philosophical activity be anything but a source of divisive controversy? For as it turns its gaze to the foundations upon which the Shari'a (Islamic Law) rests, or to the grounds for religious belief itself, it cannot avoid turning up alternative viewpoints, different perspectives on divine revelation, and various weaknesses in received interpretations. In short, isn't the practice of philosophy a threat to Islam's promise of providing a comprehensive way of living devoid of skepticism and uncertainty about the place of a human in God's creation and his or her role in the 'umma (Islamic community)? This problem is not unique to Islam, nor is it a new one within Islam.
We know that it has been debated by Islamic thinkers since translations of the Greek philosophers began to appear in an organized Islamic world during the 8th century A.D.
The aim of this study is to justify the belief that there are biological normative mechanisms that fulfill non-trivial causal roles in the explanations (as formulated by researchers) of actions and behaviors present in specific systems. One example of such mechanisms is the predictive mechanisms described and explained by predictive processing (hereinafter PP), which (1) guide actions and (2) shape causal transitions between states that have specific content and fulfillment conditions (e.g. mental states). I am therefore guided by a specific theoretical goal: to indicate the conditions that should be met by a non-trivial theory of normative mechanisms and by the specific models proposed by PP supporters. In this work, I use classical philosophical methods, such as conceptual analysis and critical reflection. I also analyze selected studies in the fields of cognitive science, cognitive psychology, neurology, information theory and biology in terms of their methodology, argumentation and language, in accordance with their theoretical importance for the issues discussed in this study. In this sense, the research presented here is interdisciplinary. My research framework is informed by the mechanistic model of explanation, which defines the necessary and sufficient conditions for explaining a given phenomenon. The research methods I chose are therefore related to the problems that I intend to solve. In the introductory chapter, "The concept of predictive processing", I discuss the nature of PP as well as its main assumptions and theses. I also highlight the concepts and distinctions key to this research framework. Many authors argue that PP is a contemporary version of Kantianism and is exposed to objections similar to those made against the approach of Immanuel Kant. I discuss this thesis and show that it is only in a very general sense that the PP framework is neo-Kantian.
Here we are dealing neither with transcendental deduction nor with the application of transcendental arguments. I argue that PP is based on reverse engineering and abductive inference. In the second part of this chapter, I respond to the objection formulated by Dan Zahavi, who directly accuses this research framework of anti-realistic consequences. I demonstrate that the position of internalism, present in so-called conservative PP, does not imply anti-realism, and that, due to the explanatory role played in it by structural representations directed at real patterns, it is justified to claim that PP is realistic. In this way, I show that PP is a non-trivial research framework with its own subject, specific methods and epistemic status. Finally, I discuss positions classified as so-called radical PP. In the chapter "Predictive processing as a Bayesian explanatory model", I justify the thesis that PP offers Bayesian modeling. Many researchers claim that the brain implements a statistical probabilistic network that approximates Bayes' rule. In practice, this means that all cognitive processes are taken to apply Bayes' rule and can be described in terms of probability distributions. Such a solution arouses objections among many researchers and is the subject of wide criticism. The purpose of this chapter is to justify the thesis that Bayesian PP is a non-trivial research framework. To this end, I argue that it explains certain phenomena not only at the computational level described by David Marr, but also at the levels of algorithms and implementation. Later in this chapter I demonstrate that PP is normative modeling. Proponents of the use of Bayesian models in psychology or decision theory argue that the models are normative because they allow the formulation of formal rules of action that show what needs to be done to make a given action optimal.
Critics of this approach emphasize that such thinking about the normativity of Bayesian modeling is unjustified and that science should shift from prescriptive to descriptive positions. In a polemic with Shira Elqayam and Jonathan Evans (2011), I show that the division they propose between prescriptivism and Bayesian descriptivism is only apparent, because, as I argue, there are two forms of prescriptivism, a weak and a strong one. The weak version is epistemic and can lead to anti-realism, while the strong version is ontic and allows one to justify realism in relation to Bayesian models. I argue that the weak version of prescriptivism is valid for PP. It allows us to adopt anti-realism in relation to PP: in practice, this means that one can explain phenomena using Bayes' rule without thereby implying that they are Bayesian in nature. The full justification of realism in relation to Bayesian PP, however, presupposes the adoption of strong prescriptivism. This position assumes that phenomena are explained by Bayes' rule because they are Bayesian as such; if they are Bayesian in nature, then they should be explained using Bayesian modeling. This thesis is substantiated in the chapters "Normative functions and mechanisms in the context of predictive processing" and "Normative mechanisms and actions in predictive processing". In the chapter "The Free Energy Principle in predictive processing", I discuss the Free Energy Principle (hereinafter FEP) formulated by Karl Friston and some of its implications. According to this principle, all biological systems (defined in terms of Markov blankets) minimize the free energy of their internal states in order to maintain homeostasis. Some researchers believe that PP is a special case of applying this principle to cognition, and that predictive mechanisms are homeostatic mechanisms that minimize free energy.
The discussion of FEP is important because some authors consider it to be both explanatorily important and normative. If this is the case, then FEP turns out to be crucial in explaining normative predictive mechanisms and, in general, any normative biological mechanisms. To define the explanatory possibilities of this principle, I refer to the discussion among its supporters of what they call the problem of continuity and discontinuity between life and mind. A critical analysis of this discussion, together with additional arguments I formulate, allows me to revise the explanatory ambitions of FEP. I also reject the belief that this principle is necessary to explain the nature of predictive mechanisms; I argue instead that the principle formulated and defended by Friston is an important research heuristic for PP analysis. In the chapter "Normative functions and mechanisms in predictive processing", I begin my analyses by answering the question about the normative nature of homeostatic mechanisms. I demonstrate that predictive mechanisms are not homeostatic. I defend the view that a full explanation of normative mechanisms presupposes an explanation of normative functions. I discuss the most important proposals for understanding the normativity of a function, from both systemic and teleosemantic perspectives. I conclude that a non-trivial concept of a function must meet two requirements, which I call explanatory and normative, and I show that none of the theories I invoke satisfactorily meets both. Instead, I propose a model of normativity based on Bickhard's account, supplemented by a mechanistic perspective.
I argue that a function is normative when: (1) it allows one to explain the dysfunction of a given mechanism; (2) it contributes to the maintenance of the organism's stability by shaping and limiting the possible relations, processes and behaviors of a given system; and (3) (in the case of representational and predictive functions) it enables explaining the attribution of logical values to certain representations or predictions. On such an approach, a mechanism is normative when it performs certain normative functions and when it is constitutive for a specific action or behavior, even if for some reason it cannot realize it, either currently or in the long term. Such an understanding of the normativity of mechanisms presupposes acceptance of the epistemic hypothesis. I argue that this hypothesis is not cognitively satisfactory, and that the ontic hypothesis, which is directly related to adopting the position of ontic prescriptivism, should be justified instead. For this reason, referring to the mechanistic theory of scientific explanation, I formulate an ontic interpretation of the concept of a normative mechanism. On this interpretation, a mechanism or a function is normative when it performs particular causal roles in explaining certain actions and behaviors. With regard to the normative properties of predictive mechanisms and functions, this means that they are the causes of specific actions an organism carries out in the environment. In this way, I justify the necessity of accepting the ontic hypothesis and rejecting the epistemic one. The fifth chapter, "Normative mechanisms and actions in predictive processing", is devoted to the dark room problem and the related exploration-exploitation trade-off. A dark room is the state an agent could be in if it minimized the sum of all potential prediction errors.
I demonstrate that, in accordance with the basic assumption of PP about the need for continuous and long-term minimization of prediction errors, such a state should be desirable for the agent. Is it really so? Many authors believe it is not. I argue that the test of the value of PP is the possibility of a non-trivial solution to this problem, which can be reduced to the choice between active, uncertainty-increasing exploration and safe, easily predictable exploitation. I show that the solutions proposed by PP supporters in the literature do not enable a fully satisfactory resolution of this dilemma. I then defend the position according to which the full explanation of normative mechanisms, and subsequently the solution to the exploration-exploitation dilemma, involves reference to constraints present in the environment. The constraints include those elements of the environment that make a given mechanism not only causal but also normative. They are therefore key to explaining predictive mechanisms: they do not merely play the role of the context in which a mechanism is implemented but are, above all, its constitutive component. I argue that a full explanation of the role of constraints in normative predictive mechanisms presupposes the integration of individual models of specific cognitive phenomena, because only the mechanistic integration of PP with other models allows for a non-trivial explanation of the nature of normative predictive mechanisms with strong explanatory value. The explanatory monism present in many approaches to PP makes it impossible to solve the dark room problem. Later in this chapter, I argue that Bayesian PP is normative not because it enables the formulation of particular rules of action, but because the predictive mechanisms themselves are normative: they condition the choice of particular actions by agents.
In this way, I justify the hypothesis that normative mechanisms make it possible to explain the phenomenon of agent motivation, which is crucial for solving the dark room problem. In the last part of the chapter, I formulate the hypothesis of distributed normativity, which assumes that the normative nature of certain mechanisms, functions or objects is determined by the relations into which they enter. This means that what is normative (in the primary sense) is the relational structure that constitutes the normativity of the specific items included in it. I suggest that this hypothesis opens up many areas of research and makes it possible to rethink many problems. In the "Conclusion", I summarize the results of my research and indicate further research perspectives.
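The Bayesian modeling that runs through these chapters can be made concrete with a minimal sketch that is not drawn from the dissertation itself: a posterior over competing hypotheses updated by Bayes' rule as evidence arrives, the kind of computation PP attributes, at Marr's computational level, to the brain. The hypothesis names and numbers below are illustrative assumptions.

```python
# Minimal, illustrative Bayes update: posterior(h) ∝ prior(h) * P(evidence | h).
def bayes_update(prior, likelihood):
    """prior: dict hypothesis -> P(h); likelihood: dict hypothesis -> P(e | h)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())                 # normalizing constant P(e)
    return {h: p / z for h, p in unnorm.items()}

prior = {"light": 0.5, "dark": 0.5}          # two hypotheses about the scene
likelihood = {"light": 0.9, "dark": 0.2}     # P(sensory evidence | hypothesis)
posterior = bayes_update(prior, likelihood)
print(posterior)                             # "light" rises to roughly 0.82
```

On the weak (epistemic) reading discussed above, such a model merely describes the phenomena; on the strong (ontic) reading, the mechanism itself implements an approximation of this update.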
The Intergovernmental Panel on Climate Change has developed a novel framework for assessing and communicating uncertainty in the findings published in their periodic assessment reports. But how should these uncertainty assessments inform decisions? We take a formal decision-making perspective to investigate how scientific input formulated in the IPCC’s novel framework might inform decisions in a principled way through a normative decision model.
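One way to see the gap the paper addresses: the IPCC's calibrated likelihood language ("likely" means a probability above 66%, "very likely" above 90%, and so on, per the IPCC guidance note on uncertainty) yields probability intervals rather than point probabilities, so a decision model must work with bounds. The sketch below is our illustration, not the paper's model; the utility numbers are hypothetical.

```python
# Illustrative sketch: map IPCC calibrated likelihood terms to probability
# intervals, then bound the expected utility of an action across the interval.
LIKELIHOOD = {
    "virtually certain":      (0.99, 1.00),
    "very likely":            (0.90, 1.00),
    "likely":                 (0.66, 1.00),
    "about as likely as not": (0.33, 0.66),
    "unlikely":               (0.00, 0.33),
    "very unlikely":          (0.00, 0.10),
}

def expected_utility_bounds(term, u_if_true, u_if_false):
    """Lower/upper expected utility when P(event) is only known to lie in
    the interval attached to an IPCC likelihood term."""
    lo, hi = LIKELIHOOD[term]
    eus = [p * u_if_true + (1 - p) * u_if_false for p in (lo, hi)]
    return min(eus), max(eus)

# e.g. an adaptation measure: pays off 10 if the event occurs, costs 2 if not
print(expected_utility_bounds("likely", u_if_true=10, u_if_false=-2))
```

Because expected utility is linear in the probability, the extremes of the interval bound the decision-relevant quantity; a normative decision model then has to say how to choose when such bounds, rather than a single number, are all the science supplies.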
If one follows the accounts given by philosophers of science and the discussions within scientific communities, there can be little doubt that failure is an essential part of scientific practice. It is essential both in the sense of being integral to scientific practice and in the sense of being necessary for its overall success. Researchers who create new scientific knowledge face uncertainties about the nature of the problem they are trying to solve, the existence of a solution to that problem, the way in which a solution can be found, and their ability to find such a solution (Gläser, 2007, 247). The existence of these uncertainties means, in turn, that each research process carries the risk of failure.
It is commonly thought that such topics as Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason are disparate scientific, physical or mathematical issues having little or nothing in common. I suggest that they are largely standard philosophical problems (i.e., language games) which were resolved by Wittgenstein over 80 years ago.

Wittgenstein also demonstrated the fatal error in regarding mathematics or language or our behavior in general as a unitary coherent logical 'system', rather than as a motley of pieces assembled by the random processes of natural selection. "Gödel shows us an unclarity in the concept of 'mathematics', which is indicated by the fact that mathematics is taken to be a system", and we can say (contra nearly everyone) that this is all that Gödel and Chaitin show. Wittgenstein commented many times that 'truth' in math means axioms or the theorems derived from axioms, while 'false' means that one made a mistake in using the definitions; this is utterly different from empirical matters, where one applies a test. Wittgenstein often noted that to be acceptable as mathematics in the usual sense, a result must be usable in other proofs and must have real-world applications, but neither is the case with Gödel's Incompleteness. Since it cannot be proved in a consistent system (here Peano Arithmetic, but a much wider arena for Chaitin), it cannot be used in proofs and, unlike all the 'rest' of PA, it cannot be used in the real world either.
As Rodych notes, "…Wittgenstein holds that a formal calculus is only a mathematical calculus (i.e., a mathematical language-game) if it has an extra-systemic application in a system of contingent propositions (e.g., in ordinary counting and measuring or in physics)…" Another way to say this is that one needs a warrant to apply our normal use of words like 'proof', 'proposition', 'true', 'incomplete', 'number', and 'mathematics' to a result in the tangle of games created with 'numbers' and 'plus' and 'minus' signs etc., and with 'Incompleteness' this warrant is lacking. Rodych sums it up admirably: "On Wittgenstein's account, there is no such thing as an incomplete mathematical calculus because 'in mathematics, everything is algorithm [and syntax] and nothing is meaning [semantics]'…"

I make some brief remarks noting the similarities of these 'mathematical' issues to economics, physics, game theory, and decision theory.

Those wishing further comments on philosophy and science from a Wittgensteinian two-systems-of-thought viewpoint may consult my other writings: Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019, 3rd ed. (2019); The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle, 2nd ed. (2019); Suicide by Democracy, 4th ed. (2019); The Logical Structure of Human Behavior (2019); The Logical Structure of Consciousness (2019); Understanding the Connections between Science, Philosophy, Psychology, Religion, Politics, and Economics and Suicidal Utopian Delusions in the 21st Century, 5th ed. (2019); Remarks on Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason in Chaitin, Wittgenstein, Hofstadter, Wolpert, Doria, da Costa, Godel, Searle, Rodych, Berto, Floyd, Moyal-Sharrock and Yanofsky (2019); and The Logical Structure of Philosophy, Psychology, Sociology, Anthropology, Religion, Politics, Economics, Literature and History (2019).
Background: Pandemics challenge clinical and public health agencies and policymakers because the scientific and medical uncertainty that accompanies novel viruses like COVID-19 heightens morbidity and mortality. Consequently, there is a need to evaluate public perceptions of social distancing, mandatory lockdown, and satisfaction with the response during the pandemic. Methods: This cross-sectional survey used an anonymous online Google-based questionnaire to collect data from respondents via social media platforms. The online survey was conducted among social media users from 1st to 30th April 2020. A snowball sampling technique was employed to recruit respondents, and a total of 1,131 respondents responded across the country. Results: Nine out of every ten respondents believed that social distancing is an effective measure to reduce the spread of COVID-19, and eight out of every ten agreed with the lockdown measures. However, just 36.8% thought their government was doing enough to stop the outbreak, and only 25% of respondents were satisfied with the country's response to the worldwide epidemic. The age of respondents was found to be significantly associated with satisfaction with the emergency response during the pandemic. Conclusion: The Nigerian public accepted social distancing as an effective way of curbing the spread of COVID-19 and generally accepted the mandatory lockdown; however, more than half of the respondents expressed dissatisfaction with the responses of government and other agencies during the pandemic.
It is commonly thought that Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason are disparate scientific, physical or mathematical issues having little or nothing in common. I suggest that they are largely standard philosophical problems (i.e., language games) which were mostly resolved by Wittgenstein over 80 years ago.

"What we are 'tempted to say' in such a case is, of course, not philosophy, but it is its raw material. Thus, for example, what a mathematician is inclined to say about the objectivity and reality of mathematical facts is not a philosophy of mathematics, but something for philosophical treatment." Wittgenstein, PI 234

"Philosophers constantly see the method of science before their eyes and are irresistibly tempted to ask and answer questions in the way science does. This tendency is the real source of metaphysics and leads the philosopher into complete darkness." Wittgenstein

I provide a brief summary of some of the major findings of two of the most eminent students of behavior of modern times, Ludwig Wittgenstein and John Searle, on the logical structure of intentionality (mind, language, behavior), taking as my starting point Wittgenstein's fundamental discovery that all truly 'philosophical' problems are the same: confusions about how to use language in a particular context. All solutions are therefore the same: looking at how language can be used in the context at issue so that its truth conditions (Conditions of Satisfaction, or COS) are clear. The basic problem is that one can say anything, but one cannot mean (state clear COS for) any arbitrary utterance, and meaning is only possible in a very specific context.
I dissect some writings of a few of the major commentators on these issues from a Wittgensteinian viewpoint, within the framework of the modern perspective of the two systems of thought (popularized as 'thinking fast, thinking slow'), employing a new table of intentionality and a new dual-systems nomenclature. I show that this is a powerful heuristic for describing the true nature of these putative scientific, physical or mathematical issues, which are really best approached as standard philosophical problems of how language is to be used (language games, in Wittgenstein's terminology).

It is my contention that the table of intentionality (rationality, mind, thought, language, personality etc.) that features prominently here describes more or less accurately, or at least serves as a heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology but everything else (history, literature, mathematics, politics etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view them include both conscious deliberative linguistic System 2 and unconscious automated prelinguistic System 1 actions or reflexes.
This paper considers the responsibilities of the FDA with regard to disseminating information about the benefits and harms of e-cigarettes. Tobacco harm reduction advocates claim that the FDA has been overcautious and has violated ethical obligations by failing to clearly communicate to the public that e-cigarettes are far less harmful than cigarettes. We argue, by contrast, that the FDA's obligations in this arena are more complex than they may appear at first blush. Though the FDA is accountable for informing the public about the health risks and benefits of products it regulates, it also has other roles that inform when and how it should disseminate information. In addition to being a knowledge purveyor, it is also a knowledge producer, an advisor to the public, and a practical agent shaping the material conditions in which people make health-related choices. In our view, those other roles call for caution in the way the FDA interprets and communicates the available evidence.