Undoubtedly the Penrose-Hameroff Orch OR model may be considered a good theory for describing information processing mechanisms and holistic phenomena in the human brain, but it does not give us a satisfactory explanation of human perception. In this work a new approach to explaining our perception is introduced, which is in good agreement with the Orch OR model and other mainstream theories such as string theory, loop quantum gravity and the holographic principle. It is shown that human perception cannot be explained in terms of elementary particles and that we should introduce new indivisible holistic objects with geometry based on smooth infinitesimal analysis: elastic membranes. One example of such a membrane is our Universe, which is an indivisible whole. It is shown that our perception may be considered as the result of elastic oscillations of two-dimensional (2D) elastic membranes with closed topology embedded in our bodies. Each organism will correspond to a single elastic membrane responsible for its perceptions, but there may be other membranes, even at the cell level. In other words, reality may be considered as the process of time evolution of holistic, energetically very weak macro objects: elastic membranes with geometry based on smooth infinitesimal analysis. A membrane embedded in this multidimensional world will look different to external and internal observers: from the outside it will look like a material object with smooth infinitesimal geometry, while from the inside it will look like our Universe-like space-time fabric. When interacting with elementary particles and other membranes, a membrane will transform their energy into its elastic energy (a new form of energy), the energy of stretching of its infinitesimal segments. The theory postulates that these elastic deformations will not be observable from the point of view of the internal observer. Heisenberg's uncertainty principle will hold in this physics only from the point of view of the internal observer. For the external observer each embedded elastic membrane may be stretched, and even a very small region will become observable. For example, living organisms play the role of internal observers of the Universe, and at the same time they serve as external observers for the 2D membranes embedded into our Universe. We can observe our 2D self-membranes through our perceptions, which are encoded in elastic oscillations of the membrane. According to the theory, elastic membranes occupy energetically favorable positions around the microtubules involved in Orch OR. The theory not only gives us a truly multidimensional holistic picture of reality, but also provides a new method for understanding such phenomena as perception, self-awareness and will.
The Special and General theories of relativity may be considered the most significant examples of integrative thinking. From these works we see that Albert Einstein attached great importance to how we understand geometry and dimensions. It is shown that physics powered by the new multidimensional elastic geometry is a reliable basis for science integration. Instead of searching for braneworlds (elastic membranes, EMs) in higher dimensions we will start by searching for them in our 3+1 dimensional world. The cornerstone of the new philosophy is the idea that lower-dimensional EMs are an essential component of living matter; they are responsible for our perceptions, intellect, pattern recognition and high-speed signal propagation. According to this theory each EM has both a physical and a perceptive (psychological) meaning: it exists as a Universe-like physical reality for its inner objects and at the same time it plays a perceptive (psychological) role in the external bulk space-time. This philosophy may help us to build up a science which explains not only inanimate, unconscious phenomena, but consciousness as well.
We seek to elucidate the philosophical context in which one of the most important conceptual transformations of modern mathematics took place, namely the so-called revolution in rigor in infinitesimal calculus and mathematical analysis. Some of the protagonists of the said revolution were Cauchy, Cantor, Dedekind, and Weierstrass. The dominant current of philosophy in Germany at the time was neo-Kantianism. Among its various currents, the Marburg school (Cohen, Natorp, Cassirer, and others) was the one most interested in matters scientific and mathematical. Our main thesis is that Marburg neo-Kantian philosophy formulated a sophisticated position towards the problems raised by the concepts of limits and infinitesimals. The Marburg school neither clung to the traditional approach of logically and metaphysically dubious infinitesimals, nor whiggishly subscribed to the new orthodoxy of the "great triumvirate" of Cantor, Dedekind, and Weierstrass that declared infinitesimals conceptus non grati in mathematical discourse. Rather, following Cohen's lead, the Marburg philosophers sought to clarify Leibniz's principle of continuity, and to exploit it in making sense of infinitesimals and related concepts.
This paper offers an overview of various alternative formulations of Analysis, the theory of Integral and Differential Calculus, and their diverging conceptions of the topological structure of the continuum. We pay particular attention to Smooth Analysis, a proposal created by William Lawvere and Anders Kock based on Grothendieck's work on categorical algebraic geometry. The role of Heyting's logic, common to all these alternatives, is emphasized.
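For orientation, here is a compressed statement of the Kock–Lawvere axiom at the heart of Smooth Analysis, in our own notation rather than the paper's: writing D for the nilpotent infinitesimals of the smooth line R,

\[
D = \{\, d \in R \mid d^2 = 0 \,\}, \qquad
\forall f\colon D \to R \;\; \exists!\, b \in R \;\; \forall d \in D:\; f(d) = f(0) + b\, d .
\]

The unique slope b plays the role of f'(0); because the axiom makes every function infinitesimally affine, it is incompatible with classical logic and is interpreted over Heyting's intuitionistic logic, which is why that logic is common to these alternative Analyses.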
The paper considers a new type of objects, blinking fractals, that are not covered by traditional theories studying the dynamics of self-similarity processes. It is shown that the new approach allows one to give various quantitative characteristics of the newly introduced and traditional fractals using the infinite and infinitesimal numbers proposed recently. In this connection, the problem of the mathematical modelling of continuity is discussed in detail. A strong advantage of the introduced computational paradigm is its well-marked numerical character and its own instrument, the Infinity Computer, able to execute operations with infinite and infinitesimal numbers.
In ‘How to part ways smoothly’ Hud Hudson (2007) presents ‘two temporally-continuous spatially unextended material objects that ... share all of their temporal parts up until their very last time-slice’ (2007: 156). They share their location throughout all but the last instant of their lives, at which instant they are a metre apart. Hudson claims that they part smoothly. I shall show that they don’t.
In this survey, a recent computational methodology paying special attention to the separation of mathematical objects from the numeral systems involved in their representation is described. It has been introduced with the intention to allow one to work with infinities and infinitesimals numerically in a unique computational framework in all the situations requiring these notions. The methodology does not contradict Cantor's and non-standard analysis views and is based on Euclid's Common Notion no. 5, "The whole is greater than the part", applied to all quantities (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). The methodology uses a computational device called the Infinity Computer (patented in the USA and the EU) working numerically with infinite and infinitesimal numbers that can be written in a positional numeral system with an infinite radix (recall that traditional theories work with infinities and infinitesimals only symbolically). It is argued that the numeral systems involved in computations limit our capabilities to compute and lead to ambiguities in theoretical assertions as well. The introduced methodology gives the possibility to use the same numeral system for measuring infinite sets, working with divergent series, probability, fractals, optimization problems, numerical differentiation, ODEs, etc. (recall that traditionally different numerals, such as the lemniscate ∞ and Aleph zero, are used in different situations related to infinity). Numerous numerical examples and theoretical illustrations are given. The accuracy of the achieved results is continuously compared with that obtained by traditional tools used to work with infinities and infinitesimals. In particular, it is shown that the new approach allows one to observe mathematical objects involved in the Continuum Hypothesis and the Riemann zeta function with a higher accuracy than is done by traditional tools. It is stressed that the hardness of both problems is not related to their nature but is a consequence of the weakness of the traditional numeral systems used to study them. It is shown that the introduced methodology and numeral system change our perception of the mathematical objects studied in the two problems.
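To make the idea of a positional record with an infinite radix concrete, here is a minimal toy sketch, assuming a representation of numbers as finite sums of terms c·①^p with finite coefficients and exponents. It is our own illustration (class and method names are hypothetical), not the arithmetic of the patented Infinity Computer.

```python
# Toy sketch of grossone-style records: a number is a finite sum of c * G^p,
# where G stands for the infinite unit (grossone) and p, c are finite.
from collections import defaultdict

class GrossNumber:
    def __init__(self, terms=None):
        # terms: dict {exponent p: coefficient c}; zero coefficients are dropped
        self.terms = {p: c for p, c in (terms or {}).items() if c != 0}

    def __add__(self, other):
        acc = defaultdict(int, self.terms)
        for p, c in other.terms.items():
            acc[p] += c
        return GrossNumber(acc)

    def __mul__(self, other):
        acc = defaultdict(int)
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                acc[p1 + p2] += c1 * c2
        return GrossNumber(acc)

    def __repr__(self):
        parts = [f"{c}*G^{p}" for p, c in sorted(self.terms.items(), reverse=True)]
        return " + ".join(parts) if parts else "0"

G = GrossNumber({1: 1})       # the infinite unit
inv_G = GrossNumber({-1: 1})  # the infinitesimal G^(-1)
five = GrossNumber({0: 5})    # an ordinary finite number

print(G + five)      # 1*G^1 + 5*G^0   -- an infinite number with a finite part
print(G * inv_G)     # 1*G^0           -- the infinite unit times its inverse equals 1
print(inv_G * inv_G) # 1*G^-2          -- a higher-order infinitesimal
```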
The goal of this paper consists in developing a new (more physical and numerical in comparison with the standard and non-standard analysis approaches) point of view on Calculus with functions assuming infinite and infinitesimal values. It uses recently introduced infinite and infinitesimal numbers that are in accordance with the principle 'The part is less than the whole' observed in the physical world around us. These numbers have a strong practical advantage with respect to traditional approaches: they are representable on a new kind of computer, the Infinity Computer, able to work numerically with all of them. An introduction to the theory of physical and mathematical continuity and differentiation (including subdifferentials) for functions assuming finite, infinite, and infinitesimal values over finite, infinite, and infinitesimal domains is developed in the paper. This theory allows one to work with derivatives that can assume not only finite but also infinite and infinitesimal values. It is emphasized that the newly introduced notion of physical continuity allows one to see the same mathematical object as continuous or discrete, depending on the wish of the researcher, just as in the physical world the same object can be viewed as continuous or discrete depending on the instrument of observation used by the researcher. Connections between pure mathematical concepts and their computational realizations are emphasized throughout the text. Numerous examples are given.
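A schematic worked example of the kind of computation the abstract has in mind (our notation, treating ①^{-1} as an infinitesimal increment):

\[
f(x) = x^2, \qquad h = \text{①}^{-1}: \qquad
\frac{f(x+h) - f(x)}{h} = \frac{2xh + h^2}{h} = 2x + \text{①}^{-1},
\]

so the finite part of the difference quotient is the classical derivative 2x, while the infinitesimal remainder ①^{-1} is carried explicitly instead of being discarded by a limit.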
This paper considers non-standard analysis and a recently introduced computational methodology based on the notion of ①. The latter approach was developed with the intention to allow one to work with infinities and infinitesimals numerically in a unique computational framework and in all the situations requiring these notions. Non-standard analysis is a classical, purely symbolic technique that works with ultrafilters, external and internal sets, standard and non-standard numbers, etc. In turn, the ①-based methodology does not use any of these notions and proposes a more physical treatment of mathematical objects, separating the objects from the tools used to study them. It both offers a possibility to create new numerical methods using infinities and infinitesimals in floating-point computations and allows one to study certain mathematical objects dealing with infinity more accurately than is done traditionally. In these notes, we explain that even though both methodologies deal with infinities and infinitesimals, they are independent and represent two different philosophies of Mathematics that are not in conflict. It is proved that the texts (…:539–555, 2017; Gutman and Kutateladze in Sib Math J 49:835–841, 2008; Kutateladze in J Appl Ind Math 5:73–75, 2011) asserting that the ①-based methodology is a part of non-standard analysis unfortunately contain several logical fallacies. Their attempt to show that the ①-based methodology can be formalized within non-standard analysis is similar to trying to show that constructivism can be reduced to traditional Mathematics.
The previous two parts of the paper demonstrate that the interpretation of Fermat's last theorem (FLT) in Hilbert arithmetic, meant both in a narrow sense and in a wide sense, can suggest a proof by induction in Part I and a proof by means of the Kochen–Specker theorem in Part II. The same interpretation can also serve for a proof of FLT based on Gleason's theorem and partly similar to that in Part II. The concept of (probabilistic) measure of a subspace of Hilbert space, and especially its uniqueness, can be unambiguously linked to that of partial algebra or incommensurability, or interpreted as a relation of the two dual branches of Hilbert arithmetic in a wide sense. The investigation of the last relation allows FLT and Gleason's theorem to be equated in a sense, as two dual counterparts, and the former to be inferred from the latter, as well as vice versa under an additional condition relevant to the Gödel incompleteness of arithmetic to set theory. The qubit Hilbert space itself in turn can be interpreted by the unity of FLT and Gleason's theorem. The proof of such a fundamental result in number theory as FLT by means of Hilbert arithmetic in a wide sense can be generalized to an idea of a "quantum number theory", able to investigate mathematically the origin of Peano arithmetic from Hilbert arithmetic by the mediation of the "non-standard bijection" and its two dual branches, inherently linking it to information theory. Then infinitesimal analysis and its revolutionary application to physics can also be re-realized in that wider context, for example as an exploration of the way in which the physical quantity of time (respectively, the time derivative in any temporal process considered in physics) can appear at all. Finally, the result admits a philosophical reflection on how any hierarchy arises or changes itself only thanks to its dual and idempotent counterpart.
This article discusses how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. Techniques and ideas from non-standard analysis are brought to bear on the problem.
In this article we provide a mathematical model of Kant's temporal continuum that satisfies the (not obviously consistent) synthetic a priori principles for time that Kant lists in the Critique of Pure Reason (CPR), the Metaphysical Foundations of Natural Science (MFNS), the Opus Postumum and the notes and fragments published after his death. The continuum so obtained has some affinities with the Brouwerian continuum, but it also has 'infinitesimal intervals' consisting of nilpotent infinitesimals, which capture Kant's theory of rest and motion in MFNS. While constructing the model, we establish a concordance between the informal notions of Kant's theory of the temporal continuum and formal correlates to these notions in the mathematical theory. Our mathematical reconstruction of Kant's theory of time allows us to understand what 'faculties and functions' must be in place for time to satisfy all the synthetic a priori principles for time mentioned. We have presented here a mathematically precise account of Kant's transcendental argument for time in the CPR and of the relation between the categories, the synthetic a priori principles for time, and the unity of apperception; the most precise account of this relation to date. We focus our exposition on a mathematical analysis of Kant's informal terminology, but for reasons of space most theorems are explained but not formally proven; formal proofs are available in (Pinosio, 2017). The analysis presented in this paper is related to the more general project of developing a formalization of Kant's critical philosophy (Achourioti & van Lambalgen, 2011). A formal approach can shed light on the most controversial concepts of Kant's theoretical philosophy, and is a valuable exegetical tool in its own right. However, we wish to make clear that mathematical formalization cannot displace traditional exegetical methods; it is rather an exegetical tool in its own right, which works best when coupled with a keen awareness of the subtleties involved in understanding the philosophical issues at hand. In this case, a virtuous 'hermeneutic circle' between mathematical formalization and philosophical discourse arises.
A computational methodology called Grossone Infinity Computing, introduced with the intention to allow one to work with infinities and infinitesimals numerically, has recently been applied to a number of problems in numerical mathematics (optimization, numerical differentiation, numerical algorithms for solving ODEs, etc.). The possibility to use a specially developed computational device called the Infinity Computer (patented in the USA and the EU) for working with infinite and infinitesimal numbers numerically gives an additional advantage to this approach in comparison with traditional methodologies studying infinities and infinitesimals only symbolically. The grossone methodology uses Euclid's Common Notion no. 5, 'The whole is greater than the part', and applies it to finite, infinite, and infinitesimal quantities and to finite and infinite sets and processes. It does not contradict Cantor's and non-standard analysis views on infinity and can be considered as an applied development of their ideas. In this paper we consider infinite series, and particular attention is dedicated to divergent series with alternating signs. The Riemann series theorem states that conditionally convergent series can be rearranged in such a way that they either diverge or converge to an arbitrary real number. It is shown here that Riemann's result is a consequence of the fact that the symbol ∞ used traditionally does not allow us to express quantitatively the number of addends in the series; in other words, it just shows that the number of summands is infinite and does not allow us to count them. The usage of the grossone methodology allows us to see that (as happens in the case where the number of addends is finite) rearrangements do not change the result for any sum with a fixed infinite number of summands. Some traditional summation techniques, such as Ramanujan summation, are also considered; these assign negative results to divergent series containing infinitely many positive integers. It is shown that careful counting of the number of addends in infinite series allows us to avoid this kind of result if grossone-based numerals are used.
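A hedged illustration of the counting point, in grossone-style notation (following examples Sergeyev has published elsewhere, not a quotation from this paper): fixing the infinite number of addends fixes the value of Grandi's series,

\[
\sum_{i=1}^{\text{①}} (-1)^{i+1} = 0, \qquad
\sum_{i=1}^{\text{①}-1} (-1)^{i+1} = 1,
\]

and a rearrangement that keeps the number of addends fixed leaves the sum unchanged; the classical ambiguity arises only because ∞ does not say how many addends there are.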
We examine some of Connes' criticisms of Robinson's infinitesimals starting in 1995. Connes sought to exploit the Solovay model S as ammunition against non-standard analysis, but the model tends to boomerang, undercutting Connes' own earlier work in functional analysis. Connes described the hyperreals as both a "virtual theory" and a "chimera", yet acknowledged that his argument relies on the transfer principle. We analyze Connes' "dart-throwing" thought experiment, but reach an opposite conclusion. In S, all definable sets of reals are Lebesgue measurable, suggesting that Connes views a theory as being "virtual" if it is not definable in a suitable model of ZFC. If so, Connes' claim that a theory of the hyperreals is "virtual" is refuted by the existence of a definable model of the hyperreal field due to Kanovei and Shelah. Free ultrafilters aren't definable, yet Connes exploited such ultrafilters both in his own earlier work on the classification of factors in the 1970s and 80s, and in Noncommutative Geometry, raising the question whether the latter may not be vulnerable to Connes' criticism of virtuality. We analyze the philosophical underpinnings of Connes' argument based on Gödel's incompleteness theorem, and detect an apparent circularity in Connes' logic. We document the reliance on non-constructive foundational material, and specifically on the Dixmier trace −∫ (featured on the front cover of Connes' magnum opus) and the Hahn–Banach theorem, in Connes' own framework. We also note an inaccuracy in Machover's critique of infinitesimal-based pedagogy.
I argue that imperatives express contents that are both cognitively and semantically related to, but nevertheless distinct from, modal propositions. Imperatives, on this analysis, semantically encode features of planning that are modally specified. Uttering an imperative amounts to tokening this feature in discourse, and thereby proffering it for adoption by the audience. This analysis deals smoothly with the problems afflicting Portner's Dynamic Pragmatic account and Kaufmann's Modal account. It also suggests an appealing reorientation of clause-type theorizing, in which the cognitive act of updating on a typed sentence plays a central role in theorizing about both its semantics and role in discourse.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights in the foundations of probability theory as well as in epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities 'sufficiently close to unity'. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
In my dissertation, I present Hermann Cohen's foundation for the history and philosophy of science. My investigation begins with Cohen's formulation of a neo-Kantian epistemology. I analyze Cohen's early work, especially his contributions to 19th century debates about the theory of knowledge. I conclude by examining Cohen's mature theory of science in two works, The Principle of the Infinitesimal Method and its History of 1883, and Cohen's extensive 1914 Introduction to Friedrich Lange's History of Materialism. In the former, Cohen gives an historical and philosophical analysis of the foundations of the infinitesimal method in mathematics. In the latter, Cohen presents a detailed account of Heinrich Hertz's Principles of Mechanics of 1894. Hertz considers a series of possible foundations for mechanics, in the interest of finding a secure conceptual basis for mechanical theories. Cohen argues that Hertz's analysis can be completed, and his goal achieved, by means of a philosophical examination of the role of mathematical principles and fundamental concepts in scientific theories.
The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
Our visual experience seems to suggest that no continuous curve can cover every point of the unit square, yet in the late nineteenth century Giuseppe Peano proved that such a curve exists. Examples like this, particularly in analysis (in the sense of the infinitesimal calculus), received much attention in the nineteenth century. They helped instigate what Hans Hahn called a "crisis of intuition", wherein visual reasoning in mathematics came to be thought to be epistemically problematic. Hahn described this "crisis" as follows: Mathematicians had for a long time made use of supposedly geometric evidence as a means of proof in much too naive and much too uncritical a way, till the unclarities and mistakes that arose as a result forced a turnabout. Geometrical intuition was now declared to be inadmissible as a means of proof... (p. 67) Avoiding geometrical evidence, Hahn continued, mathematicians aware of this crisis pursued what he called "logicization", "when the discipline requires nothing but purely logical fundamental concepts and propositions for its development." On this view, an epistemically ideal mathematics would minimize, or avoid altogether, appeals to visual representations. This would be a radical reformation of past practice, necessary, according to its advocates, for avoiding "unclarities and mistakes" like the one exposed by Peano.
One of the purposes of the Bologna Process was to facilitate the construction of a Europe of Knowledge through educational governance, yet it fails to reach its purpose because of several unexplained assumptions that undermine the conceptual standing of the whole project; it is the purpose of this paper to bring these assumptions to light. A knowledge economy cannot exist without the knowledge workers who were previously formed in educational institutions; therefore the project for a Europe of Knowledge is usually linked with educational policies, especially those affecting higher education institutions. One such policy area is the Bologna Process, which explicitly traces its purpose to the construction of an educational system that will facilitate the smooth delivery of employable graduates to the European labor market. This presentation has two purposes. First, to show through a textual analysis of the Bologna ministerial declarations how the subject of higher education is constructed to single out the European citizen, understood in a narrow sense as the employable, mobile and skilled graduate. Second, to show that the notion of citizenship used in the Bologna declarations is ill-construed. Starting from T. H. Marshall's classical distinction between the three understandings of citizenship (civic, political, and social), this paper will show that the Bologna discourse on citizenship borrows and mixes illegitimately from the three notions, without making it explicit why such a hybrid notion of citizenship is used in the first place.
Principals' transition in Colleges of Education in Ghana is critical to quality teacher education and training, but it comes with complexities and challenges for newly appointed principals. However, there is a seeming absence of research on strategies for smooth transitions in Colleges of Education in Ghana. This study was therefore conducted to establish strategies that promoted College of Education principals' transition management in Ghana. A phenomenological research design was used for the study. Ten (10) newly appointed principals of public colleges of education were purposively sampled for the study. An interview protocol was the research instrument used. The data collected were analyzed using the content analysis method. The study established that capacity building, relationship building, appropriate leadership style and maintenance of discipline were key among the coping strategies for smooth transitions. This study thus provides a guide for new principals. It is recommended that this area be further explored and a model for managing transition be designed to support College of Education principals in transition.
Social decisions are often made under great uncertainty – in situations where political principles, and even standard subjective expected utility, do not apply smoothly. In the first section, we argue that the core of this problem lies in decision theory itself – it is about how to act when we do not have an adequate representation of the context of the action and of its possible consequences. Thus, we distinguish two criteria to complement decision theory under ignorance – Laplace's principle of insufficient reason and Wald's maximin criterion. After that, we apply this analysis to political philosophy, by contrasting Harsanyi's and Rawls's theories of justice, respectively based on Laplace's principle of insufficient reason and Wald's maximin rule – and we end up highlighting the virtues of Rawls's principle on practical grounds (it is intuitively attractive because of its computational simplicity, thus providing a salient point for convergence) – and connect this argument to our moral intuitions and social norms requiring prudence in the case of decisions made for the sake of others.
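To make the contrast concrete, here is a minimal sketch, with purely hypothetical payoffs of our own choosing, of how Laplace's principle of insufficient reason and Wald's maximin rule can rank the same actions differently:

```python
# Illustrative only: two decision rules under ignorance, applied to made-up payoffs.
payoffs = {                    # action -> payoffs across three equally unknown states
    "risky":    [10, 10, -3],
    "cautious": [3, 3, 3],
}

def laplace_value(row):
    # Principle of insufficient reason: treat all states as equally probable.
    return sum(row) / len(row)

def maximin_value(row):
    # Wald's rule: judge an action by its worst-case payoff.
    return min(row)

for rule in (laplace_value, maximin_value):
    best = max(payoffs, key=lambda action: rule(payoffs[action]))
    print(f"{rule.__name__}: {best}")

# laplace_value: risky     (higher average payoff: 17/3 vs 3)
# maximin_value: cautious  (better worst case: 3 vs -3)
```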
Eyal Shahar's essay review [1] of James Penston's remarkable book [2] seems more inspired playful academic provocation than review or essay, expressing dramatic views of impossible validity. The account given of modern biostatistical causation reveals the slide from science into the intellectual confusion and non-science RCTs have created: "… the purpose of medical research is to estimate the magnitude of the effect of a causal contrast, for example the probability ratio of a binary outcome …" But Shahar's world is simultaneously not probabilistic, but one of absolute uncertainty: "We should have no confidence in any type of evidence … We should have no confidence at all". Shahar's "causal contrast" is attractive. It seems to make sense, but it bypasses in two words the means of establishing causation by the scientific method. This phrase assumes a numeric, statistically significant "contrast" is causal rather than a potential correlation requiring further investigation. The concept of "causal contrast" is a slippery slope from sense into biostatistical non-science. This can be illustrated with a hypothetical RCT where 100% of interventions exhibit a posited treatment effect and 0% of placebo controls. Internal validity is seemingly quite reasonably assumed satisfied (common sense dictating that the likelihood of an awesome, magnificent fraud, bias or plain error of the magnitude required is infinitesimal). The scientific method appears satisfied. The RCT demonstrates: (1) strict regularity of outcome in the presence of the posited cause; (2) the absence of outcome in its absence; and (3) an intervention (experiment) showing that the direction of causation is from posited cause to posited effect. Now travel further down the slope from science. Assume 50% of interventions and 0% of controls are positive. We compromise scientific method, but justify this by assuming a large subgroup which, we say, surely must on these figures be exhibiting the posited treatment effect. But what of 10% of interventions and 9% of placebo controls exhibiting the posited treatment effect? Our biostatistician says the 1% "causal contrast" is statistically significant. But we have: (1) minimal evidence of regularity; (2) the posited outcome irrespective of the presence of the posited cause; and (3) an intervention that is at best equivocal in demonstrating any form of causation. This is not science. It is, however, where biostatistics has unthinkingly taken us, as Penston has shown comprehensively [2]. We, the audience of published medical research, are now, for the 10% / 9% example, well down the slope from science. An unattractive hypothesis results, requiring numerous assumptions similar to these: "There is a 'contrast' which is 'causal', albeit the method employed is not scientific. An effect of the intervention has been observed in a very small subgroup. This subgroup is susceptible to treatment. The similar number of placebo controls exhibiting the outcome sought is irrelevant, because the 1% difference between intervention and controls is statistically significant. The statistical analysis is valid and reliable. The RCT's internal validity is sufficiently satisfied. No funding or bias or fraud has affected the results or their analysis." As Penston notes: "Confirming and refuting the results of research is crucial to science … But … there's no way of testing the results of any particular large-scale RCT or epidemiological study. Each study … is left hanging in the air, unsupported." It gets worse.
Identifying a rare serious adverse reaction with a frequency of 1:10,000 can require a trial of 200,000 or more, split between controls and interventions. This is not done. But for every 100 who prospectively benefit from the intervention, 9,900 also receive it. And for every 100 benefiting, one person (who likely gains no benefit) will suffer a serious unidentified adverse reaction. This is also without taking account of more common adverse reactions, whether serious or otherwise. References [1] Shahar, E. (2011) Research and medicine: human conjectures at every turn. International Journal of Person Centered Medicine 1 (2), 250-253. [2] Penston, J. (2010). stats.con: How we've been fooled by statistics-based research in medicine. London: The London Press, UK.
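Reading the letter's numbers charitably (our reconstruction of the arithmetic, not the author's own worked example): with a 1% benefit rate and a 1:10,000 serious-reaction rate, per 10,000 patients treated,

\[
\underbrace{\tfrac{100}{10{,}000}}_{1\%\ \text{benefit}}
\quad\text{vs.}\quad
\underbrace{\tfrac{1}{10{,}000}}_{\text{serious reaction}},
\]

roughly 100 patients benefit and 1 suffers the reaction; since that one patient has only about a 1% chance of being among the beneficiaries, the harmed patient very likely gains no benefit, which is the point of the letter's "for every 100 benefiting" sentence.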
This study was conducted to find out head teachers' perception of the implementation of the capitation grant scheme in the Sunyani West East District of the Brong Ahafo Region. The study specifically focused on explaining how head teachers conceptualised the concept of the capitation grant scheme, the implementation process, and the challenges associated with the implementation of the scheme. A descriptive research design was adopted for the study, and a questionnaire and an interview guide were designed and administered to a sample of 40 head teachers from the district in the Region. The analysis of the data revealed that 70.0% of the head teachers had an in-depth understanding of the source of the capitation grant as being from the Government. The study, among others, found that the main challenges confronting the smooth implementation of the scheme were delay in the release of funds and inadequate funds. It is recommended that Government should release an adequate amount of the grant in good time (that is, before the beginning of each quarter) so that school heads can avoid pre-financing school activities. Also, the Ghana Education Service should continue to train head teachers in financial management and administration for prudent use of funds.
In contemporary mathematics, a Colombeau algebra of Colombeau generalized functions is an algebra of a certain kind containing the space of Schwartz distributions. While in classical distribution theory a general multiplication of distributions is not possible, Colombeau algebras provide a rigorous framework for this. Remark 1.1.1. Such a multiplication of distributions was for a long time mistakenly believed to be impossible because of Schwartz's impossibility result, which basically states that there cannot be a differential algebra containing the space of distributions and preserving the product of continuous functions. However, if one only wants to preserve the product of smooth functions instead, such a construction becomes possible, as first demonstrated by J. F. Colombeau [1], [2]. As a mathematical tool, Colombeau algebras can be said to combine a treatment of singularities, differentiation and nonlinear operations in one framework, lifting the limitations of distribution theory. These algebras have so far found numerous applications in the fields of partial differential equations, geophysics, microlocal analysis and general relativity.
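For context, a compressed sketch of the standard (special) Colombeau construction; details vary between variants of the theory. One works with nets of smooth functions $(u_\varepsilon)_{\varepsilon\in(0,1]}$ on an open set $\Omega$ and sets

\[
\mathcal{E}_M(\Omega)=\bigl\{(u_\varepsilon):\ \forall K\Subset\Omega\ \forall\alpha\ \exists N:\ \sup_K|\partial^\alpha u_\varepsilon|=O(\varepsilon^{-N})\bigr\},\qquad
\mathcal{N}(\Omega)=\bigl\{(u_\varepsilon):\ \forall K\Subset\Omega\ \forall\alpha\ \forall m:\ \sup_K|\partial^\alpha u_\varepsilon|=O(\varepsilon^{m})\bigr\},
\]
\[
\mathcal{G}(\Omega)=\mathcal{E}_M(\Omega)/\mathcal{N}(\Omega).
\]

Distributions embed via convolution with a mollifier, and the product of smooth functions is preserved because a smooth function and its embedding differ only by a negligible net, which is how the construction sidesteps Schwartz's impossibility result.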
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
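As a minimal illustration of what these axioms buy (our example, in the spirit of the fair infinite lottery discussed in the NAP literature): a fair lottery over the natural numbers can assign every ticket the same strictly positive infinitesimal probability,

\[
P(\{n\}) \;=\; \frac{1}{\alpha} \;>\; 0 \quad \text{for every } n \in \mathbb{N}, \qquad P(\mathbb{N}) \;=\; 1,
\]

with $\alpha$ a fixed infinite (non-Archimedean) number, so that regularity (non-empty events get positive probability) and the relevant perfect additivity can hold together, whereas no real-valued, countably additive probability can be both uniform and regular on a countably infinite sample space.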
Leibniz is well known for his formulation of the infinitesimal calculus. Nevertheless, the nature and logic of his discovery are seldom questioned: does it belong more to mathematics or metaphysics, and how is it connected to his physics? This book, composed of fourteen essays, investigates the nature and foundation of the calculus, its relationship to the physics of force and principle of continuity, and its overall method and metaphysics. The Leibnizian calculus is presented in its origin and context together with its main contributors: Archimedes, Cavalieri, Wallis, Hobbes, Pascal, Huygens, Bernoulli, and Nieuwentijt. Many of us know and probably have used the Leibnizian formula: to …
Dual-process theories divide cognition into two kinds of processes: Type 1 processes that are autonomous and do not use working memory, and Type 2 processes that are decoupled from the immediate situation and use working memory. Often, Type 1 processes are also fast, high capacity, parallel, nonconscious, biased, contextualized, and associative, while Type 2 processes are typically slow, low capacity, serial, conscious, normative, abstract, and rule-based. This article argues for an embodied dual-process theory based on the phenomenology of Martin Heidegger. According to Heidegger, the basis of human agents' encounters with the world is in a prereflective, pragmatically engaged disposition marked by readiness-to-hand (Zuhandenheit), sometimes equated with "smooth coping." Examples of smooth coping include walking, throwing a ball, and other embodied actions that do not require reflective thought. I argue that smooth coping primarily consists of Type 1 processes. The Heideggerian dual-process model yields distinctly different hypotheses from Hubert Dreyfus' model of smooth coping, and I will critically engage with Dreyfus' work.
Felix Klein and Abraham Fraenkel each formulated a criterion for a theory of infinitesimals to be successful, in terms of the feasibility of implementation of the Mean Value Theorem. We explore the evolution of the idea over the past century, and the role of Abraham Robinson's framework therein.
When scientists seek further confirmation of their results, they often attempt to duplicate the results using diverse means. To the extent that they are successful in doing so, their results are said to be robust. This paper investigates the logic of such "robustness analysis" [RA]. The most important and challenging question an account of RA can answer is what sense of evidential diversity is involved in RAs. I argue that prevailing formal explications of such diversity are unsatisfactory. I propose a unified, explanatory account of diversity in RAs. The resulting account is, I argue, truer to actual cases of RA in science; moreover, this account affords us a helpful, new foothold on the logic undergirding RAs.
In Bertrand Russell's 1903 Principles of Mathematics, he offers an apparently devastating criticism of the neo-Kantian Hermann Cohen's Principle of the Infinitesimal Method and its History (PIM). Russell's criticism is motivated by his concern that Cohen's account of the foundations of calculus saddles mathematics with the paradoxes of the infinitesimal and continuum, and thus threatens the very idea of mathematical truth. This paper defends Cohen against that objection of Russell's, and argues that properly understood, Cohen's views of limits and infinitesimals do not entail the paradoxes of the infinitesimal and continuum. Essential to that defense is an interpretation, developed in the paper, of Cohen's positions in the PIM as deeply rationalist. The interest in developing this interpretation is not just that it reveals how Cohen's views in the PIM avoid the paradoxes of the infinitesimal and continuum. It also reveals some of what is at stake, both historically and philosophically, in Russell's criticism of Cohen.
The Koch snowflake is one of the first fractals to have been mathematically described. It is interesting because it has an infinite perimeter in the limit but its limit area is finite. In this paper, a recently proposed computational methodology allowing one to execute numerical computations with infinities and infinitesimals is applied to study the Koch snowflake at infinity. Numerical computations with actual infinite and infinitesimal numbers can be executed on the Infinity Computer, a new supercomputer patented in the USA and the EU. It is revealed in the paper that at infinity the snowflake is not unique, i.e., different snowflakes can be distinguished for different infinite numbers of steps executed during the process of their generation. It is then shown that for any given infinite number n of steps it becomes possible to calculate the exact infinite number, Nn, of sides of the snowflake, the exact infinitesimal length, Ln, of each side and the exact infinite perimeter, Pn, of the Koch snowflake as the result of multiplication of the infinite Nn by the infinitesimal Ln. It is established that for different infinite n and k the infinite perimeters Pn and Pk are also different and the difference can be infinite. It is shown that the finite areas An and Ak of the snowflakes can also be calculated exactly (up to infinitesimals) for different infinite n and k, and that the difference An − Ak turns out to be infinitesimal. Finally, snowflakes constructed starting from different initial conditions are also studied and their quantitative characteristics at infinity are computed.
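For reference, the standard finite-step quantities for a Koch snowflake built on an initial equilateral triangle of side l; the paper's point is that such expressions can be evaluated exactly when n is an infinite number (our summary, using the usual formulas):

\[
N_n = 3\cdot 4^{\,n}, \qquad
L_n = \frac{l}{3^{\,n}}, \qquad
P_n = N_n L_n = 3\,l\left(\frac{4}{3}\right)^{\!n}, \qquad
A_n = \frac{\sqrt{3}}{4}\,l^2\left(1 + \frac{3}{5}\Bigl(1 - \bigl(\tfrac{4}{9}\bigr)^{\!n}\Bigr)\right).
\]

Substituting an infinite n makes N_n and P_n infinite and L_n infinitesimal, while A_n differs from the limit area (8/5)·(√3/4)l² only by an infinitesimal; two different infinite n and k then give perimeters differing by an infinite amount but areas differing infinitesimally.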
The experimental philosophy movement advocates the use of empirical methods in philosophy. The methods most often discussed and in fact employed in experimental philosophy are appropriated from the experimental paradigm in psychology. But there is a variety of other (at least partly) empirical methods from various disciplines that are, and others that could be, used in philosophy. The paper explores the application of corpus analysis to philosophical issues. Although the method is well established in linguistics, there are only a few tentative attempts by philosophers to utilise it. Examples are introduced and the merit of corpus analysis is compared to that of using general internet search engines and questionnaires for similar purposes.
Hermann Weyl was one of the most important figures involved in the early elaboration of the general theory of relativity and its fundamentally geometrical spacetime picture of the world. Weyl's development of "pure infinitesimal geometry" out of relativity theory was the basis of his remarkable attempt at unifying gravitation and electromagnetism. Many interpreters have focused primarily on Weyl's philosophical influences, especially the influence of Husserl's transcendental phenomenology, as the motivation for these efforts. In this article, I argue both that these efforts are most naturally understood as an outgrowth of the distinctive mathematical-physical tradition in Göttingen and also that phenomenology has little to no constructive role to play in them.
Interactions between an intelligent software agent (ISA) and a human user are ubiquitous in everyday situations such as access to information, entertainment, and purchases. In such interactions, the ISA mediates the user's access to the content, or controls some other aspect of the user experience, and is not designed to be neutral about the outcomes of user choices. Like human users, ISAs are driven by goals, make autonomous decisions, and can learn from experience. Using ideas from bounded rationality, we frame these interactions as instances of an ISA whose reward depends on actions performed by the user. Such agents benefit by steering the user's behaviour towards outcomes that maximise the ISA's utility, which may or may not be aligned with that of the user. Video games, news recommendation aggregation engines, and fitness trackers can all be instances of this general case. Our analysis facilitates distinguishing various subcases of interaction, as well as second-order effects that might include the possibility for adaptive interfaces to induce behavioural addiction, and/or change in user belief. We present these types of interaction within a conceptual framework, and review current examples of persuasive technologies and the issues that arise from their use. We argue that the nature of the feedback commonly used by learning agents to update their models and subsequent decisions could steer the behaviour of human users away from what benefits them, and in a direction that can undermine autonomy and cause further disparity between actions and goals, as exemplified by addictive and compulsive behaviour. We discuss some of the ethical, social and legal implications of this technology and argue that it can sometimes exploit and reinforce weaknesses in human beings.
This paper presents a semantical analysis of the Weak Kleene Logics Kw3 and PWK from the tradition of Bochvar and Halldén. These are three-valued logics in which a formula takes the third value if at least one of its components does. The paper establishes two main results: a characterisation result for the relation of logical consequence in PWK – that is, we individuate necessary and sufficient conditions for a set.
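An illustrative encoding of the "infectious" third value described above (our toy code, not the paper's formalism): in the weak Kleene tables, any compound containing the third value takes the third value, and the connectives behave classically otherwise.

```python
# Toy weak Kleene evaluation: truth values are True, False, or None (the third value).
# In the weak tables, None is infectious: any compound with a None component is None.
def wk_not(a):
    return None if a is None else (not a)

def wk_and(a, b):
    if a is None or b is None:
        return None
    return a and b

def wk_or(a, b):
    if a is None or b is None:
        return None
    return a or b

print(wk_or(True, None))    # None -- even a true disjunct cannot rescue the formula
print(wk_and(False, None))  # None -- contrast with strong Kleene, where this would be False
print(wk_or(True, False))   # True -- classical behaviour on classical inputs
```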
This essay concerns the question of how we make genuine epistemic progress through conceptual analysis. Our way into this issue will be through consideration of the paradox of analysis. The paradox challenges us to explain how a given statement can make a substantive contribution to our knowledge, even while it purports merely to make explicit what one's grasp of the concept under scrutiny consists in. The paradox is often treated primarily as a semantic puzzle. However, in "Sect. 1" I argue that the paradox raises a more fundamental epistemic problem, and in "Sects. 1 and 2" I argue that semantic proposals—even ones designed to capture the Fregean link between meaning and epistemic significance—fail to resolve that problem. Seeing our way towards a real solution to the paradox requires more than semantics; we also need to understand how the process of analysis can yield justification for accepting a candidate conceptual analysis. I present an account of this process, and explain how it resolves the paradox, in "Sect. 3". I conclude in "Sect. 4" by considering the implications for the present account concerning the goal of conceptual analysis, and by arguing that the apparent scarcity of short and finite illuminating analyses in philosophically interesting cases provides no grounds for pessimism concerning the possibility of philosophical progress through conceptual analysis.
A large amount of data is maintained on every social networking site. The total data constantly gathered on these sites makes it difficult for methods like the use of field agents, clipping services and ad-hoc research to maintain social media data. This paper discusses the previous research on sentiment analysis.
Very often traditional approaches studying the dynamics of self-similarity processes are not able to give their quantitative characteristics at infinity and, as a consequence, use limits to overcome this difficulty. For example, it is well known that the limit area of Sierpinski's carpet and the limit volume of Menger's sponge are equal to zero. It is shown in this paper that recently introduced infinite and infinitesimal numbers allow us to use exact expressions instead of limits and to calculate exact infinitesimal values of areas and volumes at various points at infinity, even if the chosen moment of observation is infinitely far away on the time axis from the starting point. It is interesting that traditional results that can be obtained without the usage of infinite and infinitesimal numbers can be produced just as finite approximations of the new ones.
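For context, the standard finite-step values for a unit initial square and cube, which tend to zero classically but remain exactly expressible as positive infinitesimals when n is taken to be an infinite number (our summary of the textbook formulas):

\[
A_n = \left(\frac{8}{9}\right)^{\!n} \quad\text{(Sierpiński carpet: 8 of 9 subsquares kept per step)}, \qquad
V_n = \left(\frac{20}{27}\right)^{\!n} \quad\text{(Menger sponge: 20 of 27 subcubes kept per step)}.
\]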
In recent years, many philosophers of religion have turned their attention to the topic of faith. Given the ubiquity of the word "faith" both in and out of religious contexts, many of them have chosen to begin their forays by offering an analysis of faith. But it seems that there are many kinds of faith: religious faith, non-religious faith, interpersonal faith, and propositional faith, to name a few. In this article, I discuss analyses of faith that have been offered and point out the dimensions along which they differ.
An astonishing volume and diversity of evidence is available for many hypotheses in the biomedical and social sciences. Some of this evidence—usually from randomized controlled trials (RCTs)—is amalgamated by meta-analysis. Despite the ongoing debate regarding whether or not RCTs are the 'gold-standard' of evidence, it is usually meta-analysis which is considered the best source of evidence: meta-analysis is thought by many to be the platinum standard of evidence. However, I argue that meta-analysis falls far short of that standard. Different meta-analyses of the same evidence can reach contradictory conclusions. Meta-analysis fails to provide objective grounds for intersubjective assessments of hypotheses because numerous decisions must be made when performing a meta-analysis which allow wide latitude for subjective idiosyncrasies to influence its outcome. I end by suggesting that an older tradition of evidence in medicine—the plurality of reasoning strategies appealed to by the epidemiologist Sir Bradford Hill—is a superior strategy for assessing a large volume and diversity of evidence.
This paper presents a preliminary analysis of homeopathy from the perspective of the demarcation problem in the philosophy of science. In this context, Popper's, Kuhn's and Feyerabend's solutions to the problem will be given in turn and their criteria will be applied to homeopathy, aiming to shed some light on the controversy over its scientific status. The paper then examines homeopathy under the lens of these demarcation criteria and concludes that homeopathy is regarded as science by Feyerabend and is considered pseudoscience by Popper and Kuhn. By offering adequate tools for the analysis of the foundations, structure and implications of homeopathy, the demarcation issue can help to clarify this medical controversy. The main argument of this article is that a final decision on homeopathy, whose scientific status changes depending on the criteria of the philosophers mentioned, cannot be given.
There exists a huge number of numerical methods that iteratively construct approximations to the solution y(x) of an ordinary differential equation (ODE) y′(x) = f(x,y) starting from an initial value y_0 = y(x_0) and using a finite approximation step h that influences the accuracy of the obtained approximation. In this paper, a new framework for solving ODEs is presented for a new kind of computer, the Infinity Computer (it has been patented and its working prototype exists). The new computer is able to work numerically with finite, infinite, and infinitesimal numbers, giving the possibility to use different infinitesimals numerically and, in particular, to take advantage of infinitesimal values of h. To show the potential of the new framework, a number of results are established. It is proved that the Infinity Computer is able to calculate derivatives of the solution y(x) and to reconstruct its Taylor expansion to a desired order numerically, without finding the respective derivatives analytically (or symbolically) by successive differentiation of the ODE, as is usually done when the Taylor method is applied. Methods using approximations of derivatives obtained thanks to infinitesimals are discussed and a technique for automatic control of rounding errors is introduced. Numerical examples are given.
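Schematically, the idea can be pictured as follows (our paraphrase, not the paper's exact algorithm): take a single step with an infinitesimal h, say h = ①^{-1}; then

\[
y(x_0 + h) \;=\; y(x_0) + h\,y'(x_0) + \frac{h^2}{2}\,y''(x_0) + \cdots,
\]

and because the Infinity Computer keeps the coefficients of ①^{-1}, ①^{-2}, … separate, the numerically computed value of y(x_0 + h) lets one read off y'(x_0), y''(x_0), … order by order, which is how the Taylor expansion is reconstructed without symbolic differentiation of f.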
Despite their success in describing and predicting cognitive behavior, the plausibility of so-called ‘rational explanations’ is often contested on the grounds of computational intractability. Several cognitive scientists have argued that such intractability is an orthogonal pseudoproblem, however, since rational explanations account for the ‘why’ of cognition but are agnostic about the ‘how’. Their central premise is that humans do not actually perform the rational calculations posited by their models, but only act as if they do. Whether or not the problem of intractability is solved by recourse to ‘as if’ explanations critically depends, inter alia, on the semantics of the ‘as if’ connective. We examine the five most sensible explications in the literature, and conclude that none of them circumvents the problem. As a result, rational ‘as if’ explanations must obey the minimal computational constraint of tractability.
A critical survey of various positions on the nature, use, possession, and analysis of normative concepts. We frame our treatment around G.E. Moore's Open Question Argument, and the ways metaethicists have responded by departing from a Classical Theory of concepts. In addition to the Classical Theory, we discuss synthetic naturalism, noncognitivism (expressivist and inferentialist), prototype theory, network theory, and empirical linguistic approaches. Although written for a general philosophical audience, we attempt to provide a new perspective and highlight some underappreciated problems about normative concepts.
A new computational methodology for executing calculations with infinite and infinitesimal quantities is described in this paper. It is based on the principle 'The part is less than the whole', introduced by the Ancient Greeks and applied here to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). It is shown that it becomes possible to write down finite, infinite, and infinitesimal numbers with a finite number of symbols as particular cases of a unique framework. The new methodology has allowed us to introduce the Infinity Computer working with such numbers (its simulator has already been realized). Examples dealing with divergent series, infinite sets, and limits are given.
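Two stock examples of the kind of statement this methodology produces, as reported in Sergeyev's publications and quoted here only for orientation: within the framework, the set of natural numbers has ① elements and the set of even numbers has ①/2 elements, and a sum with an explicitly counted infinite number of addends is evaluated exactly rather than declared divergent, e.g.

\[
\sum_{i=1}^{\text{①}} i \;=\; \frac{\text{①}\,(\text{①}+1)}{2}.
\]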
According to the traditional analysis of propositional knowledge (which derives from Plato's account in the Meno and Theaetetus), knowledge is justified true belief. This chapter develops the traditional analysis, introduces the famous Gettier and lottery problems, and provides an overview of prospective solutions. In closing, I briefly comment on the value of conceptual analysis, note how it has shaped the field, and assess the state of post-Gettier epistemology.
First published in 1949 expressly to introduce logical positivism to English speakers. Reichenbach, with Rudolf Carnap, founded logical positivism, a form of epistemology that privileged scientific over metaphysical truths.
This paper presents results found through searching publicly available U.S. data sources for information about how to handle incidental findings (IF) in human subjects research, especially in genetics and genomics research, neuroimaging research, and CT colonography research. We searched the Web sites of 14 federal agencies, 22 professional societies, and 100 universities, and used the search engine Google to find actual consent forms that had been posted on the Internet. Our analysis of these documents showed that there is very little public guidance available for researchers as to how to deal with incidental findings. Moreover, the guidance available is not consistent.
Gettier presented the now famous Gettier problem as a challenge to epistemology. The methods Gettier used to construct his challenge, however, utilized certain principles of formal logic that are actually inappropriate for the natural language discourse of the Gettier cases. In that challenge to epistemology, Gettier also makes truth claims that would be considered controversial in analytic philosophy of language. The Gettier challenge has escaped scrutiny in these other relevant academic disciplines, however, because of its façade as an epistemological analysis. This article examines Gettier's methods with the analytical tools of logic and analytic philosophy of language.