Let f(1)=2, f(2)=4, and let f(n+1)=f(n)! for every integer n≥2. Edmund Landau's conjecture states that the set P(n^2+1) of primes of the form n^2+1 is infinite. Landau's conjecture implies the following unproven statement Φ: card(P(n^2+1))<ω ⇒ P(n^2+1)⊆[2,f(7)]. Let B denote the system of equations: {x_i!=x_k: i,k∈{1,...,9}} ∪ {x_i⋅x_j=x_k: i,j,k∈{1,...,9}}. We write down a system U⊆B of 9 equations which has exactly two solutions in positive integers x_1,...,x_9, namely (1,...,1) and (f(1),...,f(9)). No known system S⊆B with a finite number of solutions in positive integers x_1,...,x_9 has a solution (x_1,...,x_9)∈(N\{0})^9 satisfying max(x_1,...,x_9)>f(9). We write down a system A of 8 equations. Let Λ denote the statement: if the system A has at most finitely many solutions in positive integers x_1,...,x_9, then each such solution (x_1,...,x_9) satisfies x_1,...,x_9≤f(9). The statement Λ is equivalent to the statement Φ. This heuristically proves the statement Φ. This proof does not yield that card(P(n^2+1))=ω. Algorithms always terminate. We explain the distinction between existing algorithms (i.e. algorithms whose existence is provable in ZFC) and known algorithms (i.e. algorithms whose definition is constructive and currently known to us). No known set X⊆N satisfies conditions (1)-(4) and is widely known in number theory or naturally defined, where this term has only an informal meaning. *** (1) A known algorithm with no input returns an integer n satisfying card(X)<ω ⇒ X⊆(-∞,n]. (2) A known algorithm, for every k∈N, decides whether or not k∈X. (3) No known algorithm with no input returns the logical value of the statement card(X)=ω. (4) There are many elements of X and it is conjectured that X is infinite. (5) X has the simplest definition among known sets Y⊆N with the same set of known elements. *** The set X=P(n^2+1) satisfies conditions (2)-(5). The statement Φ implies condition (1) for X=P(n^2+1).
The set X={k∈N: (k>10^{13}) ⇒ (f(10^{13}),f(k))∩P(n^2+1)≠∅} satisfies conditions (1)-(4) and does not satisfy condition (5), as the set of known elements of X equals {0,...,10^{13}}. No set X⊆N will satisfy conditions (1)-(4) forever, if for every algorithm with no input, at some future day, a computer will be able to execute this algorithm in 1 second or less. The physical limits of computation disprove this assumption.
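Both ingredients of the abstract above are directly computable for small arguments: the recursion for f (which outgrows any computer by f(5) = (24!)!) and the primes of the form n^2+1. A minimal sketch, with a naive primality test of our own:

```python
from math import factorial

def f(n):
    # f(1) = 2, f(2) = 4, f(n+1) = f(n)! for n >= 2
    if n == 1:
        return 2
    if n == 2:
        return 4
    return factorial(f(n - 1))

def is_prime(m):
    # naive trial division, adequate for the small numbers used here
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

print(f(3))  # 24
print(f(4))  # 24! = 620448401733239439360000
# the n <= 50 for which n^2+1 is prime (Landau: conjecturally infinitely many)
print([n for n in range(1, 51) if is_prime(n * n + 1)])
```

The tower collapses immediately after this point: f(5) already has more digits than could ever be written down, which is why conditions (1)-(5) talk about known algorithms rather than feasible computations.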
Andrew Wiles' analytic proof of Fermat's Last Theorem FLT, which appeals to geometrical properties of real and complex numbers, leaves two questions unanswered: (i) What technique might Fermat have used that led him to, even if only briefly, believe he had ‘a truly marvellous demonstration’ of FLT? (ii) Why is x^n+y^n=z^n solvable only for n<3? In this inter-disciplinary perspective, we offer insight into, and answers to, both queries; yielding a pre-formal proof of why FLT can be treated as a true arithmetical proposition (one which, moreover, might not be provable formally in the first-order Peano Arithmetic PA), where we admit only elementary (i.e., number-theoretic) reasoning, without appeal to analytic properties of real and complex numbers. We cogently argue, further, that any formal proof of FLT needs---as is implicitly suggested by Wiles' proof---to appeal essentially to formal geometrical properties of formal arithmetical propositions.
Yuri Matiyasevich's theorem states that the set of all Diophantine equations which have a solution in non-negative integers is not recursive. Craig Smoryński's theorem states that the set of all Diophantine equations which have at most finitely many solutions in non-negative integers is not recursively enumerable. Let R be a subring of Q with or without 1. By H_{10}(R), we denote the problem of whether there exists an algorithm which, for any given Diophantine equation with integer coefficients, can decide whether or not the equation has a solution in R. We prove that a positive solution to H_{10}(R) implies that the set of all Diophantine equations with a finite number of solutions in R is recursively enumerable. We show the converse implication for every infinite set R ⊆ Q such that there exist computable functions τ_1,τ_2: N → Z which satisfy (∀n∈N) τ_2(n)≠0 and {τ_1(n)/τ_2(n) : n∈N} = R. This implication for R=N guarantees that Smoryński's theorem follows from Matiyasevich's theorem. Harvey Friedman conjectures that the set of all polynomials of several variables with integer coefficients that have a rational solution is not recursive. Harvey Friedman also conjectures that the set of all polynomials of several variables with integer coefficients that have only finitely many rational solutions is not recursively enumerable. These conjectures are equivalent by our results for R=Q.
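The hypothesis on R in the converse implication only asks for a computable enumeration of R as fractions τ_1(n)/τ_2(n) with τ_2(n)≠0. For R=Q one such pair of computable functions can be sketched as follows; the shell-by-shell ordering is our illustrative choice, not the paper's:

```python
from fractions import Fraction
from itertools import count, islice

def rationals():
    # Walk pairs (a, b) with b >= 1 shell by shell (shell k: max(|a|, b) = k).
    # Every rational a/b occurs (with repetitions, e.g. 1/2 and 2/4), which
    # is all that the hypothesis on tau_1, tau_2 requires.
    for k in count(1):
        for a in range(-k, k + 1):
            for b in range(1, k + 1):
                if max(abs(a), b) == k:
                    yield a, b  # (tau_1(n), tau_2(n)) for the n-th pair

seen = {Fraction(a, b) for a, b in islice(rationals(), 200)}
print(Fraction(1, 2) in seen, Fraction(-3, 4) in seen)  # True True
```

The denominator is never zero by construction, and any fixed rational a/b lies in shell max(|a|, b), so it appears after finitely many steps.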
The way in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity is investigated. Quantum information is defined as the generalization of the concept of information to the choice among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit, is interpreted as a relevant elementary choice among an infinite set of alternatives, generalizing that of a bit. The invariance to the axiom of choice shared by quantum mechanics is introduced: it constitutes quantum information as the relation of any state unorderable in principle (e.g. any coherent quantum state before measurement) and the same state already well-ordered (e.g. the well-ordered statistical ensemble of the measurement of the quantum system at issue). This allows equating classical and quantum time, correspondingly, as the well-ordering of any physical quantity or quantities and their coherent superposition. That equating is interpretable as the isomorphism of Minkowski space and Hilbert space. Quantum information is the structure interpretable in both ways and thus underlying their unification. Its deformation is representable correspondingly as gravitation in the deformed pseudo-Riemannian space of general relativity and as the entanglement of two or more quantum systems. The standard model studies a single quantum system and thus privileges a single reference frame, which turns out to be inertial for the generalized symmetry U(1)×SU(2)×SU(3) “gauging” the standard model. As the standard model refers to a single quantum system, it is necessarily linear and thus the corresponding privileged reference frame is necessarily inertial. The Higgs mechanism U(1) → U(1)×SU(2), already sufficiently confirmed experimentally, describes exactly the choice of the initial position of a privileged reference frame as the corresponding breaking of the symmetry.
The standard model defines ‘mass at rest’ linearly and absolutely, but general relativity defines it non-linearly and relatively. The “Big Bang” hypothesis is an additional assumption interpreting that position as that of the “Big Bang”. It serves also to reconcile the linear standard model in the singularity of the “Big Bang” with the observed nonlinearity of the further expansion of the universe, described very well by general relativity. Quantum information links the standard model and general relativity in another way, by the mediation of entanglement. The linearity and absoluteness of the former and the nonlinearity and relativity of the latter can be considered as the relation of a whole and the same whole divided into parts entangled in general.
In this paper, I suggest that infinite numbers are large finite numbers, and that infinite numbers, properly understood, are (1) of the structure omega + (omega* + omega)Ө + omega*, and (2) such that the part is smaller than the whole. I present an explanation of these claims in terms of epistemic limitations. I then consider their importance, part of which is demonstrating the contradiction that lies at the heart of Cantorian set theory: the natural numbers are too large to be counted by any finite number, but too small to be counted by any infinite number – there is no number of natural numbers.
Kant's account of space as an infinite given magnitude in the Critique of Pure Reason is paradoxical, since infinite magnitudes go beyond the limits of possible experience. Michael Friedman's and Charles Parsons's accounts make sense of geometrical construction, but I argue that they do not resolve the paradox. I argue that metaphysical space is based on the ability of the subject to generate distinctly oriented spatial magnitudes of invariant scalar quantity through translation or rotation. The set of determinately oriented, constructed geometrical spaces is a proper subset of metaphysical space, thus, metaphysical space is infinite. Kant's paradoxical doctrine of metaphysical space is necessary to reconcile his empiricism with his transcendental idealism.
The Overgeneration Argument is a prominent objection against the model-theoretic account of logical consequence for second-order languages. In previous work we have offered a reconstruction of this argument which locates its source in the conflict between the neutrality of second-order logic and its alleged entanglement with mathematics. Some cases of this conflict concern small large cardinals. In this article, we show that in these cases the conflict can be resolved by moving from a set-theoretic implementation of the model-theoretic account to one which uses higher-order resources.
Looking at the recent spate of claims about “fake news”, which appear to be a new feature of political discourse, I argue that fake news presents an interesting problem in epistemology. The phenomenon of fake news trades upon tolerating a certain indifference towards truth, which is sometimes expressed insincerely by political actors. This indifference and insincerity, I argue, have been allowed to flourish due to the way in which we have set the terms of the “public” epistemology that maintains what is considered “rational” public discourse. I argue that one potential salve to the problem of fake news is to challenge this public epistemology by injecting a certain ethical consideration back into the discourse.
Lewis reconstructs set theory using mereology and plural quantification (MPQ). In his reconstruction he assumes from the beginning that there is an infinite plurality of atoms, whose size is equivalent to that of the set-theoretical universe. Since this assumption goes far beyond the basic axioms of mereology, it might seem that MPQ play no role in guaranteeing the existence of a large infinity of objects. However, we intend to demonstrate that mereology and plural quantification are, in some ways, particularly relevant to a certain conception of the infinite. More precisely, though the principles of mereology and plural quantification do not guarantee the existence of an infinite number of objects, nevertheless, once the existence of any infinite object is admitted, they are able to assure the existence of an uncountable infinity of objects. So, if MPQ were parts of logic, the implausible consequence would follow that, given a countable infinity of individuals, logic would be able to guarantee an uncountable infinity of objects.
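The step from one admitted infinite object to uncountably many objects can be made explicit. The following is a hedged reconstruction in our own notation, not the authors' formulation: with unrestricted composition, every nonempty set of atoms has a fusion, and distinct sets of atoms have distinct fusions.

```latex
% Sketch (our notation): Atoms = the atomic parts of the admitted infinite object.
|\mathrm{Atoms}| \ \ge\ \aleph_0
\ \Longrightarrow\
|\{\text{fusions of atoms}\}| \ \ge\ |\mathcal{P}(\mathrm{Atoms})\setminus\{\varnothing\}| \ \ge\ 2^{\aleph_0}.
```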
Belief in conspiracy theories is typically considered irrational, and as a consequence of this, conspiracy theorists––those who dare believe some conspiracy theory––have been charged with a variety of epistemic or psychological failings. Yet recent philosophical work has challenged the view that belief in conspiracy theories should be considered as typically irrational. By performing an intra-group analysis of those people we call “conspiracy theorists”, we find that the problematic traits commonly ascribed to the general group of conspiracy theorists turn out to be merely a set of stereotypical behaviours and thought patterns associated with a purported subset of that group. If we understand that the supposed problem of belief in conspiracy theories is centred on the beliefs of this purported subset––the conspiracists––then we can reconcile the recent philosophical contributions to the wider academic debate on the rationality of belief in conspiracy theories.
There is a long-standing debate in epistemology on the structure of justification. Some recent work in formal epistemology promises to shed some new light on that debate. I have in mind here some recent work by David Atkinson and Jeanne Peijnenburg, hereafter “A&P”, on infinite regresses of probabilistic support. A&P show that there are probability distributions defined over an infinite set of propositions {E_0, E_1, E_2, ...} such that E_i is probabilistically supported by E_{i+1} for all i and E_0 has a high probability. Let this result be “APR”. A&P oftentimes write as though they believe that APR runs counter to foundationalism. This makes sense, since there is some prima facie plausibility in the idea that APR runs counter to foundationalism, and since some prominent foundationalists argue for theses inconsistent with APR. I argue, though, that in fact APR does not run counter to foundationalism. I further argue that there is a place in foundationalism for infinite regresses of probabilistic support.
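In the uniform case A&P study, each proposition receives support P(E_n|E_{n+1}) = α and P(E_n|¬E_{n+1}) = β, so P(E_n) = β + (α−β)·P(E_{n+1}), and the regress pins P(E_0) down at β/(1−α+β) no matter what value is assigned at the far end. A numerical sketch (the values α = 0.9 and β = 0.05 are our illustrative choices):

```python
def regress_probability(alpha, beta, depth, seed):
    # P(E_n) = beta + (alpha - beta) * P(E_{n+1}): start from an arbitrary
    # value "seed" for P(E_depth) and fold the recursion back down to P(E_0).
    p = seed
    for _ in range(depth):
        p = beta + (alpha - beta) * p
    return p

alpha, beta = 0.9, 0.05            # illustrative values with alpha > beta
limit = beta / (1 - alpha + beta)  # closed-form limit, here 1/3
for seed in (0.0, 0.5, 1.0):
    # the far end of the regress is irrelevant: every seed lands on the limit
    print(abs(regress_probability(alpha, beta, 200, seed) - limit) < 1e-9)
```

Because |α−β| < 1, the influence of the seed decays geometrically, which is the formal core of the claim that the regress itself can deliver a high P(E_0).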
I develop a cognitive account of how humans make skeptical judgments (of the form “X does not know p”). In my view, these judgments are produced by a special purpose metacognitive "skeptical" mechanism which monitors our reasoning for hasty or overly risky assumptions. I argue that this mechanism is modular and shaped by natural selection. The explanation for why the mechanism is adaptive essentially relies on an internalized principle connecting knowledge and action, a principle central to pragmatic encroachment theories. I end the paper by sketching how we can use the account I develop here to respond to the skeptic.
Instead of the half-century old foundational feud between set theory and category theory, this paper argues that they are theories about two different complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
There is a well-known equivalence between avoiding accuracy dominance and having probabilistically coherent credences (see, e.g., de Finetti 1974, Joyce 2009, Predd et al. 2009, Schervish et al. 2009, Pettigrew 2016). However, this equivalence has been established only when the set of propositions on which credence functions are defined is finite. In this paper, we establish connections between accuracy dominance and coherence when credence functions are defined on an infinite set of propositions. In particular, we establish the necessary results to extend the classic accuracy argument for probabilism originally due to Joyce (1998) to certain classes of infinite sets of propositions including countably infinite partitions.
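The finite case behind the classic argument is easy to exhibit: an incoherent credence function on a partition is Brier-dominated by its Euclidean projection onto the coherent set. The example credences below are our own illustrative choice; the projection step follows the construction used by Predd et al.:

```python
def brier(credence, world):
    # squared-error (Brier) inaccuracy of a credence vector at a world
    return sum((c - w) ** 2 for c, w in zip(credence, world))

c = (0.7, 0.6)  # incoherent: credences on {p, not-p} sum to 1.3
# Euclidean projection onto the coherent set, i.e. the line x + y = 1
shift = (1 - sum(c)) / 2
c_star = tuple(x + shift for x in c)  # approximately (0.55, 0.45)

worlds = [(1, 0), (0, 1)]  # p true / p false
print(all(brier(c_star, w) < brier(c, w) for w in worlds))  # True
```

The projected credences are strictly more accurate at both worlds, which is exactly what accuracy dominance means; the paper's contribution is what survives of this picture when the partition is countably infinite.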
It is well known that the following features hold of AR + T under the strong Kleene scheme, regardless of the way the language is Gödel numbered: 1. There exist sentences that are neither paradoxical nor grounded. 2. There are 2^{ℵ_0} fixed points. 3. In the minimal fixed point the weakly definable sets (i.e., sets definable as {n : A(n) is true in the minimal fixed point}, where A(x) is a formula of AR + T) are precisely the Π^1_1 sets. 4. In the minimal fixed point the totally defined sets (sets weakly defined by formulae all of whose instances are true or false) are precisely the Δ^1_1 sets. 5. The closure ordinal for Kripke's construction of the minimal fixed point is ω_1^{CK}. In contrast, we show that under the weak Kleene scheme, depending on the way the Gödel numbering is chosen: 1. There may or may not exist nonparadoxical, ungrounded sentences. 2. The number of fixed points may be any positive finite number, ℵ_0, or 2^{ℵ_0}. 3. In the minimal fixed point, the sets that are weakly definable may range from a subclass of the sets 1-1 reducible to the truth set of AR to the Π^1_1 sets, including intermediate cases. 4. Similarly, the totally definable sets in the minimal fixed point range from precisely the arithmetical sets up to precisely the Δ^1_1 sets. 5. The closure ordinal for the construction of the minimal fixed point may be ω, ω_1^{CK}, or any successor limit ordinal in between. In addition we suggest how one may supplement AR + T with a function symbol interpreted by a certain primitive recursive function so that, irrespective of the choice of the Gödel numbering, the resulting language based on the weak Kleene scheme has the five features noted above for the strong Kleene language.
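The difference between the two schemes that drives these contrasts is visible already in the disjunction tables: strong Kleene lets a true disjunct settle the value, while weak Kleene makes undefinedness infectious. A minimal sketch in our own encoding, with None standing for the undefined value:

```python
U = None  # the undefined truth value

def strong_or(a, b):
    # strong Kleene: one true disjunct settles the whole disjunction
    if a is True or b is True:
        return True
    if a is U or b is U:
        return U
    return False

def weak_or(a, b):
    # weak Kleene: any undefined component infects the whole sentence
    if a is U or b is U:
        return U
    return a or b

print(strong_or(True, U))  # True
print(weak_or(True, U))    # None
```

Under weak Kleene a sentence can thus be dragged into undefinedness by an irrelevant ungrounded component, which is why so much more depends on how the Gödel numbering distributes such components.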
The Koch snowflake is one of the first fractals to be described mathematically. It is interesting because its perimeter is infinite in the limit while its limit area is finite. In this paper, a recently proposed computational methodology allowing one to execute numerical computations with infinities and infinitesimals is applied to study the Koch snowflake at infinity. Numerical computations with actual infinite and infinitesimal numbers can be executed on the Infinity Computer, a new supercomputer patented in the USA and the EU. It is revealed in the paper that at infinity the snowflake is not unique, i.e., different snowflakes can be distinguished for different infinite numbers of steps executed during the process of their generation. It is then shown that for any given infinite number n of steps it becomes possible to calculate the exact infinite number, N_n, of sides of the snowflake, the exact infinitesimal length, L_n, of each side and the exact infinite perimeter, P_n, of the Koch snowflake as the result of multiplication of the infinite N_n by the infinitesimal L_n. It is established that for different infinite n and k the infinite perimeters P_n and P_k are also different and the difference can be infinite. It is shown that the finite areas A_n and A_k of the snowflakes can also be calculated exactly (up to infinitesimals) for different infinite n and k, and that the difference A_n − A_k turns out to be infinitesimal. Finally, snowflakes constructed starting from different initial conditions are also studied and their quantitative characteristics at infinity are computed.
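For finite n the quantities in question have simple closed forms: N_n = 3·4^n, L_n = 3^{-n} and P_n = N_n·L_n = 3·(4/3)^n for a unit starting triangle. The sketch below computes them for finite n only and does not model the Infinity Computer's arithmetic with actual infinite n:

```python
from fractions import Fraction
from math import sqrt

def koch(n):
    # After n subdivision steps of a unit-side triangle:
    # N_n sides of length L_n each, perimeter P_n = N_n * L_n.
    N = 3 * 4 ** n
    L = Fraction(1, 3 ** n)
    P = N * L
    # area as a multiple of A_0 = sqrt(3)/4: step k adds
    # 3 * 4^(k-1) triangles of area A_0 / 9^k
    ratio = 1 + Fraction(1, 3) * sum(Fraction(4, 9) ** k for k in range(n))
    A = float(ratio) * sqrt(3) / 4
    return N, L, P, A

N, L, P, A = koch(5)
print(N, L, P)  # 3072 1/243 1024/81
# the perimeter grows without bound; the area tends to (8/5) * A_0
print(A < 8 / 5 * sqrt(3) / 4)  # True
```

The paper's observations mirror these formulas: P_n − P_k is the difference of two geometrically growing terms (and so can itself be infinite for infinite n ≠ k), while A_n − A_k is a tail of the convergent series above (and so infinitesimal).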
In this paper, a paraconsistent first-order logic LP^{#} with an infinite hierarchy of levels of contradiction is proposed. The corresponding paraconsistent set theory KSth^{#} is discussed. An axiomatic system HST^{#}, a paraconsistent generalization of Hrbacek set theory HST, is considered.
An agent’s belief in a proposition, E_0, is justified by an infinite regress of deferred justification just in case the belief that E_0 is justified, and the justification for believing E_0 proceeds from an infinite sequence of propositions, E_0, E_1, E_2, etc., where, for all n ≥ 0, E_{n+1} serves as the justification for E_n. In a number of recent articles, Atkinson and Peijnenburg claim to give examples where a belief is justified by an infinite regress of deferred justification. I argue here that there is no reason to regard Atkinson and Peijnenburg’s examples as cases where a belief is so justified. My argument is supported by careful consideration of the grounds upon which relevant beliefs are held within Atkinson and Peijnenburg’s examples.
Cognitive Set Theory is a mathematical model of cognition which equates sets with concepts, and uses mereological elements. It has a holistic emphasis, as opposed to a reductionistic emphasis, and it therefore begins with a single universe (as opposed to an infinite collection of infinitesimal points).
Reasoning from negative evidence takes place where an expected outcome is tested for, and when it is not found, a conclusion is drawn based on the significance of the failure to find it. By using Gricean maxims and implicatures, we show how a set of alternatives, which we call a paradigm, provides the deep inferential structure on which reasoning from lack of evidence is based. We show that the strength of reasoning from negative evidence depends on how the arguer defines his conclusion and what he considers to be in the paradigm of negated alternatives. If we negate only two of the several possible alternatives, even if they are the most probable, the conclusion will be weak. However, if we deny all possible alternatives, the reasoning will be strong, and even in some cases deductively valid.
In order to explain Wittgenstein’s account of the reality of completed infinity in mathematics, a brief overview of Cantor’s initial injection of the idea into set theory, its trajectory, and the philosophic implications he attributed to it will be presented. Subsequently, we will first expound Wittgenstein’s grammatical critique of the use of the term ‘infinity’ in common parlance and its conversion into a notion of an actually existing infinite ‘set’. Secondly, we will delve into Wittgenstein’s technical critique of the concept of ‘denumerability’ as it is presented in set theory, as well as his philosophic refutation of Cantor’s Diagonal Argument and the implications of such a refutation for the problems of the Continuum Hypothesis and Cantor’s Theorem. Throughout, the discussion will be placed within the historical and philosophical framework of the Grundlagenkrise der Mathematik and Hilbert’s problems.
Background If trials of therapeutic interventions are to serve society's interests, they must be of high methodological quality and must satisfy moral commitments to human subjects. The authors set out to develop a clinical-trials compendium in which standards for the ethical treatment of human subjects are integrated with standards for research methods. Methods The authors rank-ordered the world's nations and chose the 31 with >700 active trials as of 24 July 2008. Governmental and other authoritative entities of the 31 countries were searched, and 1004 English-language documents containing ethical and/or methodological standards for clinical trials were identified. The authors extracted standards from 144 of those: 50 designated as ‘core’, 39 addressing trials of invasive procedures and a 5% sample of the remainder. As the integrating framework for the standards we developed a coherent taxonomy encompassing all elements of a trial's stages. Findings Review of the 144 documents yielded nearly 15 000 discrete standards. After duplicates were removed, 5903 substantive standards remained, distributed in the taxonomy as follows: initiation, 1401 standards, 8 divisions; design, 1869 standards, 16 divisions; conduct, 1473 standards, 8 divisions; analysing and reporting results, 997 standards, 4 divisions; and post-trial standards, 168 standards, 5 divisions. Conclusions The overwhelming number of source documents and standards uncovered in this study was not anticipated beforehand and confirms the extraordinary complexity of the clinical trials enterprise. This taxonomy of multinational ethical and methodological standards may help trialists and overseers improve the quality of clinical trials, particularly given the globalisation of clinical research.
The word ‘equality’ often requires disambiguation, which is provided by context or by an explicit modifier. For each sort of magnitude, there is at least one sense of ‘equals’ with its correlated senses of ‘is greater than’ and ‘is less than’. Given any two magnitudes of the same sort—two line segments, two plane figures, two solids, two time intervals, two temperature intervals, two amounts of money in a single currency, and the like—the one equals the other or the one is greater than the other or the one is less than the other [sc. in appropriate correlated senses of ‘equals’, ‘is greater than’ and ‘is less than’]. In case there are two or more appropriate senses of ‘equals’, the one intended is often indicated by an adverb. For example, one plane figure may be said to be equal in area to another and, in certain cases, one plane figure may be said to be equal in length to another. Each sense of ‘equality’ is tied to a specific domain and is therefore non-logical. Notice that in every case ‘equality’ is definable in terms of ‘is greater than’ and also in terms of ‘is less than’, both of which are routinely considered domain-specific, non-logical. The word ‘identity’ in the logical sense does not require disambiguation. Moreover, it is not correlated with ‘is greater than’ and ‘is less than’. If it is not the case that a certain designated triangle is [sc. is identical to] an otherwise designated triangle, it is not necessary for the one to be greater than or less than the other. Moreover, if two magnitudes are equal then a unit of measure can be chosen and, no matter what unit is chosen, each magnitude is the same multiple of the unit that the other is. But identity does not require units. In this regard, congruence is like identity and unlike equality. In arithmetic, the logical concept of identity is coextensive with the arithmetic concept of equality.
The logical concept of identity admits of an analytically adequate definition in terms of logical concepts: given any number x and any number y, x is y iff x has every property that y has. The arithmetical concept of equality admits of an analytically adequate definition in terms of arithmetical concepts: given any number x and any number y, x equals y iff x is neither less than nor greater than y. As Aristotle told us and as Frege retold us, just because one relation is coextensive with another is no reason to conclude that they are one.
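The two definitions are easy to contrast in code: the arithmetical definition uses only the order relations, and on the numbers it is coextensive with identity, exactly as the text notes. A minimal sketch:

```python
def equals(x, y):
    # arithmetical equality defined from order alone, as in the text:
    # x equals y iff x is neither less than nor greater than y
    return not (x < y) and not (y < x)

print(equals(3, 3), equals(3, 4))  # True False
# on the numbers, this order-based equality is coextensive with identity
print(all(equals(a, b) == (a == b) for a in range(5) for b in range(5)))  # True
```

The coextensiveness is a fact about the domain of numbers, not about the definitions, which is the point of the Aristotle/Frege remark.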
This paper focuses on invasive therapeutic procedures, defined as procedures requiring the introduction of hands, instruments, or devices into the body via incisions or punctures of the skin or mucous membranes performed with the intent of changing the natural history of a human disease or condition for the better. Ethical and methodological concerns have been expressed about studies designed to evaluate the effects of invasive therapeutic procedures. Can such studies meet the same standards demanded of those, for example, evaluating pharmaceutical agents? This paper describes a research project aimed at examining the interplay and sometimes apparent conflict between ethical standards for human research and standards for methodological rigor in trials of invasive procedures. The paper discusses how the authors plan to develop a set of consensus standards that, if met, would result in substantial and much-needed improvements in the methodological and ethical quality of such trials.
Given any proposition, is it possible to have rationally acceptable attitudes towards it? Absent reasons to the contrary, one would probably think that this should be possible. In this paper I provide a reason to the contrary. There is a proposition such that, if one has any opinions about it at all, one will have a rationally unacceptable set of propositional attitudes—or if one doesn’t, one will end up being cognitively imperfect in some other manner. The proposition I am concerned with is a self-referential propositional attitude ascription involving the propositional attitude of rejection. Given a basic assumption about what constitutes irrationality, and a few assumptions about the nature of cognitively ideal agents, a paradox results. This paradox is superficially like the Liar, but it is importantly different in that no alethic notions are involved at all. As such, it stands independent of the Liar and is not a ‘revenge’ version of it. After setting out the paradox I discuss possible responses. After considering several I argue that one is best off simply accepting that the paradox shows us something surprising and interesting about rationality: that some cognitive shortfall is unavoidable even for ideal agents. I argue that nothing disastrous follows from accepting this conclusion.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
The quantum information introduced by quantum mechanics is equivalent to the generalization of classical information from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The qubit can be interpreted as that generalization of the bit in which the choice is among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of results in time after measurement. The quantity of quantum information is the ordinal corresponding to the infinite series in question.
In this paper, the authors show that there is a reading of St. Anselm's ontological argument in Proslogium II that is logically valid (the premises entail the conclusion). This reading takes Anselm's use of the definite description "that than which nothing greater can be conceived" seriously. Consider a first-order language and logic in which definite descriptions are genuine terms, and in which the quantified sentence "there is an x such that..." does not imply "x exists". Then, using an ordinary logic of descriptions and a connected greater-than relation, God's existence logically follows from the claims: (a) there is a conceivable thing than which nothing greater is conceivable, and (b) if x doesn't exist, something greater than x can be conceived. To deny the conclusion, one must deny one of the premises. However, the argument involves no modal inferences and, interestingly, Descartes' ontological argument can be derived from it.
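The logical shape of the argument can be sketched as follows. This is a hedged reconstruction in our own notation (Cx for "x is conceivable", Gxy for "x is greater than y", E!x for "x exists"), not the authors' official formalization:

```latex
% g abbreviates the definite description used in the argument
g \;=\; \iota x\,(Cx \wedge \neg\exists y\,(Cy \wedge Gyx)) \\[4pt]
\text{Premise (a):}\quad \exists x\,(Cx \wedge \neg\exists y\,(Cy \wedge Gyx)) \\
\text{Premise (b):}\quad \neg E!g \rightarrow \exists y\,(Cy \wedge Gyg) \\[4pt]
% by (a) the description denotes, and g satisfies its matrix, so
% \neg\exists y\,(Cy \wedge Gyg); then \neg E!g contradicts (b), hence
\text{Conclusion:}\quad E!g
```

Note that the quantifier ∃ here carries no existential import; E! is a separate existence predicate, which is what lets premise (b) be non-trivial.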
Biomedical ontologies are emerging as critical tools in genomic and proteomic research where complex data in disparate resources need to be integrated. A number of ontologies exist that describe the properties that can be attributed to proteins; for example, protein functions are described by Gene Ontology, while human diseases are described by Disease Ontology. There is, however, a gap in the current set of ontologies—one that describes the protein entities themselves and their relationships. We have designed a PRotein Ontology (PRO) to facilitate protein annotation and to guide new experiments. The components of PRO extend from the classification of proteins on the basis of evolutionary relationships to the representation of the multiple protein forms of a gene (products generated by genetic variation, alternative splicing, proteolytic cleavage, and other post-translational modification). PRO will allow the specification of relationships between PRO, GO and other OBO Foundry ontologies. Here we describe the initial development of PRO, illustrated using human proteins from the TGF-beta signaling pathway.
In the late 1940s and early 1950s Lorenzen developed his operative logic and mathematics, a form of constructive mathematics. Nowadays this is mostly seen as the precursor to the more well-known dialogical logic, and one could assume that the same philosophical motivations were present in both works. However, we want to show that this is not the case. In particular, we claim that Lorenzen's well-known rejection of the actual infinite, as stated in Lorenzen (1957), was not a major motivation for operative logic and mathematics. Rather, we argue for a shift that happened in Lorenzen's treatment of the infinite from the early to the late 1950s. His early motivation for the development of operativism is concerned with a critique of the Cantorian notion of set and related questions about the notion of countability and uncountability; only later does his motivation switch to focusing on the concept of infinity and the debate about actual and potential infinity.
The rhinoceros beetle Xylotrupes taprobanes ganesha Silvestre, 2003 was recently recorded from the Nilgiri hills, Western Ghats. The distribution of this subspecies had so far been reported only from the Kerala and Tamil Nadu regions, and no further work on its distribution in this region has been done since. The present observation confirms the occurrence of X. taprobanes ganesha in the Nilgiris and sheds light on prospects for ecological work on this subspecies in the region.
The concepts of choice, negation, and infinity are considered jointly. The link is the quantity of information, interpreted as the quantity of choices measured in units of elementary choice: a bit is an elementary choice between two equally probable alternatives. "Negation" supposes a choice between it and confirmation. Thus the quantity of information can also be interpreted as a quantity of negations. The disjunctive choice between confirmation and negation as to infinity can in turn be chosen or not: this corresponds to the set-theoretic or the intuitionist approach to the foundation of mathematics and to Peano or Heyting arithmetic. Quantum mechanics can be reformulated in terms of information by introducing the concept and quantity of quantum information. A qubit can be equivalently interpreted as the generalization of "bit" where the choice is among an infinite set or series of alternatives. The complex Hilbert space can be represented as both a series of qubits and a value of quantum information. The complex Hilbert space is the generalization of Peano arithmetic where any natural number is substituted by a qubit. "Negation", "choice", and "infinity" can be inherently linked to each other both in the foundations of mathematics and in quantum mechanics by the mediation of "information" and "quantum information".
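The abstract's identification of a bit with an elementary choice between two equally probable alternatives is exactly Shannon's measure of information; a minimal illustration (the function name is ours, not the author's):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the average number of elementary
    binary choices needed to resolve the given uncertainty."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One bit: an elementary choice between two equally probable alternatives.
print(entropy_bits([0.5, 0.5]))       # 1.0
# Two independent such choices: four equiprobable alternatives.
print(entropy_bits([0.25] * 4))       # 2.0
```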
Exclusionary defeat is Joseph Raz’s proposal for understanding the more complex, layered structure of practical reasoning. Exclusionary reasons are widely appealed to in legal theory and consistently arise in many other areas of philosophy. They have also been subject to a variety of challenges. I propose a new account of exclusionary reasons based on their justificatory role, rejecting Raz’s motivational account and especially contrasting exclusion with undercutting defeat. I explain the appeal and coherence of exclusionary reasons by appeal to commonsense value pluralism and the intermediate space of public policies, social roles, and organizations. We often want our choices to have a certain character or instantiate a certain value and in order to do so, that choice can only be based on a restricted set of reasons. Exclusion explains how pro tanto practical reasons can be disqualified from counting towards a choice of a particular kind without being outweighed or undercut.
The INBIOSA project brings together a group of experts across many disciplines who believe that science requires a revolutionary transformative step in order to address many of the vexing challenges presented by the world. It is INBIOSA’s purpose to enable the focused collaboration of an interdisciplinary community of original thinkers. This paper sets out the case for support for this effort. The focus of the transformative research program proposal is biology-centric. We admit that biology to date has been more fact-oriented and less theoretical than physics. However, the key leverageable idea is that careful extension of the science of living systems can be more effectively applied to some of our most vexing modern problems than the prevailing scheme, derived from abstractions in physics. While these have some universal application and demonstrate computational advantages, they are not theoretically mandated for the living. A new set of mathematical abstractions derived from biology can now be similarly extended. This is made possible by leveraging new formal tools to understand abstraction and enable computability. [The latter has a much expanded meaning in our context from the one known and used in computer science and biology today, that is "by rote algorithmic means", since it is not known if a living system is computable in this sense (Mossio et al., 2009).] Two major challenges constitute the effort. The first challenge is to design an original general system of abstractions within the biological domain. The initial issue is descriptive leading to the explanatory. There has not yet been a serious formal examination of the abstractions of the biological domain. What is used today is an amalgam; much is inherited from physics (via the bridging abstractions of chemistry) and there are many new abstractions from advances in mathematics (incentivized by the need for more capable computational analyses).
Interspersed are abstractions, concepts and underlying assumptions “native” to biology and distinct from the mechanical language of physics and computation as we know them. A pressing agenda should be to single out the most concrete and at the same time the most fundamental process-units in biology and to recruit them into the descriptive domain. Therefore, the first challenge is to build a coherent formal system of abstractions and operations that is truly native to living systems. Nothing will be thrown away, but many common methods will be philosophically recast, just as in physics relativity subsumed and reinterpreted Newtonian mechanics.

This step is required because we need a comprehensible, formal system to apply in many domains. Emphasis should be placed on the distinction between multi-perspective analysis and synthesis and on what could be the basic terms or tools needed. The second challenge is relatively simple: the actual application of this set of biology-centric ways and means to cross-disciplinary problems. In its early stages, this will seem to be a “new science”. This White Paper sets out the case of continuing support of Information and Communication Technology (ICT) for transformative research in biology and information processing centered on paradigm changes in the epistemological, ontological, mathematical and computational bases of the science of living systems. Today, curiously, living systems cannot be said to be anything more than dissipative structures organized internally by genetic information. There is not anything substantially different from abiotic systems other than the empirical nature of their robustness. We believe that there are other new and unique properties and patterns comprehensible at this bio-logical level. The report lays out a fundamental set of approaches to articulate these properties and patterns, and is composed as follows.
Sections 1 through 4 (preamble, introduction, motivation and major biomathematical problems) are incipient. Section 5 describes the issues affecting Integral Biomathics, and Section 6 the aspects of the Grand Challenge we face with this project. Section 7 contemplates the effort to formalize a General Theory of Living Systems (GTLS) from what we have today. The goal is to have a formal system equivalent to that which exists in the physics community. Here we define how to perceive the role of time in biology. Section 8 describes the initial efforts to apply this general theory of living systems in many domains, with special emphasis on cross-disciplinary problems and multiple domains spanning both “hard” and “soft” sciences. The expected result is a coherent collection of integrated mathematical techniques. Section 9 discusses the first two test cases, project proposals, of our approach. They are designed to demonstrate the ability of our approach to address “wicked problems” which span across physics, chemistry, biology, societies and societal dynamics. The solutions require integrated measurable results at multiple levels known as “grand challenges” to existing methods. Finally, Section 10 closes with an appeal for action, advocating the necessity for further long-term support of the INBIOSA program.

The report concludes with a preliminary, non-exclusive list of challenging research themes to address, as well as required administrative actions. The efforts described in the ten sections of this White Paper will proceed concurrently. Collectively, they describe a program that can be managed and measured as it progresses.
According to Cantor (Mathematische Annalen 21:545–586, 1883; Cantor’s letter to Dedekind, 1899) a set is any multitude which can be thought of as one (“jedes Viele, welches sich als Eines denken läßt”) without contradiction—a consistent multitude. Other multitudes are inconsistent or paradoxical. Set-theoretical paradoxes have a common root—a lack of understanding of why some multitudes are not sets. Why can some multitudes of objects of thought not themselves be objects of thought? Moreover, it is a logical truth that such multitudes do exist. However, we do not understand this logical truth as well as we understand, for example, the logical truth $\forall x \, x = x$. In this paper we formulate a logical truth which we call the productivity principle. Russell (Proc Lond Math Soc 4(2):29–53, 1906) was the first to formulate this principle, but in a restricted form and with a different purpose. The principle explicates a logical mechanism that lies behind paradoxical multitudes, and is as understandable as any simple logical truth. However, it does not explain the concept of set. It only sets logical bounds on the concept within the framework of the classical two-valued $\in$-language. The principle behaves as a logical regulator of any theory we formulate to explain and describe sets. It provides tools to identify paradoxical classes inside the theory. We show how the known paradoxical classes follow from the productivity principle and how the principle gives us a uniform way to generate new paradoxical classes. In the case of ZFC set theory the productivity principle shows that the limitation-of-size principles are of a restrictive nature and that they do not explain which classes are sets. The productivity principle, as a logical regulator, can have a definite heuristic role in the development of a consistent set theory. We sketch such a theory—the cumulative cardinal theory of sets.
The theory is based on the idea of cardinality of collecting objects into sets. Its development is guided by means of the productivity principle in such a way that its consistency seems plausible. Moreover, the theory inherits good properties from cardinal conception and from cumulative conception of sets. Because of the cardinality principle it can easily justify the replacement axiom, and because of the cumulative property it can easily justify the power set axiom and the union axiom. It would be possible to prove that the cumulative cardinal theory of sets is equivalent to the Morse–Kelley set theory. In this way we provide a natural and plausibly consistent axiomatization for the Morse–Kelley set theory.
Ortega y Gasset is known for his philosophy of life and his effort to propose an alternative to both realism and idealism. The goal of this article is to focus on an unfamiliar aspect of his thought: Ortega’s interpretation of the advancements in modern mathematics in general and Cantor’s theory of transfinite numbers in particular. The main argument is that Ortega acknowledged the historical importance of Cantor’s set theory, analyzed it and articulated a response to it. In his writings he referred many times to the advancements in modern mathematics and argued that mathematics should be based on the intuition of counting. In response to Cantor’s mathematics Ortega presented what he defined as an ‘absolute positivism’. By this theory he did not mean to naturalize cognition or to follow the guidelines of Comte’s positivism; on the contrary, his aim was to present an alternative to Cantor’s mathematics by claiming that mathematicians are allowed to deal only with objects that are immediately present and observable to intuition. Ortega argued that an infinite set cannot be present to the intuition and that there is therefore no use in differentiating between the cardinals of different infinite sets.
Abstract: Background: Plant production provides human and animal life with different requirements. The concern of workers in agriculture in general, and those interested in plant diseases in particular, has been focused on protection from everything that is expected to cause problems for production. Environmental conditions play a critical role in the development of diseases, rendering the plant more susceptible to infection, which may result in the loss of the entire crop. Objectives: The main goal of this expert system is to obtain the appropriate diagnosis of potato diseases and the correct treatment. Methods: This paper presents the design of the proposed expert system, which was produced to help farmers, people interested in agriculture and agricultural engineers in diagnosing many potato diseases, such as: Bacterial wilt, Septoria leaf spot, Late blight, Early blight, Common scab, Black scurf/canker, Viral diseases (potato virus X, S, & Y), Potato Spindle Tuber Viroid (PSTVd), Black leg and soft rot, Pink rot, and Black heart disorder. The proposed expert system gives an overview of potato diseases, outlines their causes, and provides treatment whenever possible. CLIPS with Delphi was used for designing and implementing the proposed expert system. Results: The proposed potato disease diagnosis expert system was evaluated by farmers, agricultural experts and teachers of agriculture, and they were satisfied with its performance. Conclusions: The proposed expert system is very useful for farmers, those interested in agriculture and potato diseases, and recent graduate students.
Avocado is the fruit of the avocado tree, scientifically known as Persea americana. This fruit is prized for its high nutrient value and is added to various dishes for its good flavor and rich texture. It is the main ingredient in guacamole. These days, the avocado has become an incredibly popular food among health-conscious individuals. It is often referred to as a superfood, which is not surprising given its health properties. Using a public dataset of 1,234 images of avocado collected under controlled conditions, we trained a deep convolutional neural network to identify two types of avocado. The trained model achieved an accuracy of 99.84% on a held-out test set, demonstrating the feasibility of this approach. Overall, training deep learning models on increasingly large and publicly available image datasets presents a clear path toward automated identification of avocado types.
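The paper's CNN and image dataset are not reproduced here, but the evaluation protocol it describes (train a classifier on labeled examples, report accuracy on a held-out test set) can be sketched with a deliberately simple stand-in classifier on synthetic two-class data; every name and number below is an illustrative assumption, not the paper's:

```python
import random

random.seed(0)

# Stand-in for the two avocado types: each "image" is reduced to two
# made-up numeric features. Entirely synthetic data.
def sample(label, n):
    cx, cy = (0.2, 0.3) if label == 0 else (0.8, 0.7)
    return [(random.gauss(cx, 0.1), random.gauss(cy, 0.1), label)
            for _ in range(n)]

data = sample(0, 100) + sample(1, 100)
random.shuffle(data)
train, test_set = data[:160], data[160:]   # hold out 20% for testing

def centroid(points):
    xs = [x for x, _, _ in points]
    ys = [y for _, y, _ in points]
    return sum(xs) / len(xs), sum(ys) / len(ys)

c0 = centroid([p for p in train if p[2] == 0])
c1 = centroid([p for p in train if p[2] == 1])

def predict(x, y):
    # Nearest-centroid rule: assign the class whose centroid is closer.
    d0 = (x - c0[0]) ** 2 + (y - c0[1]) ** 2
    d1 = (x - c1[0]) ** 2 + (y - c1[1]) ** 2
    return 0 if d0 <= d1 else 1

accuracy = sum(predict(x, y) == lbl for x, y, lbl in test_set) / len(test_set)
print(f"held-out accuracy: {accuracy:.2%}")
```

The point is the protocol, not the model: accuracy is only meaningful when measured on data the classifier never saw during training.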
This paper examines the evaluation of the probability of information provision from different sources, based on the use of linguistic variables. Constructing membership functions for the fuzzy variables involved enables the decision maker to make decisions under non-probabilistic uncertainty. The development of market relations in Ukraine increases the independence and responsibility of enterprises in justifying and making management decisions that ensure their effective, competitive activities. As a result of the analysis, it is determined that the condition of economic facilities can be described and determined by the decision-maker in the presence of the necessary information. The confidence of the decision-maker in the information received varies, and the decisions made carry a correspondingly different level of information risk. It is important to substantiate a procedure for assessing the numerical measure of information risk in decision-making based on information obtained under conditions of uncertainty. A linguistic variable is used in the processing of expert data, presented in the form of a matrix of binary relations of values of the membership function, which allows further processing of knowledge to support decision-making in the management of industrial, commercial, financial and other activities. As a mathematical model for estimating the numerical measure of information risk when making decisions based on information obtained under non-stochastic uncertainty, a model of natural-language uncertainties has been developed, which differs from existing ones by formalizing knowledge while taking the uncertainty of the input information into account. Making a crisp decision in a fuzzy environment has corresponding values of effectiveness and risk. The paper proposes membership functions for indicators of both a quantitative and a qualitative nature in order to bring their values to a single scale of definition.
Then the indicator of the effectiveness of decision-making will be a measure of the clarity of the intersection of fuzzy subsets, which correspond to the introduced indicators of information risk. The condition of economic facilities can be described and determined by the decision-maker if the necessary information is available. Decision-making on the numerical measure of information risk must be determined by a set of basic indicators, which can be both quantitative and qualitative in nature. Predictive values of indicators should be determined under conditions of non-stochastic uncertainty. In this case, indicators of a quantitative nature can be represented by fuzzy triangular numbers, which embody a high level of confidence in the subjective judgments of experts. Indicators of a qualitative nature should be presented as linguistic variables, and their predicted values must be considered for all fuzzy-variable term-sets of the linguistic variables introduced. For any fuzzy variable, introducing the crisp set of values that carries the α-level of its membership function makes it possible to reduce the predicted values of quantitative and qualitative indicators to a single interpretation under non-stochastic uncertainty.
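A minimal sketch of two notions this abstract relies on, triangular fuzzy numbers and α-level cuts, assuming the standard textbook definitions (the function names and example values are ours):

```python
def tri_membership(x, a, b, c):
    """Membership degree of x in the triangular fuzzy number (a, b, c):
    0 outside [a, c], rising linearly to 1 at the peak b, then falling."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def alpha_cut(alpha, a, b, c):
    """Crisp interval of values whose membership in (a, b, c) is >= alpha."""
    return (a + alpha * (b - a), c - alpha * (c - b))

# An expert's fuzzy estimate "about 1, certainly between 0 and 2":
print(tri_membership(1.0, 0.0, 1.0, 2.0))   # 1.0 at the peak
print(alpha_cut(0.5, 0.0, 1.0, 2.0))        # (0.5, 1.5)
```

The α-cut is what the abstract calls the "crisp set of values as carriers of the α-level": it turns a fuzzy quantity into an ordinary interval at a chosen confidence level.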
In this paper, we defend Socratic studies as a research programme against several recent attacks, including at least one recently published in Polis. Critics have argued that the study of Socrates, based upon evidence mostly or entirely derived from some set of Plato's dialogues, is founded upon faulty and indefensible historical or hermeneutical technique. We begin by identifying what we believe are the foundational principles of Socratic studies, as the field has been pursued in recent years, and we then show how the research programme that derives from accepting these principles is not defeated by any of the most common recent criticisms of it. Specifically, we argue that challenges to sorting Plato's dialogues by date, more general challenges to historicist interpretations of Plato's dialogues, as well as recent literary criticisms of Socratic studies all fail to undermine the research programme. We conclude with some thoughts about how and why Socratic studies has proved itself a valuable and fruitful research programme.
John Turri gives an example that he thinks refutes what he takes to be “G. E. Moore's view” that omissive assertions such as “It is raining but I do not believe that it is raining” are “inherently ‘absurd'”. This is that of Ellie, an eliminativist who makes such assertions. Turri thinks that these are perfectly reasonable and not even absurd. Nor does she seem irrational if the sincerity of her assertion requires her to believe its content. A commissive counterpart of Ellie is Di, a dialetheist who asserts or believes that “The Russell set includes itself but I believe that it is not the case that the Russell set includes itself”. Since any adequate explanation of Moore's paradox must handle commissive assertions and beliefs as well as omissive ones, it must deal with Di as well as engage Ellie. I give such an explanation. I argue that neither Ellie's assertion nor her belief is irrational yet both are absurd. Likewise neither Di's assertion nor her belief is irrational yet in contrast neither is absurd. I conclude that not all Moore-paradoxical assertions or beliefs are irrational and that the syntax of Moore's examples is not sufficient for the absurdity found in them.
Abstract: Background: Coronary artery disease remains the leading cause of morbidity and mortality worldwide. Previous reviews pointed out that nursing interventions are beneficial for coronary artery patients. However, most interventions focused on education and counselling, were not consistent with the outcome set, and did not consider patients' coronary artery disease risk characteristics. Related studies in China are also difficult to find. Therefore this study was conducted to investigate the kinds of nursing interventions delivered to coronary artery patients and match them with patients' risk factors of coronary artery disease. Results of this study were expected to add new knowledge that will alert nurses to consider coronary artery risk factors, which in turn might enable the development of appropriate approaches to improve patients' wellbeing and hence reduce frequent coronary artery morbidity and mortality. Methods: A descriptive, cross-sectional, retrospective design using clinical case notes was employed. The study was undertaken in coronary care wards at the teaching hospital in China from November 2017 to September 2018. A structured, literature-supported, self-designed questionnaire was utilized for data collection. The chi-square (χ2) test and multivariate logistic regression for adjusted odds ratios with 95% confidence intervals were used to examine the relationship between independent (patients' coronary artery disease risk factors) and dependent (nursing interventions) categorical variables. Ethical permission was granted accordingly. Results: A total of 300 coronary artery patients' case notes were audited, with mean age 63±11.2 years. Of these, 175 (58.3%) were males, 126 (42%) were smoking and 224 (74.7%) were hypertensive. More evidence-based nursing interventions than education and counselling were found to be delivered to these patients.
“Administer coronary artery disease medication and their instructions” was delivered to most patients (291; 97%), while “counsel to cope with stress” was the least delivered (60; 20.0%). Three of the eight nursing interventions delivered significantly matched three or all of the patients' coronary artery risk variables (age, smoking, hypertension and diabetes) (p < 0.05 and/or < 0.01), with adjusted odds ratios (95% CI) within their significant ranges. Conclusion: This study delivers valuable insight that nurses in the studied teaching hospital delivered beneficial evidence-based nursing interventions to patients with coronary artery disease, which significantly matched their risk factors of coronary artery illness. However, care for stress was low and hence needs improvement. Furthermore, research is needed to establish the consistency of nursing interventions with patients' end-point clinical outcomes for further appraisal of nursing efforts in caring for CAD patients.
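The chi-square statistic this study uses to compare categorical variables can be computed directly from a contingency table of observed counts; the table values below are illustrative, not the study's data:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table,
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns.
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative 2x2 table, e.g. intervention delivered vs. not delivered,
# split by hypertension status (not the study's actual counts):
print(round(chi_square([[10, 20], [30, 40]]), 4))  # 0.7937
```

The statistic is then compared against the chi-square distribution with (r−1)(c−1) degrees of freedom to obtain the p-values the study reports.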
Abstract: Minstrels are found in all parts of the world. They perform similar functions in their various societies. They are the custodians of the people’s history: entertainers, educationists, advisers and reconstructionists. The secret behind their success in their roles in the society lies in their ability to impress the audience during their performance. Performance plays an indispensable role in the full actualization of the story being told as a full aesthetic experience. In this paper, the writer x-rays the roles of both performance and the context of the play in the reconstruction of the society, especially in the Upper Iweka area of Onitsha town.
The study aimed to identify the impact of Design Thinking on decision making in local NGOs in the Gaza Strip. In order to achieve the objectives of the study and test its hypotheses, the descriptive analytical method was used and a questionnaire served as the main tool for data collection. The study population consisted of decision makers in the local NGOs in the Gaza Strip. The researchers chose a comprehensive inventory (census) method and used SPSS to process and analyze the data obtained through the questionnaire. The Smart-PLS program was used to construct a Structural Equation Model (SEM) to resolve the relationships between the variables of the study and to calculate the direct and indirect effects of the independent variable on the dependent variable through the intermediate variable. The study found a set of results, the most important of which is that Design Thinking has a direct and total impact on decision-making that is relevant to the decision and based on reference data for decision-making. Among the most important recommendations of the study: senior management in the local NGOs in the Gaza Strip should adopt the Design Thinking methodology in order to make sound and correct decisions, and managers should be encouraged to follow scientific methodologies in the decision-making mechanism.
The study aimed to identify the reality of process design management in the local NGOs in the Gaza Strip. In order to achieve the objectives of the study and to test its hypotheses, the analytical descriptive method was used, relying on a questionnaire as the main tool for data collection. The study population consisted of decision makers in 78 local NGOs in the Gaza Strip, and a census of the accessible study population was taken. SPSS was used to process and analyze the data obtained through the survey tool, and Smart-PLS was used to construct a structural equation model (SEM) to analyze the relationships between the variables of the study and to calculate the direct and indirect effects of the independent variable on the dependent variable through the intermediary variable. The study reached a set of results, the most important of which are: there is no direct relationship between process design management and decision-making; rather, design thinking mediates the relationship between process design management and decision making with a holistic (total) effect. The study showed the interest of local NGOs in creating a good mental image in the local community, the ownership by local NGOs of the expertise and technical skills required to implement projects, and the adoption by local NGOs of activities that meet the needs and wishes of the beneficiaries; local NGOs analyze problems and their causes through data relevant to the decision, based on reference data for decision-making. The main recommendations of the study are: the need for senior management of local NGOs in the Gaza Strip to encourage managers and employees to develop the field of process design management in their projects, enhancing the creative environment and adding competitive advantage to the organization.
The study also recommended that the senior management of local NGOs in the Gaza Strip adopt the methodology of design thinking, because it has an impact on the sustainability of projects, the design of the technical feasibility study, meeting the wishes of the beneficiaries, and the continued development of local NGOs in the Gaza Strip, so as to make sound and correct decisions, and that managers be encouraged to follow scientific methodologies in the decision-making mechanism.
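The decomposition of effects that an SEM tool such as Smart-PLS reports (direct, indirect through a mediator, and total) can be sketched with hypothetical path coefficients; the numbers below are illustrative assumptions only, chosen to mirror the study's finding of full mediation (no direct path):

```python
# Hypothetical Smart-PLS-style path coefficients (illustrative only).
a = 0.6    # process-design management -> design thinking (to mediator)
b = 0.4    # design thinking -> decision making (mediator to outcome)
c = 0.0    # direct path (zero: full mediation, as the study found)

indirect_effect = a * b             # effect transmitted via the mediator
total_effect = c + indirect_effect  # total = direct + indirect

print(f"indirect = {indirect_effect:.2f}, total = {total_effect:.2f}")
```

Full mediation is exactly the case where the direct path vanishes and the total effect coincides with the indirect one.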
The study aimed to identify the reality of decision-making in the local NGOs in the Gaza Strip. In order to achieve the objectives of the study and to test its hypotheses, the analytical descriptive method was used, relying on a questionnaire as the main tool for data collection. The study population consisted of decision makers in 78 local NGOs in the Gaza Strip. A census method of the accessible study population was adopted, and the Statistical Package for the Social Sciences (SPSS) was used for the processing and analysis of the data obtained through the survey tool. The study reached a set of results, the most important of which are: NGOs follow a decision-making mechanism and examine the problems and their causes within the organization, but are less concerned with stakeholder participation in making decisions. The main recommendations of the study are: to promote systematic methods of decision-making in NGOs in the Gaza Strip, local NGOs should continue to develop their competencies in order to make sound and correct decisions and should be encouraged to follow scientific methodologies in the decision-making mechanism.
The currently standard philosophical conception of existence makes a connection between three things: certain ways of talking about existence and being in natural language; certain natural language idioms of quantification; and the formal representation of these in logical languages. Thus a claim like ‘Prime numbers exist’ is treated as equivalent to ‘There is at least one prime number’ and this is in turn equivalent to ‘Some thing is a prime number’. The verb ‘exist’, the verb phrase ‘there is’ and the quantifier ‘some’ are treated as all playing similar roles, and these roles are made explicit in the standard common formalization of all three sentences by a single formula of first-order logic: ‘(∃x)[P(x) & N(x)]’, where ‘P(x)’ abbreviates ‘x is prime’ and ‘N(x)’ abbreviates ‘x is a number’. The logical quantifier ‘∃’ accordingly symbolizes in context the role played by the English words ‘exists’, ‘some’ and ‘there is’.
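Read over a finite domain, the formalization ‘(∃x)[P(x) & N(x)]’ can be evaluated mechanically; a small illustrative sketch (the predicate names follow the abstract, the toy domain is our choice):

```python
def is_prime(n):
    """P(x): 'x is prime'."""
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def is_number(x):
    """N(x): 'x is a number' -- trivially true of everything in this
    numeric toy domain."""
    return True

domain = range(10)   # a toy domain of quantification

# (∃x)[P(x) & N(x)]: 'some thing is a prime number'
exists_prime_number = any(is_prime(x) and is_number(x) for x in domain)
print(exists_prime_number)  # True
```

`any(...)` here plays exactly the role of ‘∃’: the formula is true just in case some element of the domain satisfies both predicates.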
The iterative conception of set is typically considered to provide the intuitive underpinnings for ZFCU (ZFC+Urelements). It is an easy theorem of ZFCU that all sets have a definite cardinality. But the iterative conception seems to be entirely consistent with the existence of “wide” sets, sets (of, in particular, urelements) that are larger than any cardinal. This paper diagnoses the source of the apparent disconnect here and proposes modifications of the Replacement and Powerset axioms so as to allow for the existence of wide sets. Drawing upon Cantor’s notion of the absolute infinite, the paper argues that the modifications are warranted and preserve a robust iterative conception of set. The resulting theory is proved consistent relative to ZFC + “there exists an inaccessible cardinal number”.
Despite the importance of the variational principles of physics, there have been relatively few attempts to consider them for a realistic framework. In addition to the old teleological question, this paper continues the recent discussion regarding the modal involvement of the principle of least action and its relations with the Humean view of the laws of nature. The reality of possible paths in the principle of least action is examined from the perspectives of the contemporary metaphysics of modality and Leibniz's concept of essences or possibles striving for existence. I elaborate a modal interpretation of the principle of least action that replaces a classical representation of a system's motion along a single history in the actual modality by simultaneous motions along an infinite set of all possible histories in the possible modality. This model is based on an intuition that deep ontological connections exist between the possible paths in the principle of least action and possible quantum histories in the Feynman path integral. I interpret the action as a physical measure of the essence of every possible history. Therefore only one actual history has the highest degree of the essence and minimal action. To address the issue of necessity, I assume that the principle of least action has a general physical necessity and lies between the laws of motion with a limited physical necessity and certain laws with a metaphysical necessity.
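For reference, the principle under discussion is the stationarity of the classical action, and the Feynman path integral the author appeals to sums over all possible histories weighted by that same action (standard formulations, not the author's notation):

```latex
S[q] \;=\; \int_{t_1}^{t_2} L\bigl(q(t), \dot q(t), t\bigr)\, dt ,
\qquad
\delta S = 0 \;\Longrightarrow\;
\frac{d}{dt}\,\frac{\partial L}{\partial \dot q}
  - \frac{\partial L}{\partial q} = 0 ,
\qquad
K \;\propto\; \sum_{\text{paths } q} e^{\, i S[q]/\hbar} .
```

The classical, actual history is the one at which $S$ is stationary; in the quantum sum every possible history contributes, which is the formal basis of the ontological connection the paper draws.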
I consider the first-order modal logic which counts as valid those sentences which are true on every interpretation of the non-logical constants. Based on the assumptions that it is necessary what individuals there are and that it is necessary which propositions are necessary, Timothy Williamson has tentatively suggested an argument for the claim that this logic is determined by a possible world structure consisting of an infinite set of individuals and an infinite set of worlds. He notes that only the cardinalities of these sets matter, and that not all pairs of infinite sets determine the same logic. I use so-called two-cardinal theorems from model theory to investigate the space of logics and consequence relations determined by pairs of infinite sets, and show how to eliminate the assumption that worlds are individuals from Williamson’s argument.