I explore the possibility of a structuralist interpretation of homotopy type theory (HoTT) as a foundation for mathematics. There are two main aspects to HoTT's structuralist credentials. First, it builds on categorical set theory (CST), of which the best-known variant is Lawvere's ETCS. I argue that CST has merit as a structuralist foundation, in that it ascribes only structural properties to typical mathematical objects. However, I also argue that this success depends on the adoption of a strict typing system which undermines the metaphysical seriousness of this structuralism. Homotopy type theory adds to CST a distinctive theory of identity between sets, which arguably allows its objects to be seen as ante rem structures. I examine the prospects for such a view, and address many other interpretive problems as they arise.
Most moral theories share certain features with other theories. They consist of a set of propositions that are universal, general, and hence impartial. The propositions that constitute a typical moral theory are (1) universal, in that they apply to all subjects designated as within their scope. They are (2) general, in that they include no proper names or definite descriptions. They are therefore (3) impartial, in that they accord no special privilege to any particular agent's situation which cannot be justified under (1) and (2). These three features do not distinguish moral theories from other theories, nor indeed from most general categorical propositions we assert. Yet, in recent years, these features of moral theories have been the target of a certain concerted and sustained criticism, namely, that to be committed to such a moral theory, or to aspire to act in accordance with its requirements, results in what has come to be known as moral alienation. Moral alienation, according to this criticism, consists in (i) viewing one's ground projects from an impersonal, "moral point of view" engendered by one's acceptance of the theory; (ii) being prepared to sacrifice these projects to the requirements of moral principle; and (iii) making such a sacrifice specifically and self-consciously in order to conform to these requirements. Moral alienation is said to manifest itself in one (or both) of two ways, depending on the nature of the project thus susceptible to sacrifice. One may be alienated from oneself, if the project consists of tastes, convictions, or aspirations that are centrally definitive of one's self. In this case one's commitment to the project can be at best conditional on its congruence with one's moral theory. It is claimed that this must make for a rather tepid and unenthusiastic commitment indeed.
Alternatively, one may be alienated from others, if the project is an interpersonal relationship such as a friendship, marriage, or collegial relationship. In this case one's responses to the other are motivated by one's awareness of what one's moral theory requires. It is claimed that this obstructs a genuine and unmediated emotional response to the other as such. My aim here will be to argue that this very compelling criticism - call it the moral-alienation criticism - is nevertheless misdirected. The real culprit is not any particular moral theory, but rather a certain familiar personality type that may or may not adopt it.
In this paper, the intuitionistic set theory INC# in an infinitary set-theoretical language is considered. An external induction principle in nonstandard intuitionistic arithmetic is derived. A nontrivial application in number theory is considered.
Philosophers of science since Nagel have been interested in the links between intertheoretic reduction and explanation, understanding and other forms of epistemic progress. Although intertheoretic reduction is widely agreed to occur in pure mathematics as well as empirical science, the relationship between reduction and explanation in the mathematical setting has rarely been investigated in a similarly serious way. This paper examines an important particular case: the reduction of arithmetic to set theory. I claim that the reduction is unexplanatory. In defense of this claim, I offer evidence from mathematical practice, and I respond to contrary suggestions due to Steinhart, Maddy, Kitcher and Quine. I then show how, even if set-theoretic reductions are generally not explanatory, set theory can nevertheless serve as a legitimate foundation for mathematics. Finally, some implications of my thesis for philosophy of mathematics and philosophy of science are discussed. In particular, I suggest that some reductions in mathematics are probably explanatory, and I propose that differing standards of theory acceptance might account for the apparent lack of unexplanatory reductions in the empirical sciences.
A possible world is a junky world if and only if each thing in it is a proper part. The possibility of junky worlds contradicts the principle of general fusion. Bohn (2009) argues for the possibility of junky worlds; Watson (2010) suggests that Bohn's arguments are flawed. This paper shows that the arguments of both authors leave much to be desired. First, relying on the classical results of Cantor, Zermelo, Fraenkel, and von Neumann, this paper proves the possibility of junky worlds for certain weak set theories. Second, the paradox of Burali-Forti shows that according to the Zermelo-Fraenkel set theory ZF, junky worlds are possible. Finally, it is shown that set theories are not the only sources for designing plausible models of junky worlds: topology (and possibly other "algebraic" mathematical theories) may be used to construct models of junky worlds. In sum, junkiness is a relatively widespread feature among possible worlds.
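The Burali-Forti argument that the abstract appeals to can be summarized in a few lines (a standard reconstruction, supplied here for orientation rather than taken from the paper itself):

```latex
% Suppose, for contradiction, that the ordinals formed a set $\mathrm{On}$.
% $\mathrm{On}$ is transitive and well-ordered by $\in$, so it would itself
% be an ordinal, giving
\[
  \mathrm{On} \in \mathrm{On},
\]
% which contradicts the fact that no ordinal is a member of itself.
% Hence in ZF there is no set of all ordinals: the ordinals ascend without
% any total ``fusion'' above them, a structural analogue of a junky world.
```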
Instead of the half-century-old foundational feud between set theory and category theory, this paper argues that they are theories about two different complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
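The role of negated self-predication in the antinomies can be made explicit in a short derivation (the standard form of Russell's paradox, stated here in the abstract's own notation as an illustration):

```latex
% Apply unrestricted comprehension to the negated-self-predication
% property $F(x) :\equiv x \notin x$, obtaining the universal
\[
  u_{F} \;=\; \{\, x \mid x \notin x \,\}.
\]
% Instantiating the defining condition at $u_{F}$ itself yields
\[
  u_{F} \in u_{F} \;\Longleftrightarrow\; u_{F} \notin u_{F},
\]
% a contradiction. The positive condition $x \in x$, by contrast,
% generates no such antinomy.
```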
Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras based on a certain kind of implication which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF, and can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3,*) coincides (up to language) with da Costa and D'Ottaviano logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors and with different motivations, such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [(w = w)] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder. This suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3,*), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
The notion of equality between two observables plays many important roles in the foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum theory to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
Here, we analyse some recent applications of set theory to topology and argue that set theory is not only the closed domain where mathematics is usually founded, but also a flexible framework where imperfect intuitions can be precisely formalized and technically elaborated before they possibly migrate toward other branches. This apparently new role is mostly reminiscent of the one played by other external fields like theoretical physics, and we think that it could contribute to revitalizing the interest in set theory in the future.
Relevance logic has become ontologically fertile. No longer is the idea of relevance restricted in its application to purely logical relations among propositions, for as Dunn has shown in his (1987), it is possible to extend the idea in such a way that we can distinguish also between relevant and irrelevant predications, as for example between “Reagan is tall” and “Reagan is such that Socrates is wise”. Dunn shows that we can exploit certain special properties of identity within the context of standard relevance logic in a way which allows us to discriminate further between relevant and irrelevant properties, as also between relevant and irrelevant relations. The idea yields a family of ontologically interesting results concerning the different ways in which attributes and objects may hang together. Because of certain notorious peculiarities of relevance logic, however, Dunn’s idea breaks down where the attempt is made to have it bear fruit in application to relations among entities which are of homogeneous type.
Cognitive Set Theory is a mathematical model of cognition which equates sets with concepts, and uses mereological elements. It has a holistic emphasis, as opposed to a reductionistic emphasis, and it therefore begins with a single universe (as opposed to an infinite collection of infinitesimal points).
We consider the foundational relation between arithmetic and set theory. Our goal is to criticize the construction of standard arithmetic models as providing grounds for arithmetic truth (even in a relative sense). Our method is to emphasize the incomplete picture of both theories and treat models as their syntactical counterparts. Insisting on the incomplete picture will allow us to argue in favor of the revisability of the standard model interpretation. We then show that it is hopeless to expect that the relative grounding provided by a standard interpretation can resist being revisable. We start by briefly characterizing the expansion of arithmetic `truth' provided by the interpretation in a set theory. Further, we show that, for every well-founded interpretation of recursive extensions of PA in extensions of ZF, the interpreted version of arithmetic has more theorems than the original. This theorem expansion is not complete, however. We continue by defining the coordination problem. The problem can be summarized as follows. We consider two independent communities of mathematicians responsible for deciding over new axioms for ZF and PA. How likely are they to be coordinated regarding PA’s interpretation in ZF? We prove that it is possible to have extensions of PA not interpretable in a given set theory ST. We further show that the probability of a random extension of arithmetic being interpretable in ST is zero.
Set-theoretic and category-theoretic foundations represent different perspectives on mathematical subject matter. In particular, category-theoretic language focusses on properties that can be determined up to isomorphism within a category, whereas set theory admits of properties determined by the internal structure of the membership relation. Various objections have been raised against this aspect of set theory in the category-theoretic literature. In this article, we advocate a methodological pluralism concerning the two foundational languages, and provide a theory that fruitfully interrelates a `structural' perspective to a set-theoretic one. We present a set-theoretic system that is able to talk about structures more naturally, and argue that it provides an important perspective on plausibly structural properties such as cardinality. We conclude that the language of set theory can provide useful information about the notion of mathematical structure.
The purpose of this article is to present several immediate consequences of the introduction of a new constant called Lambda, in order to represent the object "nothing" or "void" into a standard set theory. The use of Lambda will appear natural thanks to its role as a condition of possibility of sets. On a conceptual level, the use of Lambda leads to a legitimation of the empty set and to a redefinition of the notion of set. It also makes clearly apparent the distinction between the empty set, the nothing, and the ur-elements. On a technical level, we introduce the notion of pre-element and we suggest a formal definition of the nothing, distinct from that of the null-class. Among other results, we get a relative resolution of the anomaly of the intersection of a family free of sets and the possibility of building the empty set from "nothing". The theory is presented with equiconsistency results. On both conceptual and technical levels, the introduction of Lambda leads to a resolution of Russell's puzzle of the null-class.
The original purpose of the present study (2011), which started with the preprint «On the Probable Failure of the Uncountable Power Set Axiom» (1988), is to save from the transfinite deadlock of higher set theory the jewel of the mathematical Continuum — this genuine, even if mostly forgotten today, raison d’être of all traditional set-theoretical enterprises to Infinity and beyond, from Georg Cantor to David Hilbert to Kurt Gödel to W. Hugh Woodin to Buzz Lightyear.
In this paper we employ set theory to study the formal aspects of quantum mechanics without explicitly making use of space-time. It is demonstrated that von Neumann and Zermelo numeral sets, previously effectively used in the explanation of Hardy’s paradox, follow a Heisenberg quantum form. Here monadic union plays the role of the time derivative. The logical counterpart of monadic union plays the part of the Hamiltonian in the commutator. The use of numerals and monadic union in the classical probability resolution of Hardy’s paradox [1] is supported with the present derivation of a commutator for sets.
The purpose of this paper is to show that the Elementary Process Theory (EPT) agrees with the knowledge of the physical world obtained from the successful predictions of Special Relativity (SR). To that end, a recently developed method is applied: a categorical model of the EPT that incorporates SR is fully specified. The ultimate constituents of the universe of the EPT are modeled as point-particles, gamma-rays, or time-like strings, all represented by integrable hyperreal functions on Minkowski space. This proves that the EPT agrees with SR.
Plato’s philosophy is important to Badiou for a number of reasons, chief among which is that Badiou considered Plato to have recognised that mathematics provides the only sound or adequate basis for ontology. The mathematical basis of ontology is central to Badiou’s philosophy, and his engagement with Plato is instrumental in determining how he positions his philosophy in relation to those approaches to the philosophy of mathematics that endorse an orthodox Platonic realism, i.e. the independent existence of a realm of mathematical objects. The Platonism that Badiou makes claim to bears little resemblance to this orthodoxy. Like Plato, Badiou insists on the primacy of the eternal and immutable abstraction of the mathematico-ontological Idea; however, Badiou’s reconstructed Platonism champions the mathematics of post-Cantorian set theory, which itself affirms the irreducible multiplicity of being. Badiou in this way reconfigures the Platonic notion of the relation between the one and the multiple in terms of the multiple-without-one as represented in the axiom of the void or empty set. Rather than engage with the Plato that is figured in the ontological realism of the orthodox Platonic approach to the philosophy of mathematics, Badiou is intent on characterising the Plato that responds to the demands of a post-Cantorian set theory, and he considers Plato’s philosophy to provide a response to such a challenge. In effect, Badiou reorients mathematical Platonism from an epistemological to an ontological problematic, a move that relies on the plausibility of rejecting the empiricist ontology underlying orthodox mathematical Platonism. To draw a connection between these two approaches to Platonism and to determine what sets them radically apart, this paper focuses on the use that they each make of model theory to further their respective arguments.
In order to explain Wittgenstein’s account of the reality of completed infinity in mathematics, a brief overview of Cantor’s initial injection of the idea into set theory, its trajectory, and the philosophic implications he attributed to it will be presented. Subsequently, we will first expound Wittgenstein’s grammatical critique of the use of the term ‘infinity’ in common parlance and its conversion into a notion of an actually existing infinite ‘set’. Secondly, we will delve into Wittgenstein’s technical critique of the concept of ‘denumerability’ as it is presented in set theory, as well as his philosophic refutation of Cantor’s Diagonal Argument and the implications of such a refutation for the problems of the Continuum Hypothesis and Cantor’s Theorem. Throughout, the discussion will be placed within the historical and philosophical framework of the Grundlagenkrise der Mathematik and Hilbert’s problems.
Most set theorists accept AC and reject AD; i.e., for them, AC is true in the "world of sets", and AD is false. Applying to set theory the above-mentioned formalistic explanation of the existence of quarks, we could say: if set theorists continue to believe in AC for a long time in the future, then one may think of a unique "world of sets" as existing in the same sense as quarks are believed to exist.
This paper responds to recent work in the philosophy of Homotopy Type Theory by James Ladyman and Stuart Presnell. They consider one of the rules for identity, path induction, and justify it along ‘pre-mathematical’ lines. I give an alternate justification based on the philosophical framework of inferentialism. Accordingly, I construct a notion of harmony that allows the inferentialist to say when a connective or concept is meaning-bearing, and this conception unifies most of the prominent conceptions of harmony through category theory. This categorical harmony is stated in terms of adjoints and says that any concept definable by iterated adjoints from general categorical operations is harmonious. Moreover, it has been shown that identity in a categorical setting is determined by an adjoint in the relevant way. Furthermore, path induction as a rule comes from this definition. Thus we arrive at an account of how path induction, as a rule of inference governing identity, can be justified on mathematically motivated grounds.
A practical viewpoint links reality, representation, and language to calculation via the concept of the Turing (1936) machine, the mathematical model of our computers. After the Gödel incompleteness theorems (1931) and the insolvability of the so-called halting problem (Turing 1936; Church 1936) for a classical Turing machine, one of the simplest hypotheses is to suggest completeness for a pair of them. That is consistent with the provability of completeness by means of two independent Peano arithmetics, discussed in Section I. Many modifications of Turing machines, including quantum ones, are examined in Section II with respect to the halting problem and completeness, and the model of two independent Turing machines seems to generalize them. Then, that pair can be postulated as the formal definition of reality, therefore being complete, unlike either of them standalone, which remains incomplete without its complementary counterpart. Representation is formally defined as a one-to-one mapping between the two Turing machines, and the set of all those mappings can be considered as “language”, therefore including metaphors as mappings different from representation. Section III investigates that formal relation of “reality”, “representation”, and “language” modeled by (at least two) Turing machines. The independence of (two) Turing machines is interpreted by means of game theory, and especially of the Nash equilibrium, in Section IV. Choice, and information as the quantity of choices, are involved. That approach seems to be equivalent to one based on set theory and the concept of actual infinity in mathematics, while allowing practical implementations.
DEFINING OUR TERMS
A “paradox” is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory” is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis” is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis” is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since ‘appears’, ‘believes’ and ‘knows’ all make elliptical reference to a participant, it is clear that ‘paradox’, ‘indirect proof’ and ‘direct proof’ are all participant-relative.

PARTICIPANT RELATIVITY
In normal mathematical writing the participant is presumed to be “the community of mathematicians” or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof.

WHAT IS A PARADOX’S SOLUTION?
Of the above four sorts of argumentation only the paradox invites “solution” or “resolution”, and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning” of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true.
Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy”, as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant's state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time.

The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell's Principles of mathematics [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell's article on type theory as well as Ernst Zermelo's two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal”, “the largest ordinal”, the well-ordering principle, “the well-ordering of the continuum”, denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion.

RECOMMENDATION
This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading.
AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and logic” [249], which covers much of the same ground in much more detail: https://www.academia.edu/14089432/Argumentations_and_Logic .
This is a chapter of the planned monograph "Out of Nowhere: The Emergence of Spacetime in Quantum Theories of Gravity", co-authored by Nick Huggett and Christian Wüthrich and under contract with Oxford University Press. (More information at www<dot>beyondspacetime<dot>net.) This chapter introduces causal set theory and identifies and articulates a 'problem of space' in this theory.
Set-theoretic pluralism is an increasingly influential position in the philosophy of set theory (Balaguer [1998], Linsky and Zalta [1995], Hamkins [2012]). There is considerable room for debate about how best to formulate set-theoretic pluralism, and even about whether the view is coherent. But there is widespread agreement as to what there is to recommend the view (given that it can be formulated coherently). Unlike set-theoretic universalism, set-theoretic pluralism affords an answer to Benacerraf’s epistemological challenge. The purpose of this paper is to determine what Benacerraf’s challenge could be such that this view is warranted. I argue that it could not be any of the challenges with which it has been traditionally identified by its advocates, like those of Benacerraf and Field. Not only is none of those challenges easier for the pluralist to meet; none satisfies a key constraint that has been placed on Benacerraf’s challenge. However, I argue that Benacerraf’s challenge could be the challenge to show that our set-theoretic beliefs are safe – i.e., to show that we could not have easily had false ones. Whether the pluralist is, in fact, better positioned to show that our set-theoretic beliefs are safe turns on a broadly empirical conjecture which is outstanding. If this conjecture proves to be false, then it is unclear what the epistemological argument for set-theoretic pluralism is supposed to be.
The primary concern of this paper is to outline an explanation of how Kant derives morality from reason. We all know that Kant thought that morality comprises a set of demands that are unconditionally and universally valid. In addition, he thought that to support this understanding of moral principles, one must show that they originate in reason a priori, rather than in contingent facts about human psychology, or the circumstances of human life. But do we really understand how he tries to establish that moral principles originate in reason? In at least two passages in the second section of the Groundwork, Kant insists upon the importance of grounding the moral law in practical reason a priori, and subsequently states a conception of practical reason from which he appears to extract a formulation of the Categorical Imperative. The reasoning employed in these passages would appear to be of central importance to the overall argument of the Groundwork, but in each case the route travelled from the definition of practical reason to the ensuing formulation of the moral law is obscure. My goal is to work out a plausible reconstruction of this portion of Kant’s argument. At the very least, I hope that my interpretation will illuminate the distinctive structure of the Kantian approach to questions of justification in ethics. What I understand of Kant’s view leads me to believe that its aims and overall shape are different in important respects from what is often assumed. It also represents an approach to foundational issues in ethics, which provides an alternative to many contemporary attempts to ground morality in reason.
In the contemporary philosophy of set theory, discussion of new axioms that purport to resolve independence necessitates an explanation of how they come to be justified. Ordinarily, justification is divided into two broad kinds: intrinsic justification relates to how `intuitively plausible' an axiom is, whereas extrinsic justification supports an axiom by identifying certain `desirable' consequences. This paper puts pressure on how this distinction is formulated and construed. In particular, we argue that the distinction as often presented is neither well-demarcated nor sufficiently precise. Instead, we suggest that the process of justification in set theory should not be thought of as neatly divisible in this way, but should rather be understood as a conceptually indivisible notion linked to the goal of explanation.
The concepts of choice, negation, and infinity are considered jointly. The link is the quantity of information, interpreted as the quantity of choices measured in units of elementary choice: a bit is an elementary choice between two equally probable alternatives. “Negation” supposes a choice between it and confirmation. Thus quantity of information can also be interpreted as quantity of negations. The disjunctive choice between confirmation and negation as to infinity can be chosen or not in turn: this corresponds to the set-theoretic or intuitionist approach to the foundation of mathematics, and to Peano or Heyting arithmetic. Quantum mechanics can be reformulated in terms of information by introducing the concept and quantity of quantum information. A qubit can be equivalently interpreted as that generalization of “bit” where the choice is among an infinite set or series of alternatives. The complex Hilbert space can be represented as both a series of qubits and a value of quantum information. The complex Hilbert space is that generalization of Peano arithmetic where any natural number is substituted by a qubit. “Negation”, “choice”, and “infinity” can be inherently linked to each other, both in the foundation of mathematics and in quantum mechanics, by the mediation of “information” and “quantum information”.
The link between higher-order metaphysics and abstraction, on the one hand, and choice in the foundations of set theory, on the other, can distinguish unambiguously the “good” principles of abstraction from the “bad” ones and thus resolve the “bad company problem” as to set theory. It thereby implies a correspondingly more precise definition of the relation between the axiom of choice and the “whole company” of axioms in set theory concerning abstraction directly or indirectly: the principle of abstraction, the axiom of comprehension, the axiom scheme of specification, the axiom scheme of separation, the subset axiom scheme, the axiom scheme of replacement, the axiom of unrestricted comprehension, the axiom of extensionality, etc.
One of the traditional desiderata for a metaphysical theory of laws of nature is that it be able to explain natural regularities. Some philosophers have postulated governing laws to fill this explanatory role. Recently, however, many have attempted to explain natural regularities without appealing to governing laws. Suppose that some fundamental properties are bare dispositions. In virtue of their dispositional nature, these properties must be (or are likely to be) distributed in regular patterns. Thus it would appear that an ontology including bare dispositions can dispense with governing laws of nature. I believe that there is a problem with this line of reasoning. In this essay, I’ll argue that governing laws are indispensable for the explanation of a special sort of natural regularity: those holding among categorical properties (or, as I’ll call them, categorical regularities). This has the potential to be a serious objection to the denial of governing laws, since there may be good reasons to believe that observed regularities are categorical regularities.
The paper introduces and utilizes a few new concepts: “nonstandard Peano arithmetic”, “complementary Peano arithmetic”, “Hilbert arithmetic”. They identify the foundations of both mathematics and physics, demonstrating the equivalence of the newly introduced Hilbert arithmetic and the separable complex Hilbert space of quantum mechanics, which in turn underlies physics and all the world. That new, both mathematical and physical, ground can be recognized as information complemented and generalized by quantum information. A few fundamental mathematical problems of the present, such as Fermat’s last theorem, the four-color theorem as well as its newly formulated generalization as the “four-letter theorem”, Poincaré’s conjecture, and “P vs NP”, are considered over again, from and within the newly founded conceptual reference frame of information, as illustrations. Simple or crucially simplifying solutions and proofs are demonstrated. The link between the consistent completeness of the system mathematics-physics on the ground of information and all the great mathematical problems of the present (rather than only the enumerated ones) is suggested.
This paper demonstrates something that Kant notoriously claimed to be possible, but which Kant scholars today widely believe to be impossible: unification of all three formulations of the Categorical Imperative. Part 1 of this paper tells a broad-brush story of how I understand Kant’s theory of practical reason and morality, showing how the three formulations of the Categorical Imperative appear to be unified. Part 2 then provides clear textual support for each premise in the argument for my interpretation.
I develop a new theory of properties by considering two central arguments in the debate whether properties are dispositional or categorical. The first claims that objects must possess categorical properties in order to be distinct from empty space. The second argument, however, points out several untoward consequences of positing categorical properties. I explore these arguments and argue that despite appearances, their conclusions need not be in conflict with one another. In particular, we can view the second argument as supporting only the claim that there is not a plurality of categorical properties, and not the stronger claim that there are no categorical properties whatsoever. I then develop a new account of properties which capitalizes on this insight.
A Cantorian argument shows that there is no set of all truths. For the same reason, there is no possible world conceived as a maximal set of propositions, and omniscience is logically impossible.
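A hedged sketch of the Cantorian reasoning behind this abstract (the particular injection chosen here is one standard illustration, in the style of Grim's argument, not necessarily the author's own):

```latex
Suppose $T$ were the set of all truths. For each subset $S \subseteq T$,
the proposition ``$S \subseteq T$'' is true, and distinct subsets yield
distinct propositions of this form. This defines an injection
$\mathcal{P}(T) \hookrightarrow T$, so $|\mathcal{P}(T)| \le |T|$,
contradicting Cantor's theorem that $|T| < |\mathcal{P}(T)|$ for every
set $T$. Hence there is no set of all truths; a fortiori, no maximal
set of true propositions (a ``possible world'' so construed) exists.
```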
Categorical foundations and set-theoretical foundations are sometimes presented as alternative foundational schemes. So far, the literature has mostly focused on the weaknesses of the categorical foundations. We want here to concentrate on what we take to be one of its strengths: the explicit identification of so-called canonical maps and their role in mathematics. Canonical maps play a central role in contemporary mathematics and although some are easily defined by set-theoretical tools, they all appear systematically in a categorical framework. The key element here is the systematic nature of these maps in a categorical framework and I suggest that, from that point of view, one can see an architectonic of mathematics emerging clearly. Moreover, they force us to reconsider the nature of mathematical knowledge itself. Thus, to understand certain fundamental aspects of mathematics, category theory is necessary (at least, in the present state of mathematics).
It is a striking fact from reverse mathematics that almost all theorems of countable and countably representable mathematics are equivalent to just five subsystems of second order arithmetic. The standard view is that the significance of these equivalences lies in the set existence principles that are necessary and sufficient to prove those theorems. In this article I analyse the role of set existence principles in reverse mathematics, and argue that they are best understood as closure conditions on the powerset of the natural numbers.
The concept of “life” certainly is of some use to distinguish birds and beavers from water and stones. This pragmatic usefulness has led to its construal as a categorical predicate that can sift out living entities from non-living ones depending on their possessing specific properties—reproduction, metabolism, evolvability etc. In this paper, we argue against this binary construal of life. Using text-mining methods across over 30,000 scientific articles, we defend instead a degrees-of-life view and show how these methods can contribute to experimental philosophy of science and concept explication. We apply topic-modeling algorithms to identify which specific properties are attributed to a target set of entities (bacteria, archaea, viruses, prions, plasmids, phages and the molecule of adenine). Eight major clusters of properties were identified together with their relative relevance for each target entity (two that relate to metabolism and catalysis, one to genetics, one to evolvability, one to structure, and—rather unexpectedly—three that concern interactions with the environment broadly construed). While aligning with intuitions—for instance about viruses being less alive than bacteria—these quantitative results also reveal differential degrees of performance that have so far remained elusive or overlooked. Taken together, these analyses provide a conceptual “lifeness space” that makes it possible to move away from a categorical construal of life by empirically assessing the relative lifeness of more-or-less alive entities.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
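The quantitative step the abstract describes—repeating Boole's move from subset logic to logical finite probability on the partition side—yields the logical entropy of a partition, h(π) = 1 − Σ_B (|B|/|U|)², i.e. the probability that two independent equiprobable draws from the universe U land in distinct blocks of π. A minimal sketch, assuming a finite universe with equiprobable points (the function name is mine):

```python
from fractions import Fraction

def logical_entropy(partition, universe):
    """h(pi) = 1 - sum over blocks B of (|B|/|U|)^2: the probability
    that two independent equiprobable draws from U fall in *distinct*
    blocks, i.e. that the partition "distinguishes" the pair."""
    n = len(universe)
    return 1 - sum(Fraction(len(block), n) ** 2 for block in partition)

U = set(range(4))
print(logical_entropy([U], U))                    # indiscrete partition: 0
print(logical_entropy([{0, 1}, {2, 3}], U))       # 1/2
print(logical_entropy([{0}, {1}, {2}, {3}], U))   # discrete partition: 3/4
```

Refining a partition never decreases h, which mirrors the abstract's point that numerical attributes become more definite as their inverse-image partition becomes more refined.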
Naturalism, as Binmore understands the term, is characterized by a scientific stance on moral behavior. Binmore claims that a naturalistic account of morality necessarily goes with the conviction “that only hypothetical imperatives make any sense”. In this paper it is argued that this claim is mistaken. First, as Hume’s theory of promising shows, naturalism in Binmore’s sense is entirely compatible with acknowledging the importance of categorical imperatives in moral practice. Second, if Binmore’s own theory of moral practice and its evolution is correct, then actual moral practice does—and in fact must—incorporate norms which have the form of a categorical imperative. Categorical imperatives are part of social reality and, therefore, any moral theory that adequately reflects moral practice must also include categorical imperatives.
The ontological analysis of concrete particulars deals with the relationship between concrete objects and their attributes. Both Peter Simons's Nuclear Theory and the Aristotelian Substance Theory can offer an acceptable explanation for the following three problems: the Identity of Indiscernibles, Excessive Necessitism, and Change. In spite of the superiority of these two theories over the alternatives, each has problems that need to be addressed. The Aristotelian theory does not seem very successful in showing that species (kinds) are unchangeable, or at least does not explain why they are; and Simons's theory, in some sense, collapses back into the Aristotelian theory. While outlining the challenges facing any such theory, the paper's final suggestion is that, drawing on Simons's theory and introducing a relation among universals (universal whatness), the Aristotelian theory can be modified so as to overcome the challenge of the irreducibility of species (kinds).
This essay examines the philosophical significance of Ω-logic in Zermelo-Fraenkel set theory with choice (ZFC). The dual isomorphism between algebra and coalgebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The modal profile of Ω-logical validity can then be countenanced within a coalgebraic logic, and Ω-logical validity can be defined via deterministic automata. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic and modal profiles of Ω-logical validity correspond to those of second-order logical consequence, Ω-logical validity is genuinely logical, and thus vindicates a neo-logicist conception of mathematical truth in the set-theoretic multiverse. Second, the foregoing provides a modal-computational account of the interpretation of mathematical vocabulary, adducing in favor of a realist conception of the cumulative hierarchy of sets.
Future Logic is an original, wide-ranging treatise on formal logic. It deals with deduction and induction, with categorical and conditional propositions, involving the natural, temporal, extensional, and logical modalities. Traditional and modern logic have covered in detail only formal deduction from actual categoricals, or from logical conditionals (conjunctives, hypotheticals, and disjunctives). Deduction from modal categoricals has also been considered, though very vaguely and roughly; whereas deduction from natural, temporal and extensional forms of conditioning has been all but totally ignored. As for induction, apart from the elucidation of adductive processes (the scientific method), almost no formal work has been done. This is the first work ever to strictly formalize the inductive processes of generalization and particularization, through the novel methods of factorial analysis, factor selection and formula revision. It is also the first work ever to develop a formal logic of the natural, temporal and extensional types of conditioning (as distinct from logical conditioning), including their production from modal categorical premises. Future Logic contains a great many other new discoveries, organized into a unified, consistent and empirical system, with precise definitions of the various categories and types of modality (including logical modality), and full awareness of the epistemological and ontological issues involved. Though strictly formal, it uses ordinary language wherever symbols can be avoided. Among its other contributions: a full list of the valid modal syllogisms (which is more restrictive than previous lists); the main formalities of the logic of change (which introduces a dynamic instead of merely static approach to classification); the first formal definitions of the modal types of causality; a new theory of class logic, free of the Russell Paradox; as well as a critical review of modern metalogic.
But it is impossible to list briefly all the innovations in logical science — and therefore, epistemology and ontology — this book presents; it has to be read for its scope to be appreciated.
Scientific antirealists run the argument from underconsideration against scientific realism. I argue that the argument from underconsideration backfires on antirealists’ positive philosophical theories, such as the contextual theory of explanation (van Fraassen, 1980), the English model of rationality (van Fraassen, 1989), the evolutionary explanation of the success of science (Wray, 2008; 2012), and explanatory idealism (Khalifa, 2013). Antirealists strengthen the argument from underconsideration with the pessimistic induction against current scientific theories. In response, I construct a pessimistic induction against antirealists: since antirealists generated problematic philosophical theories in the past, they must be generating problematic philosophical theories now.