Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras based on a certain kind of implication which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF, and can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3,*) coincides (up to language) with da Costa and D'Ottaviano's logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors and with different motivations, such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [[w = w]] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder. This suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3,*), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
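The twist construction mentioned in the abstract above can be illustrated concretely. Below is a minimal Python sketch of a twist structure over the two-element Boolean algebra, following the standard twist-structure recipe (pairs of truth/falsity components, with a paraconsistent negation that swaps them); this is a generic illustration, not the paper's specific LPT0 definition, and all names are ours.

```python
# A hedged sketch of a twist structure over the two-element Boolean
# algebra {0, 1}.  Elements are pairs (a, b) with a OR b = 1, read as
# (evidence for, evidence against); negation simply swaps components.
from itertools import product

B = (0, 1)  # the two-element Boolean algebra

# (1,0) ~ true, (0,1) ~ false, (1,1) ~ inconsistent ("both")
ELEMENTS = [(a, b) for a, b in product(B, B) if a | b == 1]

def neg(x):
    """Paraconsistent negation: swap the two components."""
    a, b = x
    return (b, a)

def conj(x, y):
    """Conjunction: meet the truth parts, join the falsity parts."""
    return (x[0] & y[0], x[1] | y[1])

def disj(x, y):
    """Disjunction: join the truth parts, meet the falsity parts."""
    return (x[0] | y[0], x[1] & y[1])

def designated(x):
    """An element is designated iff its truth component is 1."""
    return x[0] == 1

# The inconsistent value (1, 1) is designated together with its negation:
w = (1, 1)
assert designated(w) and designated(neg(w))
```

The value (1, 1) plays the role of the middle value 1/2 of the 3-valued logic: it and its negation both hold, which is the hallmark of paraconsistency.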
Deontic logic is devoted to the study of logical properties of normative predicates such as permission, obligation and prohibition. Since it is usual to apply these predicates to actions, many deontic logicians have proposed formalisms where actions and action combinators are present. Some standard action combinators are action conjunction, choice between actions and not doing a given action. These combinators resemble Boolean operators, and therefore the theory of Boolean algebra offers a well-known mathematical framework to study the properties of the classic deontic operators when applied to actions. In his seminal work, Segerberg uses constructions coming from Boolean algebras to formalize the usual deontic notions. Segerberg’s work provided the initial step to understanding the logical properties of deontic operators when they are applied to actions. In recent years, other authors have proposed related logics. In this chapter we introduce Segerberg’s work, study related formalisms and investigate further challenges in this area.
The paper considers a generalization of Peano arithmetic, Hilbert arithmetic, as the basis of the world in a Pythagorean manner. Hilbert arithmetic unifies the foundations of mathematics (Peano arithmetic and set theory), the foundations of physics (quantum mechanics and information), and philosophical transcendentalism (Husserl’s phenomenology) into a formal theory and mathematical structure, literally following Husserl’s trace of “philosophy as a rigorous science”. On the pathway to that objective, Hilbert arithmetic identifies information related to finite sets and series with quantum information referring to infinite ones, both appearing in three “hypostases”: correspondingly mathematical, physical and ontological, each of which is able to generate a relevant science and area of cognition. Scientific transcendentalism is a falsifiable counterpart of philosophical transcendentalism. The underlying concept of the totality can accordingly be interpreted both mathematically, as consistent completeness, and physically, as the universe defined not empirically or experimentally, but as that ultimate wholeness containing its externality within itself.
This essay examines the philosophical significance of $\Omega$-logic in Zermelo-Fraenkel set theory with choice (ZFC). The categorical duality between coalgebra and algebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The modal profile of $\Omega$-logical validity can then be countenanced within a coalgebraic logic, and $\Omega$-logical validity can be defined via deterministic automata. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic and modal profiles of $\Omega$-logical validity correspond to those of second-order logical consequence, $\Omega$-logical validity is genuinely logical. Second, the foregoing provides a modal account of the interpretation of mathematical vocabulary.
Relevance logic has become ontologically fertile. No longer is the idea of relevance restricted in its application to purely logical relations among propositions, for as Dunn has shown in his (1987), it is possible to extend the idea in such a way that we can distinguish also between relevant and irrelevant predications, as for example between “Reagan is tall” and “Reagan is such that Socrates is wise”. Dunn shows that we can exploit certain special properties of identity within the context of standard relevance logic in a way which allows us to discriminate further between relevant and irrelevant properties, as also between relevant and irrelevant relations. The idea yields a family of ontologically interesting results concerning the different ways in which attributes and objects may hang together. Because of certain notorious peculiarities of relevance logic, however, Dunn’s idea breaks down where the attempt is made to have it bear fruit in application to relations among entities which are of homogeneous type.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
Instead of the half-century old foundational feud between set theory and category theory, this paper argues that they are theories about two different complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
Multialgebras have been much studied in mathematics and in computer science. In 2016 Carnielli and Coniglio introduced a class of multialgebras called swap structures, as a semantic framework for dealing with several Logics of Formal Inconsistency that cannot be semantically characterized by a single finite matrix. In particular, these LFIs are not algebraizable by the standard tools of abstract algebraic logic. In this paper, the first steps towards a theory of non-deterministic algebraization of logics by swap structures are given. Specifically, a formal study of swap structures for LFIs is developed, by adapting concepts of universal algebra to multialgebras in a suitable way. A decomposition theorem similar to Birkhoff’s representation theorem is obtained for each class of swap structures. Moreover, when applied to the 3-valued algebraizable logics J3 and Ciore, their classes of algebraic models are retrieved, and the swap structures semantics become twist structures semantics. This fact, together with the existence of a functor from the category of Boolean algebras to the category of swap structures for each LFI, suggests that swap structures can be seen as non-deterministic twist structures. This opens new avenues for dealing with non-algebraizable logics by the more general methodology of multialgebraic semantics.
The paper investigates the understanding of quantum indistinguishability after quantum information, in comparison with the “classical” quantum mechanics based on the separable complex Hilbert space. The two oppositions, correspondingly “distinguishability / indistinguishability” and “classical / quantum”, available implicitly in the concept of quantum indistinguishability, can be interpreted as two “missing” bits of classical information, which are to be added after teleportation of quantum information in order to restore the initial state unambiguously. That new understanding of quantum indistinguishability is linked to the distinction of classical (Maxwell-Boltzmann) versus quantum (either Fermi-Dirac or Bose-Einstein) statistics. The latter can be generalized to classes of wave functions (“empty” qubits) and represented exhaustively in Hilbert arithmetic, therefore connectible to the foundations of mathematics, more precisely, to the interrelations of propositional logic and set theory sharing the structure of Boolean algebra and two anti-isometric copies of Peano arithmetic.
Despite the efforts undertaken to separate scientific reasoning and metaphysical considerations, and despite the rigor of the construction of mathematics, these are not, in their very foundations, independent of the modalities, of the laws of representation of the world. The OdC shows that logical Facts exist neither more nor less than the Facts of the world, which are Facts of Knowledge. Mathematical facts are representation facts. The primary objective of this article is to integrate the subject into mathematics as a mode of emergence of meaning, and then to evaluate its consequences.
A possible world is a junky world if and only if each thing in it is a proper part. The possibility of junky worlds contradicts the principle of general fusion. Bohn (2009) argues for the possibility of junky worlds; Watson (2010) suggests that Bohn's arguments are flawed. This paper shows that the arguments of both authors leave much to be desired. First, relying on the classical results of Cantor, Zermelo, Fraenkel, and von Neumann, this paper proves the possibility of junky worlds for certain weak set theories. Second, the paradox of Burali-Forti shows that according to the Zermelo-Fraenkel set theory ZF, junky worlds are possible. Finally, it is shown that set theories are not the only sources for designing plausible models of junky worlds: topology (and possibly other "algebraic" mathematical theories) may be used to construct models of junky worlds. In sum, junkyness is a relatively widespread feature among possible worlds.
Theories of sets such as Zermelo-Fraenkel set theory are usually presented as the combination of two distinct kinds of principles: logical and set-theoretic principles. The set-theoretic principles are imposed ‘on top’ of first-order logic. This is in agreement with a traditional view of logic as universally applicable and topic neutral. Such a view of logic has been rejected by the intuitionists, on the ground that quantification over infinite domains requires the use of intuitionistic rather than classical logic. In the following, I consider constructive set theories, which use intuitionistic rather than classical logic, and argue that they manifest a distinctive interdependence or entanglement between sets and logic. In fact, Martin-Löf type theory identifies fundamental logical and set-theoretic notions. Remarkably, one of the motivations for this identification is the thought that classical quantification over infinite domains is problematic, while intuitionistic quantification is not. The approach to quantification adopted in Martin-Löf’s type theory is subtly interconnected with its predicativity. I conclude by recalling key aspects of an approach to predicativity inspired by Poincaré, which focuses on the issue of correct quantification over infinite domains, and relating it back to Martin-Löf type theory.
In his classic book “The Foundations of Statistics” Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: any preference relation that satisfies certain arguably acceptable postulates determines a (finitely additive) probability distribution over the states and a utility assignment to the consequences, such that the preferences among acts are determined by their expected utilities. Additional problematic assumptions are however required in Savage's proofs. First, there is a Boolean algebra of events (sets of states) which determines the richness of the set of acts. The probabilities are assigned to members of this algebra. Savage's proof requires that this be a σ-algebra (i.e., closed under countable unions and intersections), which makes for an extremely rich preference relation. On Savage's view we should not require subjective probabilities to be σ-additive. He therefore finds the insistence on a σ-algebra peculiar and is unhappy with it. But he sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every consequence there is a constant act, which produces that consequence in every state. This assumption is known to be highly counterintuitive. The present work contains two mathematical results. The first, and the more difficult one, shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the more plausible and much weaker assumption that there are at least two non-equivalent constant acts.
The second result also employs a novel way of deriving utilities in Savage-style systems -- without appealing to von Neumann-Morgenstern lotteries. The paper discusses the notion of “idealized agent” that underlies Savage's approach, and argues that the simplified system, which is adequate for all the actual purposes for which the system is designed, involves a more realistic notion of an idealized agent.
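The ingredients (i)-(iv) of Savage's system described above can be illustrated with a toy computation. All states, consequences and numbers below are hypothetical; the sketch only shows how a probability on states and a utility on consequences rank acts by expected utility.

```python
# A minimal illustration (all values hypothetical) of Savage's setup:
# acts are functions from states to consequences, and a probability on
# states plus a utility on consequences induce a preference over acts.

states = ["rain", "shine"]
prob = {"rain": 0.3, "shine": 0.7}       # probability on the (finite) event algebra
utility = {"wet": -10, "dry": 0, "sunny_walk": 5}

# acts: functions from states to consequences (here, dicts)
take_umbrella = {"rain": "dry", "shine": "dry"}   # a constant act: "dry" in every state
leave_it      = {"rain": "wet", "shine": "sunny_walk"}

def expected_utility(act):
    return sum(prob[s] * utility[act[s]] for s in states)

def prefers(act1, act2):
    """The induced preference relation over acts."""
    return expected_utility(act1) > expected_utility(act2)

# leaving the umbrella has the higher expected utility here (0.5 vs 0.0)
assert prefers(leave_it, take_umbrella)
```

Note that `take_umbrella` happens to be a constant act in the sense of the abstract: it produces the same consequence in every state.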
The objective of this paper is to present a proof of a classical theorem on Boolean algebras and partial orders of current relevance in set theory, for example for applications of the model-construction method called "forcing" (with complete Boolean algebras or with partial orders). The theorem to be proved is the following: "Every partial order can be extended to a unique complete Boolean algebra (up to isomorphism)", where to extend means "to embed densely". The proof uses Dedekind cuts, following the text "Set Theory" by Jech, together with further ideas of the author of this article. In addition, some weak versions of the axiom of choice related to Boolean algebras are formulated, which are also of great importance for research in set theory and model theory, since they yield powerful model-construction techniques, such as the compactness theorem (which allows the construction of non-standard models, etc.) and the ultrafilter theorem, which allows the construction of ultraproducts (these can be used to investigate problems of large cardinals, etc.). Some references to open problems on the subject are presented.
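The cut construction behind the theorem above can be illustrated by brute force on a tiny finite poset. The sketch below computes the Dedekind-MacNeille completion (the family of sets A with A = lower(upper(A))); this is only a finite analogue for illustration, since the forcing completion proper uses the regular-open algebra of a separative order, and all names here are ours.

```python
# Brute-force Dedekind-MacNeille (cut) completion of a small finite poset.
from itertools import combinations

def upper(A, P, leq):
    """Elements of P above every element of A."""
    return {x for x in P if all(leq(a, x) for a in A)}

def lower(A, P, leq):
    """Elements of P below every element of A."""
    return {x for x in P if all(leq(x, a) for a in A)}

def cuts(P, leq):
    """All subsets A of P with A == lower(upper(A)): the completion."""
    subsets = [set(c) for r in range(len(P) + 1)
               for c in combinations(P, r)]
    return [A for A in subsets if lower(upper(A, P, leq), P, leq) == A]

# A two-element antichain {a, b}: the completion adds a top and a bottom,
# yielding the four-element complete Boolean algebra.
P = {"a", "b"}
leq = lambda x, y: x == y
completion = cuts(P, leq)
assert len(completion) == 4
```

By contrast, a two-element chain is already a complete lattice, so its completion has just the two cuts corresponding to the original elements.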
The concepts of choice, negation, and infinity are considered jointly. The link is the quantity of information interpreted as the quantity of choices measured in units of elementary choice: a bit is an elementary choice between two equally probable alternatives. “Negation” supposes a choice between it and confirmation. Thus the quantity of information can also be interpreted as a quantity of negations. The disjunctive choice between confirmation and negation as to infinity can be chosen or not in turn: this corresponds to the set-theoretic or the intuitionist approach to the foundation of mathematics, and to Peano or Heyting arithmetic. Quantum mechanics can be reformulated in terms of information by introducing the concept and quantity of quantum information. A qubit can be equivalently interpreted as that generalization of “bit” where the choice is among an infinite set or series of alternatives. The complex Hilbert space can be represented as both a series of qubits and a value of quantum information. The complex Hilbert space is that generalization of Peano arithmetic where any natural number is substituted by a qubit. “Negation”, “choice”, and “infinity” can be inherently linked to each other both in the foundation of mathematics and in quantum mechanics by the mediation of “information” and “quantum information”.
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
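The definitions in the abstract above admit a direct finite computation. A minimal sketch, assuming nothing beyond the stated definitions of a "dit" (an ordered pair of elements in distinct blocks) and of logical entropy as the normalized counting measure of the dit set:

```python
# Logical entropy of a partition on a finite set:
#   h(pi) = |dit(pi)| / |U|^2  =  1 - sum over blocks B of (|B|/|U|)^2

def distinctions(partition, universe):
    """The dits: ordered pairs (u, v) whose elements lie in distinct blocks."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    return {(u, v) for u in universe for v in universe
            if block_of[u] != block_of[v]}

def logical_entropy(partition, universe):
    """Normalized counting measure of the set of distinctions."""
    n = len(universe)
    return len(distinctions(partition, universe)) / n**2

U = {1, 2, 3, 4}
indiscrete = [{1, 2, 3, 4}]        # no distinctions: entropy 0
discrete   = [{1}, {2}, {3}, {4}]  # every off-diagonal pair is a dit
halves     = [{1, 2}, {3, 4}]

assert logical_entropy(indiscrete, U) == 0.0
assert logical_entropy(discrete, U) == 12 / 16   # 1 - 4*(1/4)^2
assert logical_entropy(halves, U) == 8 / 16      # 1 - 2*(1/2)^2
```

The two formulas agree because the dits are exactly the ordered pairs not lying within a single block, so |dit(pi)| = |U|^2 - sum of |B|^2.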
According to Cantor (Mathematische Annalen 21:545–586, 1883; Cantor’s letter to Dedekind, 1899) a set is any multitude which can be thought of as one (“jedes Viele, welches sich als Eines denken läßt”) without contradiction—a consistent multitude. Other multitudes are inconsistent or paradoxical. Set-theoretical paradoxes have a common root: a lack of understanding of why some multitudes are not sets. Why can some multitudes of objects of thought not themselves be objects of thought? Moreover, it is a logical truth that such multitudes do exist. However, we do not understand this logical truth as well as we understand, for example, the logical truth $\forall x \, x = x$. In this paper we formulate a logical truth which we call the productivity principle. Russell (Proc Lond Math Soc 4(2):29–53, 1906) was the first to formulate this principle, but in a restricted form and with a different purpose. The principle explicates a logical mechanism that lies behind paradoxical multitudes, and is as understandable as any simple logical truth. However, it does not explain the concept of set. It only sets logical bounds on the concept within the framework of the classical two-valued $\in$-language. The principle behaves as a logical regulator of any theory we formulate to explain and describe sets. It provides tools to identify paradoxical classes inside the theory. We show how the known paradoxical classes follow from the productivity principle and how the principle gives us a uniform way to generate new paradoxical classes. In the case of ZFC set theory the productivity principle shows that the limitation of size principles are of a restrictive nature and that they do not explain which classes are sets. The productivity principle, as a logical regulator, can have a definite heuristic role in the development of a consistent set theory. We sketch such a theory—the cumulative cardinal theory of sets.
The theory is based on the idea of cardinality of collecting objects into sets. Its development is guided by means of the productivity principle in such a way that its consistency seems plausible. Moreover, the theory inherits good properties from the cardinal conception and from the cumulative conception of sets. Because of the cardinality principle it can easily justify the replacement axiom, and because of the cumulative property it can easily justify the power set axiom and the union axiom. It would be possible to prove that the cumulative cardinal theory of sets is equivalent to the Morse–Kelley set theory. In this way we provide a natural and plausibly consistent axiomatization for the Morse–Kelley set theory.
Traditionally, pronouns are treated as ambiguous between bound and demonstrative uses. Bound uses are non-referential and function as bound variables, and demonstrative uses are referential and take as a semantic value their referent, an object picked out jointly by linguistic meaning and a further cue—an accompanying demonstration, an appropriate and adequately transparent speaker’s intention, or both. In this paper, we challenge tradition and argue that both demonstrative and bound pronouns are dependent on, and co-vary with, antecedent expressions. Moreover, the semantic value of a pronoun is never determined, even partly, by extra-linguistic cues; it is fixed, invariably and unambiguously, by features of its context of use governed entirely by linguistic rules. We exploit the mechanisms of Centering and Coherence theories to develop a precise and general meta-semantics for pronouns, according to which the semantic value of a pronoun is determined by what is at the center of attention in a coherent discourse. Since the notions of attention and coherence are, we argue, governed by linguistic rules, we can give a uniform analysis of pronoun resolution that covers bound, demonstrative, and even discourse bound readings. Just as the semantic value of the first-person pronoun ‘I’ is conventionally set by a particular feature of its context of use—namely, the speaker—so too, we will argue, the semantic values of other pronouns, including ‘he’, are conventionally set by particular features of the context of use.
Any logic is represented as a certain collection of well-orderings admitting or not some algebraic structure such as a generalized lattice. Then universal logic should refer to the class of all subclasses of all well-orderings. One can construct a mapping between Hilbert space and the class of all logics. Thus there exists a correspondence between universal logic and the world if the latter is considered a collection of wave functions, as which the points in Hilbert space can be interpreted. The correspondence can be further extended to the foundation of mathematics by set theory and arithmetic, and thus to all mathematics.
The notion of equality between two observables will play many important roles in foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum theory to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
Hyperboolean algebras are Boolean algebras with operators, constructed as algebras of complexes (or power structures) of Boolean algebras. They provide an algebraic semantics for a modal logic (called here a hyperboolean modal logic) with a Kripke semantics accordingly based on frames in which the worlds are elements of Boolean algebras and the relations correspond to the Boolean operations. We introduce the hyperboolean modal logic, give a complete axiomatization of it, and show that it lacks the finite model property. The method of axiomatization hinges upon the fact that a "difference" operator is definable in hyperboolean algebras, and makes use of additional non-Hilbert-style rules. Finally, we discuss a number of open questions and directions for further research.
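The complex (power-structure) construction described in the abstract above can be sketched concretely: the Boolean operations are lifted pointwise to sets of elements of a Boolean algebra. Below is a toy illustration of our own over the two-element Boolean algebra, not the paper's axiomatization.

```python
# Pointwise lifting of Boolean operations to complexes (subsets) of the
# two-element Boolean algebra {0, 1}.

def meet(X, Y):
    """Lifted meet: all x AND y for x in X, y in Y."""
    return {x & y for x in X for y in Y}

def join(X, Y):
    """Lifted join: all x OR y for x in X, y in Y."""
    return {x | y for x in X for y in Y}

def compl(X):
    """Lifted complement: all NOT x for x in X."""
    return {1 - x for x in X}

# On singletons the lifted operations agree with the underlying algebra:
assert meet({1}, {0}) == {0}

# On non-singletons they behave non-classically: "X meet not-X" need not
# collapse to the bottom element.
X = {0, 1}
assert meet(X, compl(X)) == {0, 1}
```

This non-classical behavior on non-singleton complexes is one source of the extra expressive power (and of the failure of the finite model property) in the resulting modal logic.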
The program put forward in von Wright's last works defines deontic logic as "a study of conditions which must be satisfied in rational norm-giving activity" and thus introduces the perspective of logical pragmatics. In this paper a formal explication of von Wright's program is proposed within the framework of the set-theoretic approach and extended to a two-sets model which allows for the separate treatment of obligation-norms and permission-norms. Three translation functions connecting the language of deontic logic with the language of the extended set-theoretic approach are introduced, and used in proving the correspondence between the deontic theorems, on one side, and the perfection properties of the norm-set and the "counter-set", on the other side. In this way the possibility of reinterpreting standard deontic logic as the theory of perfection properties that ought to be achieved in norm-giving activity has been formally proved. The extended set-theoretic approach is applied to the problem of the rationality of principles of completion of normative systems. The paper concludes with a plea for the turn to logical pragmatics envisaged in the late phase of von Wright's work in deontic logic.
One of the most expected properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of the deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics are uniquely characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
In this paper I apply the concept of _inter-Model Inconsistency in Set Theory_ (MIST), introduced by Carolin Antos (this volume), to select positions in the current universe-multiverse debate in philosophy of set theory: I reinterpret H. Woodin’s _Ultimate L_, J. D. Hamkins’ multiverse, S.-D. Friedman’s hyperuniverse and the algebraic multiverse as normative strategies to deal with the situation of de facto inconsistency toleration in set theory as described by MIST. In particular, my aim is to situate these positions on the spectrum from inconsistency avoidance to inconsistency toleration. By doing so, I connect a debate in philosophy of set theory with a debate in philosophy of science about the role of inconsistencies in the natural sciences. While there are important differences, like the lack of threatening explosive inferences, I show how specific philosophical positions in the philosophy of set theory can be interpreted as reactions to a state of inconsistency similar to analogous reactions studied in the philosophy of science literature. My hope is that this transfer operation from philosophy of science to mathematics sheds a new light on the current discussion in philosophy of set theory; and that it can help to bring philosophy of mathematics and philosophy of science closer together.
In the contemporary philosophy of set theory, discussion of new axioms that purport to resolve independence necessitates an explanation of how they come to be justified. Ordinarily, justification is divided into two broad kinds: intrinsic justification relates to how `intuitively plausible' an axiom is, whereas extrinsic justification supports an axiom by identifying certain `desirable' consequences. This paper puts pressure on how this distinction is formulated and construed. In particular, we argue that the distinction as often presented is neither well-demarcated nor sufficiently precise. Instead, we suggest that the process of justification in set theory should not be thought of as neatly divisible in this way, but should rather be understood as a conceptually indivisible notion linked to the goal of explanation.
DEFINING OUR TERMS A “paradox" is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory" is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis" is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis" is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since `appears', `believes' and `knows' all make elliptical reference to a participant, it is clear that `paradox', `indirect proof' and `direct proof' are all participant-relative. PARTICIPANT RELATIVITY In normal mathematical writing the participant is presumed to be “the community of mathematicians" or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof. WHAT IS A PARADOX’S SOLUTION? Of the above four sorts of argumentation only the paradox invites “solution" or “resolution", and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning" of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true.
Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy", as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant's state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time.
The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell's Principles of mathematics, [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell's article on type theory as well as Ernst Zermelo's two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal", “the largest ordinal", the well-ordering principle, “the well-ordering of the continuum", denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion.
RECOMMENDATION This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading.
AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and logic”[249] that covers much of the same ground in much more detail. https://www.academia.edu/14089432/Argumentations_and_Logic .
In this paper a solution of Whitehead’s problem is presented: Starting with a purely mereological system of regions, a topological space is constructed such that the class of regions is isomorphic to the Boolean lattice of regular open sets of that space. This construction may be considered as a generalized completion in analogy to the well-known Dedekind completion of the rational numbers yielding the real numbers. The argument of the paper relies on the theories of continuous lattices and “pointless” topology.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
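The “two-draw” reading of logical entropy can be made concrete: for a partition with block probabilities p_i, the logical entropy is h(π) = 1 − Σ p_i², the probability that two independent equiprobable draws from the underlying set fall in different blocks, i.e. hit a distinction. A minimal sketch (the example partition and function names are illustrative, not taken from the paper):

```python
from itertools import product

def logical_entropy(partition):
    """h(pi) = 1 - sum of squared block probabilities: the probability
    that two independent equiprobable draws land in different blocks."""
    n = sum(len(b) for b in partition)
    return 1 - sum((len(b) / n) ** 2 for b in partition)

# Partition of {0,...,5} into blocks of sizes 3, 2, 1.
pi = [{0, 1, 2}, {3, 4}, {5}]
h = logical_entropy(pi)

# Cross-check by direct counting of distinctions: ordered pairs of
# elements lying in different blocks, normalized by n^2.
elems = [x for b in pi for x in b]
block_of = {x: i for i, b in enumerate(pi) for x in b}
distinctions = sum(1 for x, y in product(elems, elems)
                   if block_of[x] != block_of[y])
assert abs(h - distinctions / len(elems) ** 2) < 1e-12
```

The counting cross-check is exactly the “normalized counting measure on distinctions” formulation.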
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual Von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
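The main result stated above can be checked numerically: quantum logical entropy is h(ρ) = 1 − tr(ρ²), and for a projective measurement in the observable's eigenbasis the increase equals the sum of the squared moduli of the off-diagonal entries that the measurement zeroes. A sketch with an arbitrarily chosen real pure state (the amplitudes are illustrative only):

```python
# Pure state amplitudes in the measurement (eigen)basis.
p = [0.5, 0.3, 0.2]
amps = [pi ** 0.5 for pi in p]
n = len(amps)
rho = [[amps[i] * amps[j] for j in range(n)] for i in range(n)]

def q_logical_entropy(m):
    """h(rho) = 1 - tr(rho^2): the probability that two independent
    measurements of the same state give different results."""
    k = len(m)
    return 1 - sum(m[i][j] * m[j][i] for i in range(k) for j in range(k))

# A non-degenerate projective measurement in this basis zeroes
# ("decoheres") the off-diagonal entries of the density matrix.
rho_hat = [[rho[i][j] if i == j else 0.0 for j in range(n)] for i in range(n)]

increase = q_logical_entropy(rho_hat) - q_logical_entropy(rho)
off_diag_sq = sum(rho[i][j] ** 2
                  for i in range(n) for j in range(n) if i != j)
assert abs(increase - off_diag_sq) < 1e-12
```

For this pure state h(ρ) is 0, and the post-measurement entropy 1 − Σ p_i² is recovered entirely from the zeroed coherences, as the theorem predicts.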
The identity theory’s rise to prominence in analytic philosophy of mind during the late 1950s and early 1960s is widely seen as a watershed in the development of physicalism, in the sense that whereas logical behaviourism proposed analytic and a priori ascertainable identities between the meanings of mental and physical-behavioural concepts, the identity theory proposed synthetic and a posteriori knowable identities between mental and physical properties. While this watershed does exist, the standard account of it is misleading, as it is founded in erroneous intensional misreadings of the logical positivists’—especially Carnap’s—extensional notions of translation and meaning, as well as misinterpretations of the positivists’ shift from the strong thesis of translation-physicalism to the weaker and more liberal notion of reduction-physicalism that occurred in the Unity of Science programme. After setting the historical record straight, the essay traces the first truly modern identity theory to Schlick’s pre-positivist views circa 1920 and goes on to explore its further development in Feigl, arguing that the fundamental difference between the Schlick-Feigl identity theory and the more familiar and influential Place-Smart-Armstrong identity theory has resurfaced in the deep and seemingly unbridgeable gulf in contemporary philosophy of consciousness between inflationary mentalism and deflationary physicalism.
Set-theoretic and category-theoretic foundations represent different perspectives on mathematical subject matter. In particular, category-theoretic language focusses on properties that can be determined up to isomorphism within a category, whereas set theory admits of properties determined by the internal structure of the membership relation. Various objections have been raised against this aspect of set theory in the category-theoretic literature. In this article, we advocate a methodological pluralism concerning the two foundational languages, and provide a theory that fruitfully interrelates a `structural' perspective to a set-theoretic one. We present a set-theoretic system that is able to talk about structures more naturally, and argue that it provides an important perspective on plausibly structural properties such as cardinality. We conclude that the language of set theory can provide useful information about the notion of mathematical structure.
Abstract. The aim of this paper is to present a topological method for constructing discretizations (tessellations) of conceptual spaces. The method works for a class of topological spaces that the Russian mathematician Pavel Alexandroff defined more than 80 years ago. Alexandroff spaces, as they are called today, have many interesting properties that distinguish them from other topological spaces. In particular, they exhibit a 1-1 correspondence between their specialization orders and their topological structures. Recently, a special type of Alexandroff spaces was used by Ian Rumfitt to elucidate the logic of vague concepts in a new way. According to his approach, conceptual spaces such as the color spectrum give rise to classical systems of concepts that have the structure of atomic Boolean algebras. More precisely, concepts are represented as regular open regions of an underlying conceptual space endowed with a topological structure. Something is subsumed under a concept iff it is represented by an element of the conceptual space that is maximally close to the prototypical element p that defines that concept. This topological representation of concepts comes along with a representation of the familiar logical connectives of Aristotelian syllogistics in terms of natural set-theoretical operations that characterize regular open interpretations of classical Boolean propositional logic. In the last 20 years, conceptual spaces have become a popular tool for dealing with a variety of problems in the fields of cognitive psychology, artificial intelligence, linguistics and philosophy, mainly due to the work of Peter Gärdenfors and his collaborators. By using prototypes and metrics of similarity spaces, one obtains geometrical discretizations of conceptual spaces by so-called Voronoi tessellations. These tessellations are extensionally equivalent to topological tessellations that can be constructed for Alexandroff spaces.
Thereby, Rumfitt’s and Gärdenfors’s constructions turn out to be special cases of an approach that works for a more general class of spaces, namely, for weakly scattered Alexandroff spaces. This class of spaces provides a convenient framework for conceptual spaces as used in epistemology and related disciplines in general. Alexandroff spaces are useful for elucidating problems related to the logic of vague concepts; in particular, they offer a solution of the Sorites paradox (Rumfitt). Further, they provide a semantics for the logic of clearness (Bobzien) that overcomes certain problems of the concept of higher-order vagueness. Moreover, these spaces help find a natural place for classical syllogistics in the framework of conceptual spaces. The crucial role of order theory for Alexandroff spaces can be used to refine the all-or-nothing distinction between prototypical and non-prototypical stimuli in favor of a more fine-grained gradual distinction between more-or-less prototypical elements of conceptual spaces. The greater conceptual flexibility of the topological approach helps avoid some inherent inadequacies of the geometrical approach, for instance, the so-called “thickness problem” (Douven et al.) and problems of selecting a unique metric for similarity spaces. Finally, it is shown that only the Alexandroff account can deal with an issue that is gaining more and more importance for the theory of conceptual spaces, namely, the role that digital conceptual spaces play in the area of artificial intelligence, computer science and related disciplines. Keywords: Conceptual Spaces, Polar Spaces, Alexandroff Spaces, Prototypes, Topological Tessellations, Voronoi Tessellations, Digital Topology.
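The Voronoi-tessellation idea described in this abstract reduces to nearest-prototype classification: a stimulus falls under the concept whose prototype is closest in the similarity metric. A minimal sketch, assuming a hypothetical 2-D conceptual space with Euclidean metric (the concept names and prototype coordinates are invented for illustration):

```python
import math

# Hypothetical prototypes in a 2-D similarity space.
prototypes = {"warm": (0.9, 0.2), "cool": (0.1, 0.8), "neutral": (0.5, 0.5)}

def classify(point):
    """Voronoi-style categorization: assign the point to the concept
    whose prototype is nearest in the Euclidean metric."""
    return min(prototypes, key=lambda c: math.dist(point, prototypes[c]))

assert classify((0.85, 0.25)) == "warm"
assert classify((0.15, 0.75)) == "cool"
```

The cells induced by `classify` are exactly the Voronoi regions of the prototypes; the paper's point is that such tessellations can also be obtained topologically, without committing to a particular metric.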
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
The algebra of transactions as fundamental measurements is constructed on the basis of the analysis of their properties and represents an expansion of the Boolean algebra. The notion of the generalized economic measurements of the economic “quantity” and “quality” of objects of transactions is introduced. It has been shown that the vector space of economic states constructed on the basis of these measurements is relativistic. The laws of kinematics of economic objects in this space have been analyzed and the stages of constructing the dynamics have been formulated. In particular, the “principle of maximum benefit”, which represents an economic analog of the principle of least action in the classical mechanics, and the principle of relativity as the principle of equality of all possible consumer preferences have been formulated. The notion of economic interval between two economic objects invariant to the selection of the vector of consumer preferences has been introduced. Methods of experimental verification of the principle of relativity in the space of economic states have been proposed.
The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
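The standard truth functions of Gödel logic over [0,1], which underlie the system discussed above, can be sketched directly (a minimal illustration; the function names are ours):

```python
def g_and(a, b): return min(a, b)
def g_or(a, b):  return max(a, b)
def g_imp(a, b): return 1.0 if a <= b else b   # Gödel implication
def g_not(a):    return g_imp(a, 0.0)           # negation as a -> 0

# Excluded middle can fail at intermediate values...
a = 0.5
assert g_or(a, g_not(a)) == 0.5

# ...while prelinearity (A -> B) v (B -> A) is always designated (= 1),
# a characteristic validity of Gödel logics.
for a in (0.0, 0.3, 0.7, 1.0):
    for b in (0.0, 0.3, 0.7, 1.0):
        assert g_or(g_imp(a, b), g_imp(b, a)) == 1.0
```

Prelinearity holds because for any pair of truth values one of the two implications evaluates to 1, which is what makes the [0,1]-valued semantics linearly ordered.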
An arithmetic theory of oppositions is devised by comparing expressions, Boolean bitstrings, and integers. This leads to a set of correspondences between three domains of investigation, namely: logic, geometry, and arithmetic. The structural properties of each area are investigated in turn, before justifying the procedure as a whole. To finish, I show how this helps to improve the logical calculus of oppositions, through the consideration of corresponding operations between integers.
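The bitstring-integer correspondence invoked above can be made concrete with the standard definitions from logical geometry; the following sketch (the encoding of the square of opposition is illustrative, not taken from the paper) classifies Aristotelian relations via bitwise operations on integers:

```python
def relation(b1, b2, width):
    """Classify the Aristotelian relation between two bitstring-coded
    propositions, using the standard logical-geometry definitions."""
    full = (1 << width) - 1
    both_false = (b1 & b2) == 0        # cannot be true together
    both_true = (b1 | b2) == full      # cannot be false together
    if both_false and both_true:
        return "contradictory"
    if both_false:
        return "contrary"
    if both_true:
        return "subcontrary"
    if (b1 & b2) == b1 or (b1 & b2) == b2:
        return "subalternation"        # one entails the other
    return "unconnected"

# A 3-bit coding of the classical square of opposition.
A, E, I, O = 0b100, 0b001, 0b110, 0b011
assert relation(A, O, 3) == "contradictory"
assert relation(A, E, 3) == "contrary"
assert relation(I, O, 3) == "subcontrary"
assert relation(A, I, 3) == "subalternation"
```

Here contradiction is exactly bitwise complementation, which is the kind of arithmetic reading of opposition the paper exploits.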
Multiverse Views in set theory advocate the claim that there are many universes of sets, none of which is canonical, and have risen to prominence over the last few years. One motivating factor is that such positions are often argued to account very elegantly for technical practice. While there is much discussion of the technical aspects of these views, in this paper I analyse a radical form of Multiversism on largely philosophical grounds. Of particular importance will be an account of reference on the Multiversist conception, and the relativism that it implies. I argue that analysis of this central issue in the Philosophy of Mathematics indicates that Radical Multiversism must be algebraic, and cannot be viewed as an attempt to provide an account of reference without a softening of the position.
Prior Analytics by the Greek philosopher Aristotle (384 – 322 BCE) and Laws of Thought by the English mathematician George Boole (1815 – 1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology.
Logic is about reasoning, or so the story goes. This thesis looks at the concept of logic, what it is, and what claims of correctness of logics amount to. The concept of logic is not a settled matter, and has not been throughout its history as a notion. Tools from conceptual analysis aid in this historical venture. Once the unsettledness of logic is established we see the repercussions in current debates in the philosophy of logic. Much of the battle over the ‘one true logic’ is conceptually talking past each other. The theory of logics-as-formalizations is presented as a conceptually open theory of logic which is Carnapian in flavour and grounding. Rudolf Carnap’s notions surrounding ‘external’ and ‘pseudo-questions’ about linguistic frameworks apply to formalizations, thus logics, as well. An account of what formalizations are, a more structured sub-set of modelling, is given to ground the claim that logics are formalizations. Finally, a novel account of correctness, the COFE framework, is developed which allows the notions of logical monism, pluralism and nihilism to be more precisely formulated than they currently are in the discourse.
The paper proposes a new formal approach to vagueness and vague sets taking inspirations from Pawlak’s rough set theory. Following a brief introduction to the problem of vagueness, an approach to conceptualization and representation of vague knowledge is presented from a number of different perspectives: those of logic, set theory, algebra, and computer science. The central notion of the vague set, in relation to the rough set, is defined as a family of sets approximated by the so-called lower and upper limits. The family is simultaneously considered as a family of all denotations of sharp terms representing a suitable vague term, from the agent’s point of view. Some algebraic operations on vague sets and their properties are defined. Some important conditions concerning the membership relation for vague sets, in connection to Blizard’s multisets and Zadeh’s fuzzy sets, are established as well. A classical outlook on a logic of vague sentences (vague logic) based on vague sets is also discussed.
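The Pawlak-style lower and upper approximations that inspire this approach have a simple computational form: the lower approximation collects the equivalence classes wholly inside the target set, the upper approximation those that merely intersect it. A minimal sketch (the universe and target set are invented for illustration):

```python
def approximations(classes, X):
    """Rough-set lower/upper approximation of X with respect to a
    partition of the universe into indiscernibility classes."""
    lower = {x for c in classes if c <= X for x in c}   # classes inside X
    upper = {x for c in classes if c & X for x in c}    # classes meeting X
    return lower, upper

# Universe {1,...,6} partitioned by an indiscernibility relation.
classes = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}                     # a "vague" target set
lower, upper = approximations(classes, X)
assert lower == {1, 2}            # definitely in X
assert upper == {1, 2, 3, 4}      # possibly in X
```

The boundary region upper − lower = {3, 4} is where the vagueness lives; the paper's vague sets generalize exactly this lower/upper-limit picture.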
Philosophers of science since Nagel have been interested in the links between intertheoretic reduction and explanation, understanding and other forms of epistemic progress. Although intertheoretic reduction is widely agreed to occur in pure mathematics as well as empirical science, the relationship between reduction and explanation in the mathematical setting has rarely been investigated in a similarly serious way. This paper examines an important particular case: the reduction of arithmetic to set theory. I claim that the reduction is unexplanatory. In defense of this claim, I offer evidence from mathematical practice, and I respond to contrary suggestions due to Steinhart, Maddy, Kitcher and Quine. I then show how, even if set-theoretic reductions are generally not explanatory, set theory can nevertheless serve as a legitimate foundation for mathematics. Finally, some implications of my thesis for philosophy of mathematics and philosophy of science are discussed. In particular, I suggest that some reductions in mathematics are probably explanatory, and I propose that differing standards of theory acceptance might account for the apparent lack of unexplanatory reductions in the empirical sciences.
The latest draft (posted 05/14/22) of this short, concise work of proof, theory, and metatheory provides summary meta-proofs and verification of the work and results presented in the Theory and Metatheory of Atemporal Primacy and Riemann, Metatheory, and Proof. In this version, several new and revised definitions of terms were added to subsection SS.1; and many corrected equations, theorems, metatheorems, proofs, and explanations are included in the main text. The body of the text is approximately 18 pages, with 4 sections: sect. 1, an Introduction (with a 108 page listing of key terms & definitions); sect. 2, the Results; sect. 3, Discussion (commentary & predictions); and sect. 4, Works Cited. As much as possible, the style is intended for readability and understanding by very bright children (with some interest & knowledge of maths, etc.) and very interested nonprofessionals. The results of this project also enable upgrades of number theory, set theory, proof theory, metamathematics, the foundations of science, and quantum mechanics theory (etc.).
Mind, according to cognitive neuroscience, is a set of brain functions. But, unlike sets, our minds are cohesive. Moreover, unlike the structureless elements of sets, the contents of our minds are structured. Mutual relations between the mental contents endow the mind with its structure. Here we characterize the structural essence and the logical form of the mind by focusing on thinking. Examination of the relations between concepts, propositions, and syllogisms involved in thinking revealed the reflexive graph structure of the conceptual mind. Objective logic of the conceptual mind is calculated from its structure. Noteworthy features of the logic of the conceptual mind are: degrees of truth, varieties of negation, admission of contradiction, and the failure of one of de Morgan's laws. Furthermore, cohesion of the conceptual mind follows from its reflexive graph structure. Our characterization of the structure and logic of mind constitutes a substantial refinement of the contemporary cognitive neuroscientific conceptualization of the mind as a set.
In the paper we will employ set theory to study the formal aspects of quantum mechanics without explicitly making use of space-time. It is demonstrated that von Neumann and Zermelo numeral sets, previously effectively used in the explanation of Hardy’s paradox, follow a Heisenberg quantum form. Here monadic union plays the role of time derivative. The logical counterpart of monadic union plays the part of the Hamiltonian in the commutator. The use of numerals and monadic union in the classical probability resolution of Hardy’s paradox [1] is supported with the present derivation of a commutator for sets.
This book concerns the foundations of epistemic modality. I examine the nature of epistemic modality, when the modal operator is interpreted as concerning both apriority and conceivability, as well as states of knowledge and belief. The book demonstrates how epistemic modality relates to the computational theory of mind; metaphysical modality; the types of mathematical modality; to the epistemic status of large cardinal axioms, undecidable propositions, and abstraction principles in the philosophy of mathematics; to the modal profile of rational intuition; and to the types of intention, when the latter is interpreted as a modal mental state. Chapter 2 argues for a novel type of expressivism based on the duality between the categories of coalgebras and algebras, and argues that the duality permits of the reconciliation between modal cognitivism and modal expressivism. I also develop a novel topic-sensitive truthmaker semantics for dynamic epistemic logic, and develop a novel dynamic epistemic two-dimensional hyperintensional semantics. Chapter 3 provides an abstraction principle for epistemic intensions. Chapter 4 advances a topic-sensitive two-dimensional truthmaker semantics, and provides three novel interpretations of the framework along with the epistemic and metasemantic. Chapter 5 applies the fixed points of the modal μ-calculus in order to account for the iteration of epistemic states, by contrast to availing of modal axiom 4 (i.e. the KK principle). Chapter 6 advances a solution to the Julius Caesar problem based on Fine's `criterial' identity conditions which incorporate conditions on essentiality and grounding. Chapter 7 provides a ground-theoretic regimentation of the proposals in the metaphysics of consciousness and examines its bearing on the two-dimensional conceivability argument against physicalism.
The topic-sensitive epistemic two-dimensional truthmaker semantics developed in chapter 4 is availed of in order for epistemic states to be a guide to metaphysical states in the hyperintensional setting. Chapters 8-12 provide cases demonstrating how the two-dimensional intensions of epistemic two-dimensional semantics solve the access problem in the epistemology of mathematics. Chapter 8 examines the modal commitments of abstractionism, in particular necessitism, and epistemic modality and the epistemology of abstraction. Chapter 9 examines the modal profile of Ω-logic in set theory. Chapter 10 examines the interaction between topic-sensitive epistemic two-dimensional truthmaker semantics, the axioms of epistemic set theory, large cardinal axioms, the Epistemic Church-Turing Thesis, the modal axioms governing the modal profile of Ω-logic, Orey sentences such as the Generalized Continuum Hypothesis, and absolute decidability. Chapter 11 avails of modal coalgebraic automata to interpret the defining properties of indefinite extensibility, and avails of epistemic two-dimensional semantics in order to account for the interaction of the interpretational and objective modalities thereof. Chapter 12 provides a modal logic for rational intuition and provides a hyperintensional semantics. Chapter 13 examines modal responses to the alethic paradoxes. Chapter 14 examines, finally, the modal semantics for the different types of intention and the relation of the latter to evidential decision theory. The multi-hyperintensional, topic-sensitive epistemic two-dimensional truthmaker semantics developed in chapters 2 and 4 is applied in chapters 7, 8, 10, 11, 12, and 14.
The incompleteness of set theory ZFC leads one to look for natural nonconservative extensions of ZFC in which one can prove statements independent of ZFC which appear to be “true”. One approach has been to add large cardinal axioms. Or, one can investigate second-order expansions like Kelley-Morse class theory KM, or Tarski-Grothendieck set theory TG, a nonconservative extension of ZFC obtained from other axiomatic set theories by the inclusion of Tarski’s axiom, which implies the existence of inaccessible cardinals. See also the related set theory with a filter quantifier, ZF(aa). In this paper we deal with a set theory NC#∞#, based on bivalent hyper-infinitary logic with a restricted Modus Ponens rule. Nonconservative extensions of the canonical internal set theories IST and HST are proposed.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism to reason in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but also perhaps welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require the practical knowledge of how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both the mathematical foundation and the practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing to not only learn about, but also do something with, many-valued logics.
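As a minimal illustration of the truth-functionality mentioned in the abstract above—not the book's own formalism—a three-valued logic in the strong Kleene style can be sketched in a few lines, taking 0, 1/2 and 1 as truth values (the names and encoding here are our own choices):

```python
from fractions import Fraction

# Strong Kleene three-valued connectives over {0, 1/2, 1}.
# Illustrative sketch only; many-valued logics admit many other
# choices of connectives (Lukasiewicz, Goedel, product, ...).
HALF = Fraction(1, 2)

def neg(a):
    # Negation: 1 - a
    return 1 - a

def conj(a, b):
    # Conjunction: minimum of the two values
    return min(a, b)

def disj(a, b):
    # Disjunction: maximum of the two values
    return max(a, b)

# Truth-functionality: the value of a compound formula depends
# only on the values of its immediate parts.
assert conj(HALF, 1) == HALF
# The law of excluded middle can take the intermediate value:
assert disj(HALF, neg(HALF)) == HALF
```

Restricted to the values 0 and 1, these connectives coincide with the classical ones; the intermediate value 1/2 is where the departure from two-valued logic shows up.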
We present an algebraic account of the Tongan kinship terminology (TKT) that provides an insightful journey into the fabric of Tongan culture. We begin with the ethnographic account of a social event. The account provides us with the activities of that day and the centrality of kin relations in the event, but it does not inform us of the conceptual system that the participants bring with them. Rather, it is a slice in time of an ongoing dynamic process that links behavior with a conceptual system of kin relations and vice versa. To understand this interplay, we need an account of the underlying conceptual system that is being activated during the event. Thus, we introduce a formal, algebraically based account of TKT. This account brings to the fore the underlying logic of TKT and allows us to distinguish between features of the kinship system that arise from the logic of TKT as a generative structure and features that must have arisen through cultural intervention.