This paper investigates a generalization of Boolean algebras which I call agglomerative algebras. It also outlines two conceptions of propositions according to which they form an agglomerative algebra but not a Boolean algebra with respect to conjunction and negation.
Hyperboolean algebras are Boolean algebras with operators, constructed as algebras of complexes (or, power structures) of Boolean algebras. They provide an algebraic semantics for a modal logic (called here a {\em hyperboolean modal logic}) with a Kripke semantics accordingly based on frames in which the worlds are elements of Boolean algebras and the relations correspond to the Boolean operations. We introduce the hyperboolean modal logic, give a complete axiomatization of it, and show that it lacks the finite model property. The method of axiomatization hinges upon the fact that a "difference" operator is definable in hyperboolean algebras, and makes use of additional non-Hilbert-style rules. Finally, we discuss a number of open questions and directions for further research.
The aim of this paper is to show that every topological space gives rise to a wealth of topological models of the modal logic S4.1. The construction of these models is based on the fact that every space defines a Boolean closure algebra (to be called a McKinsey algebra) that neatly reflects the structure of the modal system S4.1. It is shown that the class of topological models based on McKinsey algebras contains a canonical model that can be used to prove a completeness theorem for S4.1. Further, it is shown that the McKinsey algebra MKX of a space X endowed with an alpha-topology satisfies Esakia's GRZ axiom.
An endomorphism on an algebra A is said to be strong if it is compatible with every congruence on A; and A is said to have the strong endomorphism kernel property if every congruence on A, other than the universal congruence, is the kernel of a strong endomorphism on A. Here we characterise the structure of those double MS-algebras that have this property by way of Priestley duality.
Multialgebras have been much studied in mathematics and in computer science. In 2016 Carnielli and Coniglio introduced a class of multialgebras called swap structures, as a semantic framework for dealing with several Logics of Formal Inconsistency that cannot be semantically characterized by a single finite matrix. In particular, these LFIs are not algebraizable by the standard tools of abstract algebraic logic. In this paper, the first steps towards a theory of non-deterministic algebraization of logics by swap structures are given. Specifically, a formal study of swap structures for LFIs is developed, by adapting concepts of universal algebra to multialgebras in a suitable way. A decomposition theorem similar to Birkhoff’s representation theorem is obtained for each class of swap structures. Moreover, when applied to the 3-valued algebraizable logics J3 and Ciore, their classes of algebraic models are retrieved, and the swap structures semantics become twist structures semantics. This fact, together with the existence of a functor from the category of Boolean algebras to the category of swap structures for each LFI, suggests that swap structures can be seen as non-deterministic twist structures. This opens new avenues for dealing with non-algebraizable logics by the more general methodology of multialgebraic semantics.
In contemporary mathematics, a Colombeau algebra of Colombeau generalized functions is an algebra of a certain kind containing the space of Schwartz distributions. While in classical distribution theory a general multiplication of distributions is not possible, Colombeau algebras provide a rigorous framework for this. Such a multiplication of distributions has long been mistakenly believed to be impossible because of Schwartz's impossibility result, which basically states that there cannot be a differential algebra containing the space of distributions and preserving the product of continuous functions. However, if one only wants to preserve the product of smooth functions instead, such a construction becomes possible, as demonstrated first by J.F. Colombeau [1],[2]. As a mathematical tool, Colombeau algebras can be said to combine a treatment of singularities, differentiation and nonlinear operations in one framework, lifting the limitations of distribution theory. These algebras have so far found numerous applications in the fields of partial differential equations, geophysics, microlocal analysis and general relativity.
One of the most expected properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of the deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics are uniquely characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
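Linear pooling, one of the aggregation rules characterized in the abstract above, takes the collective probability of each event on the agenda to be a weighted average of the individuals' probabilities for that event. A minimal sketch in Python; the event names and weights are hypothetical, and the paper itself works with general agendas rather than any particular implementation:

```python
# Linear pooling: the collective probability of each agenda event is a
# weighted average of the individuals' probabilities for that event.

def linear_pool(assignments, weights):
    """assignments: list of dicts mapping event -> probability;
    weights: non-negative weights summing to 1, one per individual."""
    events = assignments[0].keys()
    return {e: sum(w * a[e] for a, w in zip(assignments, weights))
            for e in events}

# Two individuals; an agenda with two events, not closed under
# conjunction or disjunction (hypothetical numbers):
p1 = {"rain": 0.7, "rate_increase": 0.2}
p2 = {"rain": 0.3, "rate_increase": 0.6}
pooled = linear_pool([p1, p2], [0.5, 0.5])
# pooled is approximately {"rain": 0.5, "rate_increase": 0.4}
```

Note that the rule operates event by event, which is why it remains well defined even when the agenda is not a sigma-algebra.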
Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras based on a certain kind of implication which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF, and can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3, *) coincides (up to language) with da Costa and D'Ottaviano logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors and with different motivations such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [(w = w)] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder. This suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3, *), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
In this paper it is shown that Heyting and Co-Heyting mereological systems provide a convenient conceptual framework for spatial reasoning, in which spatial concepts such as connectedness, interior parts, (exterior) contact, and boundary can be defined in a natural and intuitively appealing way. This fact refutes the widespread contention that mereology cannot deal with the more advanced aspects of spatial reasoning and therefore has to be enhanced by further non-mereological concepts to overcome its congenital limitations. The allegedly unmereological concept of boundary is treated in detail and shown to be essentially affected by mereological considerations. More precisely, the concept of boundary turns out to be realizable in a variety of different mereologically grounded versions. In particular, every part K of a Heyting algebra H gives rise to a well-behaved K-relative boundary operator.
Deontic logic is devoted to the study of logical properties of normative predicates such as permission, obligation and prohibition. Since it is usual to apply these predicates to actions, many deontic logicians have proposed formalisms where actions and action combinators are present. Some standard action combinators are action conjunction, choice between actions and not doing a given action. These combinators resemble boolean operators, and therefore the theory of boolean algebra offers a well-known mathematical framework to study the properties of the classic deontic operators when applied to actions. In his seminal work, Segerberg uses constructions coming from boolean algebras to formalize the usual deontic notions. Segerberg’s work provided the initial step to understanding the logical properties of deontic operators when they are applied to actions. In recent years, other authors have proposed related logics. In this chapter we introduce Segerberg’s work, study related formalisms and investigate further challenges in this area.
In 1988, J. Ivlev proposed some (non-normal) modal systems which are semantically characterized by four-valued non-deterministic matrices in the sense of A. Avron and I. Lev. Swap structures are multialgebras (a.k.a. hyperalgebras) of a special kind, which were introduced in 2016 by W. Carnielli and M. Coniglio in order to give a non-deterministic semantical account for several paraconsistent logics known as logics of formal inconsistency, which are not algebraizable by means of the standard techniques. Each swap structure induces naturally a non-deterministic matrix. The aim of this paper is to obtain a swap structures semantics for some Ivlev-like modal systems proposed in 2015 by M. Coniglio, L. Fariñas del Cerro and N. Peron. Completeness results will be stated by means of the notion of Lindenbaum–Tarski swap structures, which constitute a natural generalization to multialgebras of the concept of Lindenbaum–Tarski algebras.
Hybrid languages are introduced in order to evaluate the strength of “minimal” mereologies with relatively strong frame definability properties. Appealing to a robust form of nominalism, I claim that one investigated language $\mathcal{H}_{\mathsf{m}}$ is maximally acceptable for nominalistic mereology. In an extension $\mathcal{H}_{\mathsf{gem}}$ of $\mathcal{H}_{\mathsf{m}}$, a modal analog for the classical systems of Leonard and Goodman and Leśniewski is introduced and shown to be complete with respect to 0-deleted Boolean algebras. We characterize the formulas of first-order logic invariant for $\mathcal{H}_{\mathsf{gem}}$-bisimulations.
We study the general problem of axiomatizing structures in the framework of modal logic and present a uniform method for complete axiomatization of the modal logics determined by a large family of classes of structures of any signature.
Let REL(O*E) be the relation algebra of binary relations defined on the Boolean algebra O*E of regular open regions of the Euclidean plane E. The aim of this paper is to prove that the canonical contact relation C of O*E generates a subalgebra REL(O*E, C) of REL(O*E) that has infinitely many elements. More precisely, REL(O*E, C) contains an infinite family {SPPn, n ≥ 1} of relations generated by the relation SPP (Separable Proper Part). This relation can be used to define a point-free concept of connectedness that for the regular open regions of E coincides with the standard topological notion of connectedness, i.e., a region of the plane E is connected in the sense of topology if and only if it has no separable proper part. Moreover, it is shown that the contact relation algebra REL(O*E, C) and the relation algebra REL(O*E, NTPP) generated by the non-tangential proper parthood relation NTPP coincide. This entails that the allegedly purely topological notion of connectedness can be defined in mereological terms.
Relevance logic has become ontologically fertile. No longer is the idea of relevance restricted in its application to purely logical relations among propositions, for as Dunn has shown in his (1987), it is possible to extend the idea in such a way that we can distinguish also between relevant and irrelevant predications, as for example between “Reagan is tall” and “Reagan is such that Socrates is wise”. Dunn shows that we can exploit certain special properties of identity within the context of standard relevance logic in a way which allows us to discriminate further between relevant and irrelevant properties, as also between relevant and irrelevant relations. The idea yields a family of ontologically interesting results concerning the different ways in which attributes and objects may hang together. Because of certain notorious peculiarities of relevance logic, however, Dunn’s idea breaks down where the attempt is made to have it bear fruit in application to relations among entities which are of homogeneous type.
This book concerns the foundations of epistemic modality. I examine the nature of epistemic modality, when the modal operator is interpreted as concerning both apriority and conceivability, as well as states of knowledge and belief. The book demonstrates how epistemic modality relates to the computational theory of mind; metaphysical modality; the types of mathematical modality; to the epistemic status of undecidable propositions and abstraction principles in the philosophy of mathematics; to the modal profile of rational propositional intuition; and to the types of intention, when the latter is interpreted as a modal mental state. Each essay is informed by either epistemic logic, modal and cylindric algebra or coalgebra, intensional semantics or hyperintensional semantics. The book's original contributions include theories of: (i) epistemic modal algebras and coalgebras; (ii) cognitivism about epistemic modality; (iii) two-dimensional truthmaker semantics, and interpretations thereof; (iv) the ground-theoretic ontology of consciousness; (v) fixed-points in vagueness; (vi) the modal foundations of mathematical platonism; (vii) a solution to the Julius Caesar problem based on metaphysical definitions availing of notions of ground and essence; (viii) the application of epistemic two-dimensional semantics to the epistemology of mathematics; and (ix) a modal logic for rational intuition. I develop, further, (x) a novel approach to conditions of self-knowledge in the setting of the modal $\mu$-calculus, as well as (xi) novel epistemicist solutions to Curry's and the liar paradoxes.
This paper aims to provide a mathematically tractable background against which to model both modal cognitivism and modal expressivism. I argue that epistemic modal algebras comprise a materially adequate fragment of the language of thought. I demonstrate, then, how modal expressivism can be regimented by modal coalgebraic automata, to which the above epistemic modal algebras are dual. I examine, in particular, the virtues unique to the modal expressivist approach here proffered in the setting of the foundations of mathematics, by contrast to competing approaches based upon both the inferentialist approach to concept-individuation and the codification of speech acts via intensional semantics.
In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as Gelfand representation, guarantees that autonomously specified algebras of quantities can be interpreted as algebras of observables on a suitable state space. Such an interpretation is then used to support (i) a realist conception of quantities as objective characteristics of natural systems, and (ii) a realist conception of measurement results (evaluations of quantities) as determined by and descriptive of the states of a target natural system. As a way of motivating the analytic approach to measurement, we begin with a discussion of some serious philosophical and theoretical problems facing the well-known representational theory of measurement. We then explain why we consider the analytic approach, which avoids all these problems, to be far more attractive on both philosophical and theoretical grounds.
The notion of equality between two observables plays many important roles in the foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum theory to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
This paper for an upcoming journal volume examines Grete Hermann's Naturphilosophischen Grundlagen der Quantenmechanik (1935) and the relative context, or perspectival, interpretation of standard quantum mechanics found therein. I find an argument for the emergence of limited spatio-temporal and retrocausal stories, from a chosen experimental perspective, within a larger set of entangled systems not subject to a spatio-temporal interpretation. This argument can be read in reverse as giving some of the necessary preconditions of spatio-temporal representations as based upon perspectival relations, carrying on a Kantian transcendental argument on one hand, and on the other hand, looking forward to Weyl's use of symmetry groups, Lie algebras and their representations in quantum mechanics.
We review our approach to quantum mechanics, adding also some new interesting results. We start by giving proof of two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. This last algebra gives proof of the von Neumann basic postulates on quantum measurement, thus explaining in an algebraic manner the wave function collapse postulated in standard quantum theory. In this manner we reach the objective of exposing a self-consistent version of quantum mechanics. In detail we realize a bare-bones skeleton of quantum mechanics, recovering all the basic foundations of this theory in an algebraic framework. We give proof of the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition we demonstrate the well-known phenomenon of quantum Mach–Zehnder interference using the same algebraic framework, and we give an algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the problem of time evolution of quantum systems as well as the changes in space location, in momentum and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics by using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics. In detail, we give a demonstration of the Kochen–Specker theorem, and we also give an algebraic formulation and explanation of the EPR paradox using only the Clifford algebra. By using the same approach we also derive Bell inequalities. Our formulation is strongly based on the use of idempotents that are contained in the Clifford algebra.
Their counterpart in quantum mechanics is represented by the projection operators that, as is well known, are interpreted as logical statements, following the basic von Neumann results. Von Neumann realized a matrix logic on the basis of quantum mechanics. Using the Clifford algebra we are able to invert this result. Following the results previously obtained by Orlov in 1994, we are able to give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in the logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory in the sense that it looks from one side to “matter per se”, thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: there are stages of our reality in which we can no longer separate the logic (and thus cognition and thus conceptual entities) from the features of “matter per se”. In quantum mechanics the logic, and thus the cognition and thus the conceptual entity-cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality in which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link is established from the start in this theory between physics and conceptual entities. Finally, in this approach there is not an absolute definition of logical truths. Transformations, and thus “redefinitions”, of truth values are permitted in this scheme, as are the well-established invariance principles.
Scroggs's theorem on the extensions of S5 is an early landmark in the modern mathematical studies of modal logics. From it, we know that the lattice of normal extensions of S5 is isomorphic to the inverse order of the natural numbers with infinity and that all extensions of S5 are in fact normal. In this paper, we consider extending Scroggs's theorem to modal logics with propositional quantifiers governed by the axioms and rules analogous to the usual ones for ordinary quantifiers. We call them Π-logics. Taking S5Π, the smallest normal Π-logic extending S5, as the natural counterpart to S5 in Scroggs's theorem, we show that all normal Π-logics extending S5Π are complete with respect to their complete simple S5 algebras, that they form a lattice that is isomorphic to the lattice of the open sets of the disjoint union of two copies of the one-point compactification of N, that they have arbitrarily high Turing-degrees, and that there are non-normal Π-logics extending S5Π.
In this paper the class of Fidel-structures for the paraconsistent logic mbC is studied from the point of view of Model Theory and Category Theory. The basic point is that Fidel-structures for mbC (or mbC-structures) can be seen as first-order structures over the signature of Boolean algebras expanded by two binary predicate symbols N (for negation) and O (for the consistency connective) satisfying certain Horn sentences. This perspective allows us to consider notions and results from Model Theory in order to analyze the class of mbC-structures. Thus, substructures, union of chains, direct products, direct limits, congruences and quotient structures can be analyzed under this perspective. In particular, a Birkhoff-like representation theorem for mbC-structures as subdirect products in terms of subdirectly irreducible mbC-structures is obtained by adapting a general result for first-order structures due to Caicedo. Moreover, a characterization of all the subdirectly irreducible mbC-structures is also given. An alternative decomposition theorem is obtained by using the notions of weak substructure and weak isomorphism considered by Fidel for Cn-structures.
We study a new formal logic LD introduced by Prof. Grzegorczyk. The logic is based on so-called descriptive equivalence, corresponding to the idea of shared meaning rather than shared truth value. We construct a semantics for LD based on a new type of algebras and prove its soundness and completeness. We further show several examples of classical laws that hold for LD as well as laws that fail. Finally, we list a number of open problems.
The indefinability of concepts is explored through the idea of a conceptual scheme. Using the Stone duality of Boolean algebras, indefinable concepts are categorized as specific types of subspaces. Additionally, indefinability is formulated as a type of algebraic independence, and conceptual atomism is investigated from a mathematical perspective.
The aim of this paper is to present a topological method for constructing discretizations (tessellations) of conceptual spaces. The method works for a class of topological spaces that the Russian mathematician Pavel Alexandroff defined more than 80 years ago. Alexandroff spaces, as they are called today, have many interesting properties that distinguish them from other topological spaces. In particular, they exhibit a 1-1 correspondence between their specialization orders and their topological structures. Recently, a special type of Alexandroff spaces was used by Ian Rumfitt to elucidate the logic of vague concepts in a new way. According to his approach, conceptual spaces such as the color spectrum give rise to classical systems of concepts that have the structure of atomic Boolean algebras. More precisely, concepts are represented as regular open regions of an underlying conceptual space endowed with a topological structure. Something is subsumed under a concept iff it is represented by an element of the conceptual space that is maximally close to the prototypical element p that defines that concept. This topological representation of concepts comes along with a representation of the familiar logical connectives of Aristotelian syllogistics in terms of natural set-theoretical operations that characterize regular open interpretations of classical Boolean propositional logic. In the last 20 years, conceptual spaces have become a popular tool for dealing with a variety of problems in the fields of cognitive psychology, artificial intelligence, linguistics and philosophy, mainly due to the work of Peter Gärdenfors and his collaborators. By using prototypes and metrics of similarity spaces, one obtains geometrical discretizations of conceptual spaces by so-called Voronoi tessellations. These tessellations are extensionally equivalent to topological tessellations that can be constructed for Alexandroff spaces.
Thereby, Rumfitt’s and Gärdenfors’s constructions turn out to be special cases of an approach that works for a more general class of spaces, namely, for weakly scattered Alexandroff spaces. This class of spaces provides a convenient framework for conceptual spaces as used in epistemology and related disciplines in general. Alexandroff spaces are useful for elucidating problems related to the logic of vague concepts; in particular, they offer a solution of the Sorites paradox (Rumfitt). Further, they provide a semantics for the logic of clearness (Bobzien) that overcomes certain problems of the concept of higher-order vagueness. Moreover, these spaces help find a natural place for classical syllogistics in the framework of conceptual spaces. The crucial role of order theory for Alexandroff spaces can be used to refine the all-or-nothing distinction between prototypical and nonprototypical stimuli in favor of a more fine-grained gradual distinction between more-or-less prototypical elements of conceptual spaces. The greater conceptual flexibility of the topological approach helps avoid some inherent inadequacies of the geometrical approach, for instance, the so-called “thickness problem” (Douven et al.) and problems of selecting a unique metric for similarity spaces. Finally, it is shown that only the Alexandroff account can deal with an issue that is gaining more and more importance for the theory of conceptual spaces, namely, the role that digital conceptual spaces play in the area of artificial intelligence, computer science and related disciplines. Keywords: Conceptual Spaces, Polar Spaces, Alexandroff Spaces, Prototypes, Topological Tessellations, Voronoi Tessellations, Digital Topology.
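The Voronoi discretizations mentioned above amount to a nearest-prototype rule: each element of a conceptual space falls under the concept whose prototype is closest to it under the chosen metric, and the resulting cells tessellate the space. A minimal Python sketch under that reading, with hypothetical prototype coordinates on a toy one-dimensional spectrum:

```python
import math

# Nearest-prototype classification: assigning each point of a conceptual
# space to the concept with the closest prototype yields the cells of a
# Voronoi tessellation under the Euclidean metric.

def classify(point, prototypes):
    """prototypes: dict mapping concept label -> prototype coordinates."""
    return min(prototypes,
               key=lambda concept: math.dist(point, prototypes[concept]))

# A toy one-dimensional "color spectrum" with two prototypes:
prototypes = {"red": (0.0,), "orange": (1.0,)}
assert classify((0.2,), prototypes) == "red"
assert classify((0.9,), prototypes) == "orange"
```

Points equidistant from two prototypes sit on a cell boundary; this sketch breaks such ties arbitrarily, whereas the topological treatment in the paper handles boundary regions explicitly.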
A new advanced systems theory concerning the emergent nature of the Social, Consciousness, and Life based on mathematical and physical analogies is presented. This meta-theory concerns the distance between the emergent levels of these phenomena and their ultra-efficacious nature. The theory is based on the distinction between Systems and Meta-systems (organized Openscape environments). We first realize that we can understand the difference between the System and the Meta-system in terms of the relationship between a ‘Whole greater than the sum of the parts’ and a ‘Whole less than the sum of its parts’, i.e., a whole full of holes (like a sponge) that provide niches for systems in the environment. Once we understand this distinction and clarify the nature of the unusual organization of the Meta-system, then it is possible to understand that there is a third possibility: a whole exactly equal to the sum of its parts, which is only supervenient, like perfect numbers. In fact, there are three kinds of Special System corresponding to the perfect, amicable, and sociable aliquot numbers. These are all equal to the sum of their parts but with different degrees of differing and deferring in what Jacques Derrida calls “differance”. All other numbers are either excessive (systemic) or deficient (metasystemic) in this regard. The Special Systems are based on various mathematical analogies and some physical analogies. But the most important of the mathematical analogies are the hypercomplex algebras, which include the Complex Numbers, Quaternions, and Octonions, with the Sedenions corresponding to the Emergent Meta-system. However, other analogies are the Hopf fibrations between hyperspheres of various dimensions, nonorientable surfaces, soliton solutions, etc. These Special Systems have a long history within the tradition since they can be traced back to the imaginary cities of Plato.
The Emergent Meta-system is a higher-order global structure that includes the System with the three Special Systems as a cycle. An example of this from our tradition is in the Monadology of Gottfried Wilhelm von Leibniz. There is a conjunctive relationship between the System schema and the Special Systems that produces the Meta-system schema cycle. The Special Systems are a meta-model for the relationship between the emergent levels of Consciousness (Dissipative Ordering, based on the theory of negative entropy of Prigogine), Living (Autopoietic Symbiotic, based on the theory of Maturana and Varela), and Social (Reflexive, based on the theory of John O’Malley and Barry Sandywell). These different Special Systems are related to the various existentials identified by Martin Heidegger in Being and Time and various temporal reference frames identified by Richard M. Pico. We also relate the Special Systems to the morphodynamic and teleodynamic systems of Terrence Deacon in Incomplete Nature, to which we add sociodynamic systems to complete the series of Special Systems.
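The number-theoretic analogy invoked in the abstract above (perfect versus excessive and deficient numbers, and the amicable pairs, defined via aliquot sums) can be stated concretely. The following is a small illustrative sketch of the standard definitions only, not of the systems theory itself.

```python
# A number n is classified by its aliquot sum (the sum of its proper divisors):
# perfect if the sum equals n, deficient if it falls short, and
# abundant ("excessive") if it exceeds n.
def aliquot_sum(n):
    """Sum of the proper divisors of n (divisors strictly less than n)."""
    return sum(d for d in range(1, n) if n % d == 0)

def classify(n):
    s = aliquot_sum(n)
    if s == n:
        return "perfect"
    return "deficient" if s < n else "abundant"

# 6 and 28 are perfect; 220 and 284 form the classical amicable pair,
# since each is the aliquot sum of the other.
```

Sociable numbers generalize amicable pairs to longer cycles under repeated application of the aliquot sum.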
In this paper, I define and study an abstract algebraic structure, the dimensive algebra, which embodies the most general features of the algebra of dimensional physical quantities. I prove some elementary results about dimensive algebras and suggest some directions for future work.
In this paper we recall, improve, and extend several definitions, properties, and applications from our previous 2019 research on NeutroAlgebras and AntiAlgebras (also called NeutroAlgebraic Structures and AntiAlgebraic Structures, respectively). Let <A> be an item (concept, attribute, idea, proposition, theory, etc.). Through the process of neutrosophication, we split the nonempty space we work on into three regions {two opposite ones corresponding to <A> and <antiA>, and one corresponding to the neutral (indeterminate) <neutA> (also denoted <neutroA>) between the opposites}, which may or may not be disjoint, depending on the application, but which are exhaustive (their union equals the whole space). A NeutroAlgebra is an algebra which has at least one NeutroOperation or one NeutroAxiom (an axiom that is true for some elements, indeterminate for other elements, and false for the remaining elements). A Partial Algebra is an algebra that has at least one Partial Operation, and all of whose Axioms are classical (i.e., axioms true for all elements). Through a theorem we prove that NeutroAlgebra is a generalization of Partial Algebra, and we give examples of NeutroAlgebras that are not Partial Algebras. We also introduce the NeutroFunction (and NeutroOperation).
We study the general problem of axiomatizing structures in the framework of modal logic and present a uniform method for complete axiomatization of the modal logics determined by a large family of classes of structures of any signature.
In the study of modal and nonclassical logics, translations have frequently been employed as a way of measuring the inferential capabilities of a logic. It is sometimes claimed that two logics are “notational variants” if they are translationally equivalent. However, we will show that this cannot be quite right, since first-order logic and propositional logic are translationally equivalent. Others have claimed that for two logics to be notational variants, they must at least be compositionally intertranslatable. The definition of compositionality these accounts use, however, is too strong, as the standard translation from modal logic to first-order logic is not compositional in this sense. In light of this, we will explore a weaker version of this notion that we will call schematicity and show that there is no schematic translation either from first-order logic to propositional logic or from intuitionistic logic to classical logic.
I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or “Dutch book”) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.
The paper is an introduction to geometric algebra and geometric calculus for those with a knowledge of undergraduate mathematics. No knowledge of physics is required. The section Further Study lists many papers available on the web.