One of the most desirable properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of its deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are not self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics can only be characterized by semantics of a non-deterministic kind. This paper offers a solution to two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
The present dissertation examines Carrollian logic through a reconstruction of its syllogistic theory. Lewis Carroll was among those chiefly responsible for the dissemination of logic during the nineteenth century, but most of his logical writings remained unknown until a posthumous publication of 1977. The reconstruction of the Carrollian syllogistic theory is based on a comparison of the author's two books on logic, "The Game of Logic" and "Symbolic Logic". The analysis of Carrollian syllogistics starts from a study of the historical context in which the logic developed and of the developments of syllogistics prior to the author's contribution. Situated in the historical period of algebraic logic, Carrollian syllogistics is characterized as a conservative extension of Aristotelian syllogistics, its main innovations being the use of negative terms and the introduction of a diagrammatic method suitable for representing them. The diagrammatic method of Carrollian syllogistics presents advances over the methods of Euler and Venn. The use of negative terms also requires a redefinition of the notion of syllogism, simplifying and expanding the range of arguments amenable to logical treatment. Carroll uses not four but only three categorical propositions in his syllogistic, with an interpretation of existential presuppositions congruent with a syntactic-existential reading. Carrollian syllogistics employs some techniques found in the work of the algebraists of logic, and also falls into the same confusions between the notions of "class" and "member" that were common in the period. Convinced of the social utility of logic and dedicated to popularizing it, Carroll prioritized the creation of new didactics for the teaching of logic in his works, in which he could include his diagrammatic method for solving syllogisms. Carroll offered only scant remarks on his conception of logic.
Based on the scattered remarks found throughout the study and on his constant claim of the social utility of logic, it is suggested that Carroll is close to the so-called pragmatic position, which regards logic as an instrument for the regulation of discourse.
Starting in 1985, we pointed out the possible existence of electrons with net helicity in biomolecules such as amino acids, and their possible ability to discern between the two quantum spin states. It is well known that the question of a possible fundamental role of quantum mechanics in biological matter is still the subject of a long debate. In the last ten years we have given a rather complete quantum mechanical elaboration entirely based on Clifford algebra, whose basic entities are isomorphic to the well-known Pauli spin matrices. A number of our recent results indicate the possible logical origin of quantum mechanics and the direct admission of quantum mechanics into the field of cognitive science. In February 2011, Göhler et al. published in Science their important discovery of spin selectivity in electron transmission through self-assembled monolayers of double-stranded DNA, confirming in this manner that the principles of quantum mechanics apply to biological systems.
Robert Batterman's ontological insights (2002, 2004, 2005) are apt: Nature abhors singularities. "So should we," responds the physicist. However, Batterman's epistemic assessments of the matter prove less clear, for in the same vein he writes that singularities play an essential role in certain classes of physical theories referring to certain types of critical phenomena. I devise a procedure ("methodological fundamentalism") which exhibits how singularities, at least in principle, may be avoided within the same classes of formalisms discussed by Batterman. I show that we need not accept a divergence between explanation and reduction (Batterman 2002), or between epistemological and ontological fundamentalism (Batterman 2004, 2005). Though I remain sympathetic to the 'principle of charity' (Frisch 2005), which appears to favor a pluralist outlook, I nevertheless call into question some of the forms such pluralist implications take in Batterman's conclusions. It is difficult to reconcile some of the pluralist assessments that he and some of his contemporaries advocate with what appears to be a countervailing trend in a burgeoning research tradition known as Clifford (or geometric) algebra. In my critical chapters (2 and 3) I use some of the demonstrated formal unity of Clifford algebra to argue that Batterman (2002) conflates a physical theory's ontology with its purely mathematical content. Carefully distinguishing the two, and employing Clifford algebraic methods, reveals a symmetry between reduction and explanation that Batterman overlooks. I refine this point by noting that geometric algebraic methods are an active area of research in computational fluid dynamics and, applied to modeling the behavior of droplet formation, appear to instantiate a "methodologically fundamental" approach.
I argue in my introductory and concluding chapters that the model of inter-theoretic reduction and explanation offered by Fritz Rohrlich (1988, 1994) provides the best framework for reconciling the burgeoning pluralism in philosophical studies of physics with the presumed claims of formal unification demonstrated by physicists' choices of mathematical formalisms such as Clifford algebra. I show how Batterman's insights can be reconstructed in Rohrlich's framework, preserving Batterman's important philosophical work minus what I consider to be his incorrect conclusions.
This paper investigates a generalization of Boolean algebras which I call agglomerative algebras. It also outlines two conceptions of propositions according to which they form an agglomerative algebra but not a Boolean algebra with respect to conjunction and negation.
Hyperboolean algebras are Boolean algebras with operators, constructed as algebras of complexes (or power structures) of Boolean algebras. They provide an algebraic semantics for a modal logic (called here a hyperboolean modal logic) with a Kripke semantics accordingly based on frames in which the worlds are elements of Boolean algebras and the relations correspond to the Boolean operations. We introduce the hyperboolean modal logic, give a complete axiomatization of it, and show that it lacks the finite model property. The method of axiomatization hinges upon the fact that a "difference" operator is definable in hyperboolean algebras, and makes use of additional non-Hilbert-style rules. Finally, we discuss a number of open questions and directions for further research.
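The complex (power-structure) construction behind hyperboolean algebras can be illustrated in miniature. The sketch below, an assumption of this note rather than anything from the paper, lifts the join of the two-element Boolean algebra to its non-empty subsets; the helper names `complexes` and `lift` are hypothetical.

```python
from itertools import product

# Two-element Boolean algebra B = {0, 1} with join and meet.
B = [0, 1]

def complexes(carrier):
    """All non-empty subsets (complexes) of the carrier, as frozensets."""
    out = []
    for mask in range(1, 2 ** len(carrier)):
        out.append(frozenset(x for i, x in enumerate(carrier) if mask >> i & 1))
    return out

def lift(op, X, Y):
    """Pointwise lift of a binary operation to complexes: all pairwise results."""
    return frozenset(op(a, b) for a, b in product(X, Y))

join = lambda a, b: a | b
meet = lambda a, b: a & b

cs = complexes(B)  # {0}, {1}, {0, 1}
# Lifted join of {0, 1} with itself collects every pairwise join.
print(sorted(lift(join, frozenset(B), frozenset(B))))  # [0, 1]
```

Operators obtained this way are the "relations corresponding to the Boolean operations" in the frame semantics described above.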
Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras, based on a certain kind of implication, which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF and can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3,*) coincides (up to language) with da Costa and D'Ottaviano's logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors, with different motivations and under names such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC obtained by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [(w = w)] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder, which suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3,*), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
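To make the twist-structure idea mentioned above concrete, here is a minimal sketch over the two-element Boolean algebra, assuming the standard twist operations (negation swaps coordinates, conjunction is computed coordinatewise); the full signatures of LPT0 and PS3 contain further connectives not shown here.

```python
from itertools import product

# Twist structure over the two-element Boolean algebra {0, 1}:
# elements are pairs (a, b) with a | b == 1, where a tracks "truth"
# and b tracks "falsity"; both may hold at once.
carrier = [(a, b) for a, b in product((0, 1), repeat=2) if a | b == 1]
# -> [(0, 1), (1, 0), (1, 1)]: false, true, and the "inconsistent" value.

def neg(x):
    a, b = x
    return (b, a)  # paraconsistent negation: swap truth and falsity

def conj(x, y):
    (a, b), (c, d) = x, y
    return (a & c, b | d)

# The inconsistent value (1, 1) is a fixed point of negation: a sentence
# and its negation can both be designated (first coordinate 1) without
# the structure collapsing into triviality.
print(neg((1, 1)))           # (1, 1)
print(conj((1, 1), (0, 1)))  # (0, 1): conjunction with "false" is false
```

The three pairs play the role of the three truth values of J3; genuinely inconsistent sets with value 1/2, as in the abstract, correspond to the middle element (1, 1).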
We present an algebraic account of the Tongan kinship terminology (TKT) that provides an insightful journey into the fabric of Tongan culture. We begin with the ethnographic account of a social event. The account provides us with the activities of that day and the centrality of kin relations in the event, but it does not inform us of the conceptual system that the participants bring with them. Rather, it is a slice in time of an ongoing dynamic process that links behavior with a conceptual system of kin relations and vice versa. To understand this interplay, we need an account of the underlying conceptual system that is being activated during the event. Thus, we introduce a formal, algebraically based account of TKT. This account brings to the fore the underlying logic of TKT and allows us to distinguish between features of the kinship system that arise from the logic of TKT as a generative structure and features that must have arisen through cultural intervention.
Mathematical diagrams are frequently used in contemporary mathematics. They are, however, widely seen as not contributing to the justificatory force of proofs: they are considered to be either mere illustrations or shorthand for non-diagrammatic expressions. Moreover, when they are used inferentially, they are seen as threatening the reliability of proofs. In this paper, I examine certain examples of diagrams that resist this type of dismissive characterization. By presenting two diagrammatic proofs, one from topology and one from algebra, I show that diagrams form genuine notational systems, and I argue that this explains why they can play a role in the inferential structure of proofs without undermining their reliability. I then consider whether diagrams can be essential to the proofs in which they appear.
Multialgebras have been much studied in mathematics and in computer science. In 2016 Carnielli and Coniglio introduced a class of multialgebras called swap structures, as a semantic framework for dealing with several Logics of Formal Inconsistency that cannot be semantically characterized by a single finite matrix. In particular, these LFIs are not algebraizable by the standard tools of abstract algebraic logic. In this paper, the first steps towards a theory of non-deterministic algebraization of logics by swap structures are given. Specifically, a formal study of swap structures for LFIs is developed, by adapting concepts of universal algebra to multialgebras in a suitable way. A decomposition theorem similar to Birkhoff’s representation theorem is obtained for each class of swap structures. Moreover, when applied to the 3-valued algebraizable logics J3 and Ciore, their classes of algebraic models are retrieved, and the swap structures semantics become twist structures semantics. This fact, together with the existence of a functor from the category of Boolean algebras to the category of swap structures for each LFI, suggests that swap structures can be seen as non-deterministic twist structures. This opens new avenues for dealing with non-algebraizable logics by the more general methodology of multialgebraic semantics.
The aim of this article is to investigate the roles of commutative diagrams (CDs) in a specific mathematical domain, and to unveil the reasons underlying their effectiveness as a mathematical notation; this will be done through a case study. It will be shown that CDs do not depict spatial relations, but represent mathematical structures. CDs will be interpreted as a hybrid notation that goes beyond the traditional bipartition of mathematical representations into diagrammatic and linguistic. It will be argued that one of the reasons why CDs form a good notation is that they are highly mathematically tractable: experts can obtain valid results by ‘calculating’ with CDs. These calculations take the form of ‘diagram chases’: in order to draw inferences, experts move algebraic elements around the diagrams. It will be argued that these diagrams are dynamic, and that it is thanks to this dynamicity that CDs can externalize the relevant reasoning and allow experts to draw conclusions directly by manipulating them. Lastly, it will be shown that CDs play essential roles in the context of proof as well as in other phases of the mathematical enterprise, such as discovery and conjecture formation.
We give a complete axiomatization of the identities of the basic game algebra valid with respect to the abstract game board semantics. We also show that the additional conditions of termination and determinacy of game boards do not introduce new valid identities. En route we introduce a simple translation of game terms into plain modal logic, and thus translate game identities into modal formulae while preserving validity both ways. The completeness proof is based on the reduction of game terms to a certain 'minimal canonical form' using only the axiomatic identities, and on showing that the equivalence of two minimal canonical terms can be established from these identities.
In abstract algebraic logic, many systems, such as the paraconsistent logics taking inspiration from da Costa's hierarchy, are not algebraizable even by the broadest standard methodologies, such as that of Blok and Pigozzi. However, these logics can be semantically characterized by means of non-deterministic algebraic structures such as Nmatrices, RNmatrices and swap structures. These structures are based on multialgebras, which generalize algebras by allowing the result of an operation to be a non-empty set of values. This leads to an interest in exploring the foundations of multialgebras applied to the study of logical systems. It is well known from universal algebra that, for every signature Sigma, there exist algebras over Sigma which are absolutely free, meaning that they do not satisfy any identities or, alternatively, that they satisfy the universal mapping property for the class of Sigma-algebras. Furthermore, once we fix the cardinality of the generating set, they are unique up to isomorphism, and equal to algebras of terms (or of propositional formulas, in the context of logic). Equivalently, the forgetful functor from the category of Sigma-algebras to Set has a left adjoint. This result does not extend to multialgebras: not only do multialgebras satisfying the universal mapping property not exist, but the forgetful functor U, from the category of Sigma-multialgebras to Set, does not have a left adjoint. In this paper we generalize, in a natural way, algebras of terms to multialgebras of terms, whose family of submultialgebras enjoys many properties of the former. One example is that, to every pair consisting of a function, from a submultialgebra of a multialgebra of terms to another multialgebra, and a collection of choices (which selects how a homomorphism approaches indeterminacies), there corresponds a unique homomorphism, which resembles the universal mapping property. Another example is that the multialgebras of terms are generated by a set that may be viewed as a strong basis, which we call the ground of the multialgebra. Submultialgebras of multialgebras of terms are what we call weakly free multialgebras. Finally, with these definitions at hand, we offer a simple proof that multialgebras with the universal mapping property for the class of all multialgebras do not exist, and that U does not have a left adjoint.
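The notion of an operation returning a set of values can be sketched with a toy example. The multialgebra below, with its operation `mop` and evaluator `eval_term`, is an invented illustration, not a construction from the paper: a single binary operation over {0, 1, 2} that is deterministic except on one argument pair, where an indeterminacy admits two outcomes.

```python
# A multialgebra: each operation returns a non-empty *set* of results.
def mop(x, y):
    if (x, y) == (1, 1):
        return {0, 2}        # indeterminacy: two possible outcomes
    return {(x + y) % 3}     # deterministic elsewhere

def eval_term(term):
    """Evaluate a nested term bottom-up; the result is the set of all
    values the term can take under every resolution of indeterminacies."""
    if isinstance(term, int):
        return {term}
    left, right = term
    return {v for a in eval_term(left) for b in eval_term(right)
            for v in mop(a, b)}

# ((1 * 1) * 1) can come out as 0 or 1, depending on how the
# indeterminacy in the inner application is resolved.
print(sorted(eval_term(((1, 1), 1))))  # [0, 1]
```

Evaluating a term no longer yields a single value, which is exactly why term multialgebras fail the universal mapping property: a homomorphism out of them must be supplemented by a "collection of choices" resolving each indeterminacy, as the abstract describes.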
Deontic logic is devoted to the study of the logical properties of normative predicates such as permission, obligation and prohibition. Since it is usual to apply these predicates to actions, many deontic logicians have proposed formalisms where actions and action combinators are present. Some standard action combinators are action conjunction, choice between actions, and not doing a given action. These combinators resemble Boolean operators, and therefore the theory of Boolean algebra offers a well-known mathematical framework for studying the properties of the classic deontic operators when applied to actions. In his seminal work, Segerberg uses constructions coming from Boolean algebras to formalize the usual deontic notions. Segerberg’s work provided the initial step towards understanding the logical properties of deontic operators when they are applied to actions. In recent years, other authors have proposed related logics. In this chapter we introduce Segerberg’s work, study related formalisms and investigate further challenges in this area.
An elementary algebra identifies conceptual and corresponding applicational limitations in John Kemeny and Paul Oppenheim’s (K-O) 1956 model of theoretical reduction in the sciences. The K-O model was once widely accepted, at least in spirit, but seems afterward to have been discredited, or in any event superseded. Today, the K-O reduction model is seldom mentioned, except to clarify when a reduction in the Kemeny-Oppenheim sense is not intended. The present essay takes a fresh look at the basic mathematics of K-O comparative-vocabulary theoretical term reductions, from historical and philosophical standpoints, as a contribution to the history of the philosophy of science. The K-O model qualifies a theory replacement as a successful reduction when preconditions of explanatory adequacy and comparable systematization are met, and when there are fewer theoretical terms, identified as replicable syntax types in the most economical statement of a theory’s putative propositional truths, than the theoretical term count for the theory it replaces. The challenge to the historical model developed here, intended to help explain its scope and limitations, involves the potential for equivocal theoretical meanings among multiple theoretical term tokens of the same syntactical type.
We deepen the study of conjoined and disjoined conditional events in the setting of coherence. These objects, differently from other approaches, are defined in the framework of conditional random quantities. We show that some well-known properties, valid in the case of unconditional events, still hold in our approach to logical operations among conditional events. In particular, we prove a decomposition formula and a related additive property. Then, we introduce the set of conditional constituents generated by $n$ conditional events and show that they satisfy the basic properties valid in the case of unconditional events. We obtain a generalized inclusion-exclusion formula, which can be interpreted by introducing a suitable distributive property. Moreover, under logical independence of the basic unconditional events, we give two necessary and sufficient coherence conditions. The first gives a geometrical characterization of the coherence of prevision assessments on a family $F$ constituted by $n$ conditional events and all possible conjunctions among them. The second characterizes the coherence of prevision assessments defined on $F\cup K$, where $K$ is the set of conditional constituents associated with the conditional events in $F$. Then, we give some further theoretical results, and we examine some examples and counterexamples. Finally, we make a comparison with other approaches and illustrate some theoretical aspects and applications.
Scroggs's theorem on the extensions of S5 is an early landmark in the modern mathematical studies of modal logics. From it, we know that the lattice of normal extensions of S5 is isomorphic to the inverse order of the natural numbers with infinity and that all extensions of S5 are in fact normal. In this paper, we consider extending Scroggs's theorem to modal logics with propositional quantifiers governed by the axioms and rules analogous to the usual ones for ordinary quantifiers. We call them Π-logics. Taking S5Π, the smallest normal Π-logic extending S5, as the natural counterpart to S5 in Scroggs's theorem, we show that all normal Π-logics extending S5Π are complete with respect to their complete simple S5 algebras, that they form a lattice that is isomorphic to the lattice of the open sets of the disjoint union of two copies of the one-point compactification of N, that they have arbitrarily high Turing-degrees, and that there are non-normal Π-logics extending S5Π.
There are two foundational, but not fully developed, ideas in paraconsistency, namely, the duality between the paraconsistent and intuitionistic paradigms, and the introduction of logical operators that express meta-logical notions in the object language. The aim of this paper is to show how these two ideas can be adequately accomplished by the Logics of Formal Inconsistency (LFIs) and by the Logics of Formal Undeterminedness (LFUs). LFIs recover the validity of the principle of explosion in a paraconsistent scenario, while LFUs recover the validity of the principle of excluded middle in a paracomplete scenario. We introduce definitions of duality between inference rules and connectives that allow comparing rules and connectives belonging to different logics. Two formal systems are studied, the logics mbC and mbD, which display the duality between paraconsistency and paracompleteness as a duality between inference rules added to a common core; in the case studied here, this common core is classical positive propositional logic (CPL+). The logics mbC and mbD are equipped with recovery operators that restore classical logic for, respectively, consistent and determined propositions. These two logics are then combined, obtaining a pair of logics of formal inconsistency and undeterminedness (LFIUs), namely mbCD and mbCDE. The logic mbCDE exhibits some nice duality properties. Besides, it is simultaneously paraconsistent and paracomplete, and able to recover the principles of excluded middle and explosion at once. The last sections offer an algebraic account of these logics by adapting the swap-structures semantics framework of the LFIs to the LFUs. This semantics highlights some subtle aspects of these logics, and allows us to prove decidability by means of finite non-deterministic matrices.
We start from previous studies of G.N. Ord and A.S. Deakin showing that the classical diffusion equation and the Schrödinger equation of quantum mechanics have a common stump. This result is obtained in rigorous terms, since it is demonstrated that both the diffusion and Schrödinger equations are manifestations of the same mathematical axiomatic set of the Clifford algebra. By using both the A(Si) and the N_{i,±1} algebras, it is evidenced, however, that the two basic equations of physics possibly cannot be reconciled.
Gauss’s quadratic reciprocity theorem is among the most important results in the history of number theory. It’s also among the most mysterious: since its discovery in the late 18th century, mathematicians have regarded reciprocity as a deeply surprising fact in need of explanation. Intriguingly, though, there’s little agreement on how the theorem is best explained. Two quite different kinds of proof are most often praised as explanatory: an elementary argument that gives the theorem an intuitive geometric interpretation, due to Gauss and Eisenstein, and a sophisticated proof using algebraic number theory, due to Hilbert. Philosophers have yet to look carefully at such explanatory disagreements in mathematics. I do so here. According to the view I defend, there are two important explanatory virtues—depth and transparency—which different proofs (and other potential explanations) possess to different degrees. Although not mutually exclusive in principle, the packages of features associated with the two stand in some tension with one another, so that very deep explanations are rarely transparent, and vice versa. After developing the theory of depth and transparency and applying it to the case of quadratic reciprocity, I draw some morals about the nature of mathematical explanation.
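The theorem under discussion can be stated and checked mechanically. The sketch below uses Euler's criterion to compute Legendre symbols and verifies Gauss's law, (p|q)(q|p) = (-1)^(((p-1)/2)((q-1)/2)), on small odd primes; it is a numerical illustration only, not one of the two explanatory proofs the abstract contrasts.

```python
# Legendre symbol (a|p) for odd prime p, via Euler's criterion:
# a^((p-1)/2) mod p, mapped into {1, -1, 0}.
def legendre(a, p):
    r = pow(a, (p - 1) // 2, p)
    return -1 if r == p - 1 else r

def reciprocity_sign(p, q):
    # Gauss's law: (p|q)(q|p) = (-1)^(((p-1)/2) * ((q-1)/2))
    return (-1) ** (((p - 1) // 2) * ((q - 1) // 2))

primes = [3, 5, 7, 11, 13, 17, 19, 23]
for i, p in enumerate(primes):
    for q in primes[i + 1:]:
        assert legendre(p, q) * legendre(q, p) == reciprocity_sign(p, q)
print("quadratic reciprocity verified for all pairs from", primes)
```

A check like this confirms instances but, as the abstract emphasizes, explains nothing about why the law holds; that gap is precisely what the depth/transparency analysis addresses.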
In this essay I will defend three points, the first being that Descartes, unlike the Aristotelian tradition, maintained that abstraction is not an operation in which the intellect builds the mathematical object by resorting to sensible objects. Secondly, I will demonstrate that, according to Cartesian philosophy, the faculty of understanding has the ability to instantiate, within the process of abstraction, mathematical symbols that represent the relations between quantities, whether magnitude or multitude. And finally I will advocate that the lack of ontological commitment to sensible experience found in Cartesian philosophy of mathematics allows for the creation of a mathematical language that regards the objects of geometry and arithmetic through a system of rules and notations; in other words, algebra.
We comment on some recent results obtained by using a Clifford bare-bone skeleton of quantum mechanics in order to formulate the conclusion that quantum mechanics has its origin in logic and relates conceptual entities. Such results touch directly the basic problem of the structure of our cognitive and conceptual dynamics, and thus of our mind; the problem of exploring consciousness consequently turns out to be strongly linked. This is why studies on quantum mechanics applied to this matter are so important for neurologists and psychologists. Under this profile, we present some experimental results showing violation of the Bell inequality during the MBTI test in an investigation of C.G. Jung’s theory of personality.
We study the general problem of axiomatizing structures in the framework of modal logic and present a uniform method for complete axiomatization of the modal logics determined by a large family of classes of structures of any signature.
The main algebraic foundations of quantum mechanics, as suggested from the birth of this theory up to recent years, are quickly reviewed. They are the following: Heisenberg-Born-Jordan’s (1925), Weyl’s (1928), Dirac’s (1930), von Neumann’s (1936), Segal’s (1947), T.F. Jordan’s (1986), Morchio and Strocchi’s (2009) and Buchholz and Fredenhagen’s (2019). Four cases are stressed: 1) the misinterpretation of Dirac’s algebraic foundation; 2) von Neumann’s ‘conversion’ from the analytic approach of Hilbert space to the algebraic approach of the rings of operators; 3) Morchio and Strocchi’s improvement of Dirac’s analogy between commutators and Poisson brackets into an exact equivalence; 4) the recent foundation of quantum mechanics upon the algebra of perturbations. Some considerations on the alternating theoretical importance of the algebraic approach in the history of QM are offered. The level of formalism has increased from the mere introduction of matrices to group theory and C*-algebras, but has not led to a definition of the foundations of physics; in particular, an algebraic formulation of QM organized as a problem-based theory making exclusive use of constructive mathematics is still to be discovered.
The objective of this paper is to study algebraic properties of neutrosophic matrices. A necessary and sufficient condition for the invertibility of a square neutrosophic matrix is presented by defining the neutrosophic determinant. The paper also introduces the concept of neutrosophic eigenvalues and eigenvectors, with an easy algorithm to compute them, and finds a necessary and sufficient condition for the diagonalization of a neutrosophic matrix.
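The scalar arithmetic underlying such matrices can be sketched briefly. Assuming the usual convention for the indeterminacy element, I·I = I, so that (a + bI)(c + dI) = ac + (ad + bc + bd)I, a neutrosophic number a + bI is invertible exactly when a ≠ 0 and a + b ≠ 0; the function names below are hypothetical, and the paper's matrix-level determinant condition builds on this entrywise arithmetic.

```python
from fractions import Fraction

# Neutrosophic numbers a + b*I, represented as pairs (a, b), with I*I = I.
def nmul(x, y):
    (a, b), (c, d) = x, y
    return (a * c, a * d + b * c + b * d)

def ninv(x):
    # a + bI is invertible iff a != 0 and a + b != 0; then
    # (a + bI)^(-1) = 1/a - (b / (a * (a + b))) I.
    a, b = x
    if a == 0 or a + b == 0:
        raise ValueError("not invertible")
    return (1 / a, -b / (a * (a + b)))

x = (Fraction(2), Fraction(3))
print(nmul(x, ninv(x)))  # (Fraction(1, 1), Fraction(0, 1)): the unit 1 + 0*I
```

Exact rationals are used so the product comes out as exactly 1 + 0I; with floats the I-coefficient would only vanish up to rounding error.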
This paper is concerned with the construction of theories of software systems yielding adequate predictions of their target systems’ computations. It is first argued that mathematical theories of programs are not able to provide predictions that are consistent with observed executions. Empirical theories of software systems are here introduced semantically, in terms of a hierarchy of computational models that are supplied by formal methods and testing techniques in computer science. Both deductive top-down and inductive bottom-up approaches to the discovery of semantic software theories are rejected, in favour of the abductive process of hypothesising and refining models at each level of the hierarchy until they become satisfactorily predictive. Empirical theories of computational systems are required to be modular, as most software verification and testing activities are modular. We argue that logical relations must thereby be defined among models representing different modules in a semantic theory of a modular software system. We exclude that scientific structuralism is able to define the module relations needed in software modular theories. The algebraic Theory of Institutions is finally introduced to specify the logical structure of modular semantic theories of computational systems.
An endomorphism on an algebra A is said to be strong if it is compatible with every congruence on A; and A is said to have the strong endomorphism kernel property if every congruence on A, other than the universal congruence, is the kernel of a strong endomorphism on A. Here we characterise the structure of those double MS-algebras that have this property by way of Priestley duality.
A study of the making of George Peacock's highly influential, yet disturbingly split, 1830 account of algebra as an entanglement of two separate undertakings: arithmetical and symbolical or formal.
By the middle of the seventeenth century we find that algebra is able to offer proofs in its own right. That is, by that time algebraic argument had achieved the status of proof. How did this transformation come about?
In this paper, a formal theory is presented that describes syntactic and semantic mechanisms of philosophical discourses. They are treated as peculiar language systems possessing deep derivational structures called architectonic forms of philosophical systems, encoded in the philosophical mind. Architectonic forms are constituents of more complex structures called architectonic spaces of philosophy. They are understood as formal and algorithmic representations of various philosophical traditions. The formal derivational machinery of a given space determines its class of all possible architectonic forms. Some of them underlie actual historical philosophical systems, and they organize processes of doing philosophy within these systems. Many architectonic forms have never been realized in the history of philosophy. The presented theory may be interpreted as falling under Hegel’s paradigm of comprehending cultural texts. This paradigm is enriched and inspired by Propp’s formal, morphological view on texts. The peculiarity of this modification of the Hegel-Propp paradigm consists in the use of algebraic and algorithmic tools for modeling processes of cultural development. To speak metaphorically, the theory is an attempt at a mathematical and logical history of philosophy inspired by the Internet metaphor. And that is why it belongs to the tradition of doing metaphilosophy in the Lvov-Warsaw School, which is continued today mainly by Woleński, Pelc, Perzanowski, and Jadacki.
We review our approach to quantum mechanics, adding also some new interesting results. We start by giving proof of two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. This last algebra gives proof of the von Neumann basic postulates on quantum measurement, thus explaining in an algebraic manner the wave function collapse postulated in standard quantum theory. In this manner we reach the objective of exposing a self-consistent version of quantum mechanics. In detail, we realize a bare-bones skeleton of quantum mechanics, recovering all the basic foundations of this theory in an algebraic framework. We give proof of the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition, we demonstrate the well-known phenomenon of quantum Mach–Zehnder interference using the same algebraic framework, and we give an algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the problem of time evolution of quantum systems, as well as changes in space location, in momentum, and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics by using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics. In detail, we give a demonstration of the Kochen–Specker theorem, and we also give an algebraic formulation and explanation of the EPR paradox using only the Clifford algebra. By using the same approach we also derive Bell's inequalities. Our formulation is strongly based on the use of the idempotents that are contained in the Clifford algebra.
Their counterpart in quantum mechanics is represented by the projection operators that, as is well known, are interpreted as logical statements, following von Neumann's basic results. Von Neumann realized a matrix logic on the basis of quantum mechanics. Using the Clifford algebra we are able to invert this result. Following the results previously obtained by Orlov in 1994, we are able to give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in the logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory in the sense that it looks from one side to “matter per se”, thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: there are stages of our reality in which we can no longer separate the logic (and thus cognition and conceptual entities) from the features of “matter per se”. In quantum mechanics the logic, and thus cognition and conceptual-cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality in which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link is established from the start in this theory between physics and conceptual entities. Finally, in this approach there is no absolute definition of logical truths. Transformations, and thus “redefinitions”, of truth values are permitted in this scheme, as the well-established invariance principles clearly indicate.
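The central role of idempotents as projection operators can be illustrated with a minimal Pauli-matrix realisation of a Clifford algebra in plain Python. This is an illustrative sketch only, not the authors' A(S_i) construction: the three generators square to the identity and pairwise anticommute, and P = (1 + e3)/2 is idempotent, the algebraic counterpart of a projection operator interpreted as a logical statement.

```python
# 2x2 complex matrices as nested lists; the Pauli matrices realise three
# anticommuting Clifford generators (an illustrative choice of basis).
def mmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def madd(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def scal(c, A):
    return [[c * A[i][j] for j in range(2)] for i in range(2)]

I2 = [[1, 0], [0, 1]]
e1 = [[0, 1], [1, 0]]
e2 = [[0, -1j], [1j, 0]]
e3 = [[1, 0], [0, -1]]

# Generators square to the identity and pairwise anticommute.
assert mmul(e1, e1) == I2
assert madd(mmul(e1, e2), mmul(e2, e1)) == [[0, 0], [0, 0]]

# P = (1 + e3)/2 is an idempotent: P * P = P (a projection operator).
P = scal(0.5, madd(I2, e3))
assert mmul(P, P) == P
```

The complementary idempotent Q = (1 − e3)/2 satisfies Q² = Q and PQ = 0, mirroring a pair of mutually exclusive logical statements.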
In the study of modal and nonclassical logics, translations have frequently been employed as a way of measuring the inferential capabilities of a logic. It is sometimes claimed that two logics are “notational variants” if they are translationally equivalent. However, we will show that this cannot be quite right, since first-order logic and propositional logic are translationally equivalent. Others have claimed that for two logics to be notational variants, they must at least be compositionally intertranslatable. The definition of compositionality these accounts use, however, is too strong, as the standard translation from modal logic to first-order logic is not compositional in this sense. In light of this, we will explore a weaker version of this notion that we will call schematicity and show that there is no schematic translation either from first-order logic to propositional logic or from intuitionistic logic to classical logic.
In 2001, Carnielli and Marcos considered a 3-valued logic in order to prove that the schema ϕ ∨ (ϕ → ψ) is not a theorem of da Costa’s logic Cω. In 2006, this logic was studied (and baptized) as G'3 by Osorio et al. as a tool to define semantics of logic programming. It is known that the truth-tables of G'3 have the same expressive power as those of Łukasiewicz 3-valued logic and of Gödel 3-valued logic G3. From this, the three logics coincide up to language, taking into account that 1 is the only designated truth-value in these logics.

From the algebraic point of view, Canals-Frau and Figallo have studied the 3-valued modal implicative semilattices, where the modal operator is the well-known Moisil-Monteiro-Baaz Δ operator, and the supremum is definable from it. We prove that the subvariety obtained from this by adding a bottom element 0 is term-equivalent to the variety generated by the 3-valued algebra of G'3. The algebras of that variety are called G'3-algebras. From this result, we obtain the equations which axiomatize the variety of G'3-algebras. Moreover, we prove that this variety is semisimple, and that the 3-element and the 2-element chains are the only simple algebras of the variety. Finally, an extension of G'3 to first-order languages is presented, with an algebraic semantics based on complete G'3-algebras. The corresponding soundness and completeness theorems are obtained.
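The Carnielli–Marcos observation can be checked mechanically by brute force over three truth-values. The sketch below assumes Gödel-style implication and max-disjunction, consistent with the expressive equivalence to G3 noted above, with 1 as the only designated value; it is a plausibility check, not a transcription of the paper's exact tables.

```python
# Truth-values 0 < 0.5 < 1, with 1 the only designated value.
VALS = (0, 0.5, 1)

def imp(x, y):
    # Goedel-style implication (an assumption): 1 if x <= y, else y.
    return 1 if x <= y else y

def disj(x, y):
    return max(x, y)

# Search for valuations falsifying the schema  phi v (phi -> psi).
counterexamples = [(p, q) for p in VALS for q in VALS
                   if disj(p, imp(p, q)) != 1]
print(counterexamples)  # -> [(0.5, 0)]: the schema is not a tautology here
```

The single falsifying valuation assigns ϕ the intermediate value and ψ the bottom value, so ϕ → ψ evaluates to 0 and the disjunction to 0.5, which is not designated.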
We indicate a new way toward the solution of the problem of quantum measurement. In past papers we used the well-known formalism of the density matrix, adopting an algebraic approach in a two-state quantum spin system S and considering the particular case of three anticommuting elements. We demonstrated that, during the wave collapse, we have a transition from the standard Clifford algebra, structured in its space and metrics, to the new spatial structure of the Clifford dihedral algebra. This structured geometric transition, which occurs during the interaction of the S system with the macroscopic measurement system M, causes the destruction of the interferential factors. In the present paper we construct a detailed model of the (S+M) interaction, evidencing the particular role of time ordering in the (S+M) coupling, since we have a time-asymmetric interaction. We demonstrate that, during the measurement, the physical circumstance that the fermion creation and annihilation operators of the S system must be destroyed during such interaction has a fundamental role.
We review a rough scheme of quantum mechanics using the Clifford algebra. Following the steps previously published in a paper by another author [31], we demonstrate that quantum interference arises in a Clifford algebraic formulation of quantum mechanics. In 1932 J. von Neumann showed that projection operators and, in particular, quantum density matrices can be interpreted as logical statements. In accord with a result previously obtained by V. F. Orlov, in this paper we invert von Neumann’s result. Instead of constructing logic from quantum mechanics, we construct quantum mechanics from an extended classical logic. It follows that the origins of the two most fundamental quantum phenomena, indeterminism and the interference of probabilities, lie not in traditional physics by itself but in the logical structure, as realized here by the Clifford algebra.
A model of consciousness and conscious experience is introduced. Starting with non-Lipschitz chaotic dynamics of neural activity, we propose that the synaptic transmission between adjacent as well as distant neurons should be regulated in brain dynamics through quantum tunneling. Further, based on various studies by different previous authors, we consider the emergence of a very large quantum mechanical system, representable by an abstract quantum net entirely based on quantum-like entities having in particular the important feature of expressing self-reference, similar to what occurs in consciousness. The properties of such quantum-like mind entities are discussed in detail. A quantum-like model of conscious experience is also discussed. It is shown that such quantum mechanical entities are able to arrange themselves alternatively on the basis of the subject's story, memory, and pain-pleasure in response to an external stimulus, thus giving the subject the possibility to respond to the stimulus on the basis of his emotional as well as cognitive state. Finally, we discuss the possible connections between the quantum-like model introduced in this paper and the chaotic behaviors often identified experimentally in studies on brain dynamics. Part I of this article contains: Introduction; 1. Non-Lipschitz Terminal Dynamics of Single Neuron Activity; 2. Quantum Mechanical Properties of Neuron Dynamics; 3. A Quantum Model of Consciousness I; and References.
Among Dunn’s many important contributions to relevance logic was his work on the system RM (R-mingle). Although RM is an interesting system in its own right, it is widely considered to be too strong. In this chapter, I revisit a closely related system, RM0 (sometimes known as ‘constructive mingle’), which includes the mingle axiom while not degenerating in the way that RM itself does. My main interest will be in examining this logic from two related semantical perspectives. First, I give a purely operational bisemilattice semantics for it by adapting previous work of Humberstone. Second, I examine a more conventional algebraic semantics for it and discuss how this relates to the operational semantics. A novel operational semantics for J (intuitionistic logic) as well as its conventional Heyting algebraic semantics emerge as special cases of the corresponding semantics for RM0. The results of this chapter suggest that RM0 is a more interesting logic than has been appreciated and that Humberstone’s operational semantic framework similarly deserves more attention than it has received.
Modifying the descriptive and theoretical generalizations of Relativized Minimality, we argue that a significant subset of weak island violations arise when an extracted phrase should scope over some intervener but is unable to. Harmless interveners seem harmless because they can support an alternative reading. This paper focuses on why certain wh-phrases are poor wide scope takers, and offers an algebraic perspective on scope interaction. Each scopal element SE is associated with certain operations (e.g., not with complements). When a wh-phrase scopes over some SE, the operations associated with that SE are performed in its denotation domain. The requisite operations may or may not be available in a domain, however. We present an empirical analysis of a variety of wh-phrases. It is argued that the wh-phrases that escape all weak islands (i.e., can scope over any intervener) are those that range over individuals, the reason being that all Boolean operations are defined for their domain. Collectives, manners, amounts, numbers, etc. all denote in domains with fewer operations and are thus selectively sensitive to scopal interveners—a “semantic relativized minimality effect”.
The human attempts to access, measure and organize physical phenomena have led to a manifold construction of mathematical and physical spaces. We will survey the evolution of geometries from Euclid to the Algebraic Geometry of the 20th century. The role of Persian/Arabic Algebra in this transition and its Western symbolic development is emphasized. In this relation, we will also discuss changes in the ontological attitudes toward mathematics and its applications. Historically, the encounter of geometric and algebraic perspectives enriched the mathematical practices and their foundations. Yet, the collapse of Euclidean certitudes, of over 2300 years, and the crisis in the mathematical analysis of the 19th century, led to the exclusion of “geometric judgments” from the foundations of Mathematics. After the success and the limits of the logico-formal analysis, it is necessary to broaden our foundational tools and re-examine the interactions with natural sciences. In particular, the way the geometric and algebraic approaches organize knowledge is analyzed as a cross-disciplinary and cross-cultural issue and will be examined in Mathematical Physics and Biology. We finally discuss how the current notions of mathematical (phase) “space” should be revisited for the purposes of life sciences.
Intra-molecular connectivity (that is, chemical structure) does not emerge from computations based on fundamental quantum-mechanical principles. In order to compute molecular electronic energies (of C3H4 hydrocarbons, for instance) quantum chemists must insert intra-molecular connectivity “by hand.” Some take this as an indication that chemistry cannot be reduced to physics; others consider it as evidence that quantum chemistry needs new logical foundations. Such discussions are generally synchronic rather than diachronic; that is, they neglect ‘historical’ aspects. However, systems of interest to chemists generally are metastable. In many cases chemical systems of a given elemental composition may exist in any one of several different metastable states depending on the history of the system. Molecular structure generally depends on contingent historical circumstances of synthesis and separation, rather than solely or mainly on relative energies of alternative stable states, those energies in turn determined by relationships among components. Chemical structure is usually ‘kinetically-determined’ rather than ‘thermodynamically-determined.’ For instance, cyclical hydrocarbon ring-systems (as in cyclopropene) are produced only in special circumstances. Adequate theoretical treatments must take account of the persistent effects of such contingent historical events whenever they are relevant, as they generally are in chemistry.
Recent work in philosophy of language has raised significant problems for the traditional theory of propositions, engendering serious skepticism about its general workability. These problems are, I believe, tied to fundamental misconceptions about how the theory should be developed. The goal of this paper is to show how to develop the traditional theory in a way which solves the problems and puts this skepticism to rest. The problems fall into two groups. The first has to do with reductionism, specifically attempts to reduce propositions to extensional entities: either extensional functions or sets. The second group concerns problems of fine-grained content: both traditional 'Cicero'/'Tully' puzzles and recent variations on them which confront scientific essentialism. After characterizing the problems, I outline a non-reductionist approach, the algebraic approach, which avoids the problems associated with reductionism. I then go on to show how the theory can incorporate non-Platonic (as well as Platonic) modes of presentation. When these are implemented nondescriptively, they yield the sort of fine-grained distinctions which have been eluding us. The paper closes by applying the theory to a cluster of remaining puzzles, including a pair of new puzzles facing scientific essentialism.
The concept and usage of the word 'metric' within General Relativity is briefly described. The early work of Roy Kerr led to his original 1963 algebraic, rotating metric. This discovery and his subsequent recollection in 2008 are summarised as the motivation for this article. Computer algebra has confirmed that nominal transformations of this early metric can generate further natural algebraic metrics. The algebra is not abstract, nor advanced, and these metrics have been overlooked for many years. The 1916 metric due to Schwarzschild misled Kerr into seeking a similar metric for rotation. The philosophy of astrophysics for Penrose, Hawking and others would have been very different had they used the two lost metrics.
In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as Gelfand representation, guarantees that autonomously specified algebras of quantities can be interpreted as algebras of observables on a suitable state space. Such an interpretation is then used to support (i) a realist conception of quantities as objective characteristics of natural systems, and (ii) a realist conception of measurement results (evaluations of quantities) as determined by and descriptive of the states of a target natural system. As a way of motivating the analytic approach to measurement, we begin with a discussion of some serious philosophical and theoretical problems facing the well-known representational theory of measurement. We then explain why we consider the analytic approach, which avoids all these problems, to be far more attractive on both philosophical and theoretical grounds.
What does it mean to ‘give’ the value of a variable in an algebraic context, and how does giving the value of a variable differ from merely describing it? I argue that to answer this question, we need to examine the role that giving the value of a variable plays in problem-solving practice. I argue that four different features are required for a statement to count as giving the value of a variable in the context of solving an elementary algebra problem: the variable must be in the scope opened by the problem statement; the values given must be in the range of the variable, which is determined by the problem; the statement giving the values must represent a complete solution; and it must be in a canonical form. This account helps us better understand elementary algebra itself, as well as the use of algebraic tools to analyze phenomena in natural language.
The paper concentrates on the problem of adequate reflection of fragments of reality via expressions of language and inter-subjective knowledge about these fragments, called here, in brief, language adequacy. This problem is formulated in several aspects, the most important being the compatibility of language syntax with its bi-level semantics: intensional and extensional. In this paper, various aspects of language adequacy find their logical explication on the ground of the formal-logical theory T of any categorial language L generated by the so-called classical categorial grammar, and also on the ground of its extension to the bi-level, intensional and extensional semantic-pragmatic theory ST for L. In T, according to the token-type distinction of Ch. S. Peirce, L is characterized first as a language of well-formed expression-tokens (wfe-tokens), which are material, concrete objects, and then as a language of wfe-types, which are abstract objects, classes of wfe-tokens. In ST, the semantic-pragmatic notions of meaning and interpretation for wfe-types of L of intensional semantics, and the notion of denotation of extensional semantics for wfe-types and constituents of knowledge, are formalized. These notions allow formulating a postulate (an axiom of categorial adequacy) from which follow all the most important conditions of language adequacy, including the above, and a structural one connected with three principles of compositionality.
Logics based on weak Kleene algebra (WKA) and related structures have been recently proposed as a tool for reasoning about flaws in computer programs. The key element of this proposal is the presence, in WKA and related structures, of a non-classical truth-value that is “contaminating” in the sense that whenever the value is assigned to a formula ϕ, any complex formula in which ϕ appears is assigned that value as well. Under such interpretations, the contaminating states represent occurrences of a flaw. However, since different programs and machines can interact with (or be nested into) one another, we need to account for different kinds of errors, and this calls for an evaluation of systems with multiple contaminating values. In this paper, we make steps toward these evaluation systems by considering two logics, HYB1 and HYB2, whose semantic interpretations account for two contaminating values besides the classical values 0 and 1. In particular, we provide two main formal contributions. First, we give a characterization of their relations of (multiple-conclusion) logical consequence—that is, necessary and sufficient conditions for a set Δ of formulas to logically follow from a set Γ of formulas in HYB1 or HYB2. Second, we provide sound and complete sequent calculi for the two logics.
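The contamination idea is easy to state operationally. The sketch below evaluates weak-Kleene connectives over the classical values 0 and 1 plus two contaminating values e1 and e2, using an arbitrary dominance rule (e2 absorbs e1) chosen purely for illustration; the actual interaction of the two contaminating values in HYB1 and HYB2 is the subject of the paper, not what this fragment encodes.

```python
# Classical values 0, 1 plus two contaminating values 'e1', 'e2'.
# Illustrative dominance rule only: 'e2' absorbs 'e1' when both occur.
CONTAM = {'e1': 1, 'e2': 2}

def contaminate(*args):
    """Return the dominant contaminating value among args, or None."""
    cs = [v for v in args if v in CONTAM]
    return max(cs, key=CONTAM.get) if cs else None

def wk_and(x, y):
    c = contaminate(x, y)
    return c if c is not None else min(x, y)

def wk_or(x, y):
    c = contaminate(x, y)
    return c if c is not None else max(x, y)

def wk_not(x):
    return x if x in CONTAM else 1 - x

# Contamination: even a true disjunct cannot rescue the formula.
assert wk_or(1, 'e1') == 'e1'      # 1 v e1 = e1, unlike strong Kleene
assert wk_and('e1', 'e2') == 'e2'  # dominance rule (an assumption)
```

With a single contaminating value this reduces to the standard weak Kleene tables; the dominance rule is one simple way to make two flaws interact deterministically.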