The independence phenomenon in set theory, while pervasive, can be partially addressed through the use of large cardinal axioms. A commonly assumed idea is that large cardinal axioms are species of maximality principles. In this paper, I argue that whether or not large cardinal axioms count as maximality principles depends on prior commitments concerning the richness of the subset forming operation. In particular I argue that there is a conception of maximality through absoluteness, on which large cardinal axioms are restrictive. I argue, however, that large cardinals are still important axioms of set theory and can play many of their usual foundational roles.
The Hyperuniverse Programme, introduced in Arrigoni and Friedman (2013), fosters the search for new set-theoretic axioms. In this paper, we present the procedure envisaged by the programme to find new axioms and the conceptual framework behind it. The procedure comes in several steps. Intrinsically motivated axioms are those statements which are suggested by the standard concept of set, i.e. the 'maximal iterative concept', and the programme identifies higher-order statements motivated by the maximal iterative concept. The satisfaction of these statements (H-axioms) in countable transitive models, the collection of which constitutes the 'hyperuniverse' (H), has remarkable first-order consequences, some of which we review in section 5.
In this article I develop an elementary system of axioms for Euclidean geometry. On one hand, the system is based on the symmetry principles which express our a priori ignorance of space: all places are the same to us, all directions are the same to us, and all units of length we use to create geometric figures are the same to us. On the other hand, through a process of algebraic simplification, this system of axioms leads directly to Weyl's system of axioms for Euclidean geometry. The system of axioms, together with its a priori interpretation, offers new views on the philosophy and pedagogy of mathematics: it supports the thesis that Euclidean geometry is a priori; it supports the thesis that in modern mathematics Weyl's system of axioms is dominant over Euclid's system because it reflects the underlying a priori symmetries; and it gives a new and promising approach to learning geometry which, through Weyl's system of axioms, leads from the essential geometric symmetry principles of a mathematical nature directly to modern mathematics.
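For orientation (a standard presentation of Weyl's system, not drawn from the article; the labels (W1)-(W3) are mine), Euclidean geometry in Weyl's style is axiomatized as a set of points acted on by a real inner-product vector space:
\[
\begin{aligned}
&\text{Primitives: a set of points } \mathcal{P},\ \text{a real vector space } V \text{ with inner product } \langle\cdot,\cdot\rangle,\ \text{and an action } +\colon \mathcal{P}\times V \to \mathcal{P}.\\
&\text{(W1)}\quad A + \mathbf{0} = A
\qquad
\text{(W2)}\quad (A + u) + v = A + (u + v)\\
&\text{(W3)}\quad \text{for all } A, B \in \mathcal{P} \text{ there is exactly one } v \in V \text{ with } A + v = B\ \ (\text{written } v = B - A)\\
&\text{Distance: } d(A,B) = \sqrt{\langle B - A,\ B - A\rangle}.
\end{aligned}
\]
The symmetry reading is that the axioms never privilege a particular point, direction, or unit of length.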
In the early 1900s, Russell began to recognize that he, and many other mathematicians, had been using assertions like the Axiom of Choice implicitly, and without explicitly proving them. In working with the Axioms of Choice, Infinity, and Reducibility, and his and Whitehead's Multiplicative Axiom, Russell came to take the position that some axioms are necessary to recovering certain results of mathematics, but may not be proven to be true absolutely. The essay traces historical roots of, and motivations for, Russell's method of analysis, which are intended to shed light on his view about the status of mathematical axioms. I describe the position Russell develops in consequence as "immanent logicism," in contrast to what Irving (1989) describes as "epistemic logicism." Immanent logicism allows Russell to avoid the logocentric predicament, and to propose a method for discovering structural relationships of dependence within mathematical theories.
A description of consciousness leads to a contradiction with the postulate of special relativity that there can be no connections between simultaneous events. This contradiction points to consciousness involving quantum-level mechanisms. The quantum-level description of the universe is re-evaluated in the light of what is observed in consciousness, namely 4-dimensional objects. A new, improved interpretation of quantum-level observations is introduced. From this vantage point the following axioms of consciousness are presented. Consciousness consists of two distinct components, the observed U and the observer I. The observed U consists of all the events I is aware of. A vast majority of these occur simultaneously. Now if I were an entity within the space-time continuum, all of these events of U, together with I, would have to occur at one point in space-time. However, U is distributed over a definite region of space-time (a region in the brain). Thus, I is aware of a multitude of space-like separated events. It is seen that this awareness necessitates that I be an entity outside the space-time continuum. With I taken as such, a new concept, called Concept A, is introduced. With the help of Concept A a very important axiom of consciousness, namely Free Will, is explained. Libet's experiment, which was originally seen to contradict Free Will, is shown in the light of Concept A to support it. A variation of Libet's experiment is suggested that would give conclusive proof for Concept A and Free Will.
We present an elementary system of axioms for the geometry of Minkowski spacetime. It strikes a balance between a simple and streamlined set of axioms and the attempt to give a direct formalization in first-order logic of the standard account of Minkowski spacetime in [Maudlin 2012] and [Malament, unpublished]. It is intended for future use in the formalization of physical theories in Minkowski spacetime. The choice of primitives is in the spirit of [Tarski 1959]: a predicate of betweenness and a four-place predicate to compare the squares of the relativistic intervals. Minkowski spacetime is described as a four-dimensional 'vector space' that can be decomposed everywhere into a spacelike hyperplane - which obeys the Euclidean axioms in [Tarski and Givant, 1999] - and an orthogonal timelike line. The lengths of other 'vectors' are calculated according to Pythagoras' theorem. We conclude with a Representation Theorem relating models of our system that satisfy second-order continuity to the mathematical structure called 'Minkowski spacetime' in physics textbooks.
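For reference (standard textbook notation with the (+,-,-,-) signature convention, not the paper's own primitives), the squared relativistic interval that a four-place comparison predicate is meant to track is
\[
I(x,y) \;=\; (x^{0}-y^{0})^{2} - (x^{1}-y^{1})^{2} - (x^{2}-y^{2})^{2} - (x^{3}-y^{3})^{2},
\]
so a four-place predicate \(\mathrm{Int}(x,y,z,w)\) can be read as \(I(x,y) \le I(z,w)\), and the spacelike/timelike distinction corresponds to the sign of \(I\).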
In quantum theory every state can be diagonalized, i.e. decomposed as a convex combination of perfectly distinguishable pure states. This elementary structure plays a ubiquitous role in quantum mechanics, quantum information theory, and quantum statistical mechanics, where it provides the foundation for the notions of majorization and entropy. A natural question then arises: can we reconstruct these notions from purely operational axioms? We address this question in the framework of general probabilistic theories, presenting a set of axioms that guarantee that every state can be diagonalized. The first axiom is Causality, which ensures that the marginal of a bipartite state is well defined. Then, Purity Preservation states that the set of pure transformations is closed under composition. The third axiom is Purification, which allows one to assign a pure state to the composition of a system with its environment. Finally, we introduce the axiom of Pure Sharpness, stating that for every system there exists at least one pure effect occurring with unit probability on some state. For theories satisfying our four axioms, we show a constructive algorithm for diagonalizing every given state. The diagonalization result allows us to formulate a majorization criterion that captures the convertibility of states in the operational resource theory of purity, where random reversible transformations are regarded as free operations.
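In the quantum case, the diagonalization in question is just the spectral decomposition of a density operator (a standard fact, stated here in textbook notation rather than the paper's operational language):
\[
\rho \;=\; \sum_{i} p_{i}\, |\psi_i\rangle\langle\psi_i|, \qquad p_i \ge 0,\quad \sum_i p_i = 1,\quad \langle\psi_i|\psi_j\rangle = \delta_{ij},
\]
where the orthonormal pure states \(|\psi_i\rangle\) are perfectly distinguishable and the eigenvalues \(p_i\) are the probability weights to which majorization and entropy are applied.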
In this article, a possible generalization of Löb's theorem is considered. The main result is: let κ be an inaccessible cardinal; then ¬Con(ZFC + ∃κ).
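For background, the classical theorem being generalized can be stated as follows (a standard fact, not the article's generalization): for a theory \(T\) with provability predicate \(\mathrm{Prov}_T\),
\[
\text{if } T \vdash \mathrm{Prov}_T(\ulcorner P \urcorner) \rightarrow P, \text{ then } T \vdash P,
\]
or, in provability logic, \(\vdash \Box(\Box P \rightarrow P) \rightarrow \Box P\). The article's claimed result is then an incompatibility between a generalization of this schema and ZFC together with the existence of an inaccessible cardinal.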
The systems of arithmetic discussed in this work are non-elementary theories. In this paper, natural numbers are characterized axiomatically in two different ways. We begin by recalling the classical set P of axioms of Peano's arithmetic of natural numbers proposed in 1889 (including such primitive notions as: set of natural numbers, zero, successor of a natural number) and compare it with the set W of axioms for this arithmetic (including primitive notions like: set of natural numbers and the relation of inequality) proposed by Witold Wilkosz, a Polish logician, philosopher and mathematician, in 1932. The axioms W are those of ordered sets without a largest element, in which every non-empty set has a least element, and every set bounded from above has a greatest element. We show that P and W are equivalent and also that the systems of arithmetic based on W or on P are categorical and consistent. There follows a set of intuitive axioms PI of the arithmetic of integers, modelled on P and proposed by B. Iwanuś, as well as a set of axioms WI of this arithmetic, modelled on the W axioms, PI and WI being also equivalent, categorical and consistent. We also discuss the problem of the independence of sets of axioms, which was dealt with earlier.
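For comparison (a standard presentation, not quoted from the paper), the Peano-style axioms P over the primitives \(\mathbb{N}\), \(0\) and successor \(S\) can be sketched as:
\[
\begin{aligned}
&\text{(P1)}\ \ 0 \in \mathbb{N} \qquad
\text{(P2)}\ \ n \in \mathbb{N} \Rightarrow S(n) \in \mathbb{N} \qquad
\text{(P3)}\ \ S(n) \neq 0 \qquad
\text{(P4)}\ \ S(m) = S(n) \Rightarrow m = n\\
&\text{(P5)}\ \ \text{for every set } X:\ \bigl(0 \in X \wedge \forall n\,(n \in X \Rightarrow S(n) \in X)\bigr) \Rightarrow \mathbb{N} \subseteq X
\end{aligned}
\]
The Wilkosz-style axioms W, by contrast, use only the ordering: no largest element, every non-empty subset has a least element, and every subset bounded from above has a greatest element.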
The idea of science being the best - or the only - way to reach the truth about our cosmos has been a major belief of modern civilization. Yet, science has grown tall on fragile legs of clay. Every scientific theory uses axioms and assumptions that by definition cannot be proved. This poses a serious limitation to the use of science as a tool to find the truth. The only way to search for the latter is to redefine the former to its original glory. In the days well before Galileo and Newton, science and religion were not separated. They worked together to discover the truth, and while the latter had God as its final destination, the former had God as its starting point. Science is based on the irrational (unproven) belief that the world is intelligible, along with many other assumptions. This poses a serious limitation to science that can only be overcome if we accept the irrationality of the cosmos. The motto "Credo quia absurdum" holds more truth than one can ever realize at first glance. There is nothing logical in logic, whereas there is deep wisdom in the irrational. For while the former tries to build castles on moving sand, the latter digs deep inside the depths of existence itself in order to build on the most concrete foundations that there can be: the cosmos itself. The only way forward is backwards. Backwards to a time when religion led the quest for knowledge by accepting what we cannot know, rather than trying to comprehend what we do not. Science was anyway based on that in the first place.
“I am me”, but what does this mean? For centuries humans identified themselves as conscious beings with free will, beings that are important in the cosmos they live in. However, modern science has been trying to reduce us to unimportant pawns in a cold universe and to diminish our sense of consciousness into a mere illusion generated by lifeless matter. Our identity in the cosmos is nothing more than a deception, and all the scientific evidence seems to support this idea. Or does it? The goal of this paper is to discard the current underlying dogmatism (axioms taken for granted as "self-evident") of modern mind research and to show that consciousness seems to be the ultimate frontier that will cause a major change in the way the exact sciences think. If we want to re-discover our identity as luminous beings in the cosmos, we must first try to pinpoint our prejudices and discard them. Materialism is an obsolete philosophical dogma, and modern scientists should try to use other premises as well as the foundation of their theories to approach the mysteries of the self. The exact sciences need to examine the world with a more open mind, accepting potentially different interpretations of existing experimental data in the fields of brain research, which are currently not considered simply on the basis of a strong anti-spiritual dogmatism. Such interpretations can be compatible with the notion of an immaterial spirit proposed by religion for thousands of years. Mind, it seems, is not the by-product of matter, but the opposite: its master. No current materialistic theory can explain how matter may give rise to what we call the "self", and only a drastic paradigm shift towards more idealistic theories will help us avoid rejecting our own nature.
This paper is about Poincaré’s view of the foundations of geometry. According to the established view, which has been inherited from the logical positivists, Poincaré, like Hilbert, held that axioms in geometry are schemata that provide implicit definitions of geometric terms, a view he expresses by stating that the axioms of geometry are “definitions in disguise.” I argue that this view does not accord well with Poincaré’s core commitment in the philosophy of geometry: the view that geometry is the study of groups of operations. In place of the established view I offer a revised view, according to which Poincaré held that axioms in geometry are in fact assertions about invariants of groups. Groups, as forms of the understanding, are prior in conception to the objects of geometry and afford the proper definition of those objects, according to Poincaré. Poincaré’s view therefore contrasts sharply with Kant’s foundation of geometry in a unique form of sensibility. According to my interpretation, axioms are not definitions in disguise because they themselves implicitly define their terms, but rather because they disguise the definitions which imply them.
The aim of this paper is to show that topology has a bearing on Leibniz’s Principle of the Identity of Indiscernibles (PII). According to (PII), if, for all properties F, an object a has property F iff object b has property F, then a and b are identical. If any property F whatsoever is permitted in (PII), then Leibniz’s principle is trivial, as is shown by “identity properties”. The aim of this paper is to show that topology can make a contribution to the problem of giving criteria for how to restrict the domain of properties so as to render (PII) non-trivial. In topology a wealth of different Leibnizian principles of identity can be defined: (PII) turns out to be just the weakest topological separation axiom T0 in disguise, and stronger principles can be defined with the aid of the higher separation axioms Ti, i > 0. Topologically defined properties have a variety of nice features; in particular, they are stable in a natural sense. Topologically defined properties do not have a monopoly on defining “good” properties. In the final section of the paper it is shown that the topological approach is closely related to Gärdenfors’s approach of conceptual spaces based on the concept of convexity.
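For readers without the topological background, the lowest separation axioms can be stated as follows (standard definitions, not the paper's formulations), for a space with topology \(\tau\):
\[
\begin{aligned}
\text{(T}_0\text{)}\quad & \forall x, y\ \bigl(x \neq y \;\Rightarrow\; \exists U \in \tau\ \text{containing exactly one of } x, y\bigr)\\
\text{(T}_1\text{)}\quad & \forall x, y\ \bigl(x \neq y \;\Rightarrow\; \exists U \in \tau\ (x \in U \wedge y \notin U)\bigr)
\end{aligned}
\]
Read contrapositively, T0 says that points lying in exactly the same open sets are identical, which is (PII) restricted to open-set membership properties; T1 and the higher Ti yield correspondingly stronger Leibnizian principles.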
For centuries the case of Galileo Galilei has been the cornerstone of every major argument against the church and its supposedly unscientific dogmatism. The church seems to have condemned Galileo for his heresies, just because it couldn’t and wouldn’t handle the truth. Galileo was a hero of science wrongfully accused, and now - at last - everyone knows that. But is that true? This paper tries to examine the case from the point of view of modern physics, and the conclusions drawn are startling. It seems that the contemporary church was too hasty in condemning itself. The evidence provided by Galileo to support the heliocentric system does not even pass simple scrutiny, while modern physics has long ruled against both the heliocentric and the geocentric models as depictions of the “truth”. As Einstein eloquently said, the debate about which system is chosen is void of any meaning from the point of view of physics. In the end, the selection of the center is more a matter of choice than a matter of ‘truth’ of any kind. And this choice is driven by specific philosophical axioms which have penetrated astronomy for hundreds of years now. From Galileo to Hubble, the Copernican principle has slowly been transformed into a dogma followed by all mainstream astronomers. It is time to challenge our dogmatic adherence to the anti-humanist idea that we are insignificant in the cosmos and start doing truly honest science again, as Copernicus once postulated.
Distributive justice deals with allocations of goods and bads within a group. Different principles and results of distributions are seen as possible ideals. Often those normative approaches are framed solely verbally, which complicates their application to the different concrete distribution situations that are supposed to be evaluated with regard to justice. One possibility for framing this precisely, and for allowing a fine-grained evaluation of justice, lies in the formal modelling of these ideals by metrics. Choosing a metric that is supposed to capture a certain ideal has to be justified. Such justification might be given by demanding specific substantiated axioms which have to be met by a metric. This paper introduces such axioms for metrics of distributive justice, illustrated by the example of needs-based justice. Furthermore, some exemplary metrics of needs-based justice and a three-dimensional method for the visualisation of non-comparative justice axioms or evaluations are presented. This provides a basis for discussion of the evaluation and modelling of metrics of distributive justice.
The Continuum Hypothesis, formulated by Cantor in 1878, is one of the best-known conjectures of set theory. The related Continuum Problem was placed by Hilbert, in 1900, among the principal unsolved problems of mathematics. Following the proof of the independence of the Continuum Hypothesis from the axioms of set theory, the current status of the problem is controversial. In more recent years, the search for a solution to the Continuum Problem has also been one of the fundamental motivations for the search for new axioms in mathematics. The article provides an overview of the fundamental mathematical results and an analysis of some of the philosophical questions connected with the Continuum Problem.
The orthodox theory of instrumental rationality, expected utility (EU) theory, severely restricts the way in which risk-considerations can figure into a rational individual's preferences. It is argued here that this is because EU theory neglects an important component of instrumental rationality. This paper presents a more general theory of decision-making, risk-weighted expected utility (REU) theory, of which expected utility maximization is a special case. According to REU theory, the weight that each outcome gets in decision-making is not the subjective probability of that outcome; rather, the weight each outcome gets depends on both its subjective probability and its position in the gamble. Furthermore, the individual's utility function, her subjective probability function, and a function that measures her attitude towards risk can be separately derived from her preferences via a Representation Theorem. This theorem illuminates the role that each of these entities plays in preferences, and shows how REU theory explicates the components of instrumental rationality.
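As a reading aid, a rank-dependent formulation often used to present REU theory (my notation, not text from the paper): for a gamble g yielding outcomes \(x_1 \preceq \dots \preceq x_n\) with probabilities \(p_1, \dots, p_n\), and a risk function r with \(r(0)=0\), \(r(1)=1\), r non-decreasing,
\[
\mathrm{REU}(g) \;=\; u(x_1) \;+\; \sum_{i=2}^{n} r\!\Bigl(\sum_{j=i}^{n} p_j\Bigr)\,\bigl(u(x_i) - u(x_{i-1})\bigr),
\]
which reduces to ordinary expected utility exactly when \(r(p) = p\), making EU maximization the special case the abstract mentions.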
Speaking for God has been part of religion for many years. However, science has come in the past few years to question that role, or even our very ability to speak about God in general. My goal is to show that dogmatism, under any form, is wrong. And even though dogmatism had for a long time been associated with ill-intentioned religion, nowadays science has replaced religion on the throne of doctrinaire thinking. The point of the paper is to illustrate that one-way thinking is never correct - most of the time a combination of science and religion, measurements and theoretical thinking, logic and intuition, is required to draw a conclusion about the most important philosophical questions. The paper establishes that the exact sciences can be very useful, but they also have limits. The Religion-vs-Science problem is a pseudo-problem; logic and evidence can easily be used to defend theistic views. Both science and religion use common tools and methods and can be unified in a new way of thinking. This paper sets the foundations for how this can be achieved. The conclusion is that science and religion together complete our knowledge of the world, our understanding of humans and our purpose in life. Speaking about God is part of science as well as of religion. Only when we think of God as theologians and as scientists at the same time can we fully reach Him.
This paper presents and defends an argument that the continuum hypothesis is false, based on considerations about objective chance and an old theorem due to Banach and Kuratowski. More specifically, I argue that the probabilistic inductive methods standardly used in science presuppose that every proposition about the outcome of a chancy process has a certain chance between 0 and 1. I also argue in favour of the standard view that chances are countably additive. Since it is possible to randomly pick out a point on a continuum, for instance using a roulette wheel or by flipping a countable infinity of fair coins, it follows, given the axioms of ZFC, that there are many different cardinalities between countable infinity and the cardinality of the continuum.
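For reference (standard statements, not the paper's own wording): countable additivity is the requirement that chance distributes over countable disjoint unions,
\[
\mathrm{Ch}\Bigl(\bigcup_{n=1}^{\infty} A_n\Bigr) \;=\; \sum_{n=1}^{\infty} \mathrm{Ch}(A_n) \qquad \text{whenever } A_i \cap A_j = \emptyset \text{ for } i \neq j,
\]
and the Banach-Kuratowski theorem is the result that, assuming CH, no nontrivial countably additive measure vanishing on singletons can be defined on all subsets of a set of the cardinality of the continuum, which is what puts chance-assignments to arbitrary propositions in tension with CH.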
The paper studies a cluster of systems for fully disquotational truth based on the restriction of initial sequents. Unlike well-known alternative approaches, such systems display both a simple and intuitive model theory and remarkable proof-theoretic properties. We start by showing that, due to a strong form of invertibility of the truth rules, cut is eliminable in the systems via a standard strategy supplemented by a suitable measure of the number of applications of truth rules to formulas in derivations. Next, we notice that cut remains eliminable when suitable arithmetical axioms are added to the system. Finally, we establish a direct link between cut-free derivability in infinitary formulations of the systems considered and fixed-point semantics. Noticeably, unlike what happens with other background logics, such links are established without imposing any restriction to the premisses of the truth rules.
“De jure naturae multa fabulamur” — after 450 years, Luther's statement has lost none of its original validity. After a brief pseudo-renaissance following WWII, one now hears far less in legal theory about natural law, which appears finally to have fallen victim to what Weber early in the century characterized as “a progressive decomposition and relativization of all meta-legal axioms” — a destruction resulting partly “from legal rationalism itself,” and partly “from the skepticism which characterizes modern intellectual life generally.” Law today, wrote Weber, “is all too tangibly (in the great majority of its determinations, and especially in many which are particularly important in terms of principle) revealed to be both the product and the technical medium of a compromise of interests.”
Analysing several characteristic mathematical models: natural and real numbers, Euclidean geometry, group theory, and set theory, I argue that a mathematical model in its final form is a junction of a set of axioms and an internal partial interpretation of the corresponding language. It follows from the analysis that (i) mathematical objects do not exist in the external world: they are our internally imagined objects, some of which, at least approximately, we can realize or represent; (ii) mathematical truths are not truths about the external world but specifications (formulations) of mathematical conceptions; (iii) mathematics is first and foremost our imagined tool by which, with certain assumptions about its applicability, we explore nature and synthesize our rational cognition of it.
We give two social aggregation theorems under conditions of risk, one for constant population cases, the other an extension to variable populations. Intra- and interpersonal welfare comparisons are encoded in a single ‘individual preorder’. The theorems give axioms that uniquely determine a social preorder in terms of this individual preorder. The social preorders described by these theorems have features that may be considered characteristic of Harsanyi-style utilitarianism, such as indifference to ex ante and ex post equality. However, the theorems are also consistent with the rejection of all of the expected utility axioms, completeness, continuity, and independence, at both the individual and social levels. In that sense, expected utility is inessential to Harsanyi-style utilitarianism. In fact, the variable population theorem imposes only a mild constraint on the individual preorder, while the constant population theorem imposes no constraint at all. We then derive further results under the assumption of our basic axioms. First, the individual preorder satisfies the main expected utility axiom of strong independence if and only if the social preorder has a vector-valued expected total utility representation, covering Harsanyi’s utilitarian theorem as a special case. Second, stronger utilitarian-friendly assumptions, like Pareto or strong separability, are essentially equivalent to strong independence. Third, if the individual preorder satisfies a ‘local expected utility’ condition popular in non-expected utility theory, then the social preorder has a ‘local expected total utility’ representation. Fourth, a wide range of non-expected utility theories nevertheless lead to social preorders of outcomes that have been seen as canonically egalitarian, such as rank-dependent social preorders. Although our aggregation theorems are stated under conditions of risk, they are valid in more general frameworks for representing uncertainty or ambiguity.
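For reference (a standard formulation, not the paper's wording), the strong independence axiom mentioned here says that mixing with a common prospect never reverses a preference: for prospects A, B, C and any probability \(p \in (0,1]\),
\[
A \succsim B \iff p\,A + (1-p)\,C \;\succsim\; p\,B + (1-p)\,C .
\]
The paper's first further result ties this condition on the individual preorder to a vector-valued expected total utility representation of the social preorder.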
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
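As background (standard notions from this literature, illustrated in my notation rather than the authors' specific axioms): regularity demands that every non-empty event receive positive probability, something a real-valued countably additive function cannot deliver for a fair lottery on the natural numbers, whereas a hyperreal-valued function can:
\[
P(A) > 0 \ \text{ for every } A \neq \emptyset, \qquad \text{e.g. } P(\{n\}) = \varepsilon \ \text{ for every } n \in \mathbb{N}, \text{ with } \varepsilon > 0 \text{ infinitesimal}.
\]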
Basic Formal Ontology (BFO) is a top-level ontology used in hundreds of active projects in scientific and other domains. BFO has been selected to serve as top-level ontology in the Industrial Ontologies Foundry (IOF), an initiative to create a suite of ontologies to support digital manufacturing on the part of representatives from a number of branches of the advanced manufacturing industries. We here present a first draft set of axioms and definitions of an IOF upper ontology descending from BFO. The axiomatization is designed to capture the meanings of terms commonly used in manufacturing and is designed to serve as starting point for the construction of the IOF ontology suite.
The existence of fundamental moral disagreements is a central problem for moral realism and has often been contrasted with an alleged absence of disagreement in mathematics. However, mathematicians do in fact disagree on fundamental questions, for example on which set-theoretic axioms are true, and some philosophers have argued that this increases the plausibility of moral vis-à-vis mathematical realism. I argue that the analogy between mathematical and moral disagreement is not as straightforward as those arguments present it. In particular, I argue that pluralist accounts of mathematics render fundamental mathematical disagreements compatible with mathematical realism in a way in which moral disagreements and moral realism are not.
Functionally graded materials (FGMs) have been used in many different kinds of applications in recent years and have attracted significant research attention. However, we do not yet have a commonly accepted way of representing the various aspects of FGMs. Lack of standardised vocabulary creates obstacles to the extraction of useful information relating to pertinent aspects of different applications. A standard resource is needed for describing various elements of FGMs, including existing applications, manufacturing techniques, and material characteristics. This motivated the creation of the FGM Ontology (FGMO) in 2016. Here, we present a revised and expanded version of the FGM Ontology, which includes enrichments along four dimensions: (1) documenting recent FGM applications; (2) reorganising the framework to incorporate an updated representation of types of manufacturing processes; (3) enriching the axioms of the ontology; and (4) importing mid-level ontologies from the Common Core Ontologies (CCO) and Product Life Cycle (PLC) Ontologies. The work is being carried out within the framework of the Industry Ontology Foundry (IOF), and the ontology is conformant to Basic Formal Ontology (BFO).
Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras based on a certain kind of implication which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF and can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3,*) coincides (up to language) with da Costa and D'Ottaviano's logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors and with different motivations, such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC obtained by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [(w = w)] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder. This suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3,*), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
Although the theory of the assertoric syllogism was Aristotle's great invention, one which dominated logical theory for the succeeding two millennia, accounts of the syllogism evolved and changed over that time. Indeed, in the twentieth century, doctrines were attributed to Aristotle which lost sight of what Aristotle intended. One of these mistaken doctrines was the very form of the syllogism: that a syllogism consists of three propositions containing three terms arranged in four figures. Yet another was that a syllogism is a conditional proposition deduced from a set of axioms. There is even unclarity about what the basis of syllogistic validity consists in. Returning to Aristotle's text, and reading it in the light of commentary from late antiquity and the middle ages, we find a coherent and precise theory which shows all these claims to be based on a misunderstanding and misreading.
This paper embeds the core part of Discourse Representation Theory in the classical theory of types plus a few simple axioms that allow the theory to express key facts about variables and assignments on the object level of the logic. It is shown how the embedding can be used to combine core analyses of natural language phenomena in Discourse Representation Theory with analyses that can be obtained in Montague Semantics.
Joyce (1998) gives an argument for probabilism: the doctrine that rational credences should conform to the axioms of probability. In doing so, he provides a distinctive take on how the normative force of probabilism relates to the injunction to believe what is true. But Joyce presupposes that the truth values of the propositions over which credences are defined are classical. I generalize the core of Joyce’s argument to remove this presupposition. On the same assumptions as Joyce uses, the credences of a rational agent should always be weighted averages of truth value assignments. In the special case where the truth values are classical, the weighted averages of truth value assignments are exactly the probability functions. In the more general case, probabilistic axioms formulated in terms of classical logic are violated, but we show that generalized versions of the axioms formulated in terms of non-classical logics are satisfied.
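To put the central claim in symbols (my notation, not Joyce's or the paper's): where the \(v_w\) are the admissible truth-value assignments, the requirement is that a rational credence function c be a convex combination of them,
\[
c(p) \;=\; \sum_{w} \lambda_w\, v_w(p), \qquad \lambda_w \ge 0,\quad \sum_{w} \lambda_w = 1 ,
\]
and when each \(v_w\) is classical (two-valued), such weighted averages are, as the abstract notes, exactly the probability functions.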
Philosophers have spilled a lot of ink over the past few years exploring the nature and significance of grounding. Kit Fine has made several seminal contributions to this discussion, including an exact treatment of the formal features of grounding [Fine, 2012a]. He has specified a language in which grounding claims may be expressed, proposed a system of axioms which capture the relevant formal features, and offered a semantics which interprets the language. Unfortunately, the semantics Fine offers faces a number of problems. In this paper, I review the problems and offer an alternative that avoids them. I offer a semantics for the pure logic of ground that is motivated by ideas already present in the grounding literature, and for which a natural axiomatization capturing central formal features of grounding is sound and complete. I also show how the semantics I offer avoids the problems faced by Fine’s semantics.
How the arguments of Spinoza's Ethics work might seem obvious. Even if Spinoza's exposition is not perfect, and some suppressed premises might have to be recovered, it seems clear enough that the demonstrations are supposed to show, in Euclidean fashion, how truths about the basic structure of nature—as well as truths about how to live—follow from axioms and uncontroversial definitions. If readers keep their imagination and emotions from sullying their reasoning, they will see the force of the demonstrations and be convinced. In his engaging, highly original book, Garver argues that the Ethics is not a linear march through timeless truths, but rather a complicated drama that works precisely because its "characters"...
The iterative conception of set is typically considered to provide the intuitive underpinnings for ZFCU (ZFC+Urelements). It is an easy theorem of ZFCU that all sets have a definite cardinality. But the iterative conception seems to be entirely consistent with the existence of “wide” sets, sets (of, in particular, urelements) that are larger than any cardinal. This paper diagnoses the source of the apparent disconnect here and proposes modifications of the Replacement and Powerset axioms so as to allow for the existence of wide sets. Drawing upon Cantor’s notion of the absolute infinite, the paper argues that the modifications are warranted and preserve a robust iterative conception of set. The resulting theory is proved consistent relative to ZFC + “there exists an inaccessible cardinal number”.
I argue that prioritarianism cannot be assessed in abstraction from an account of the measure of utility. Rather, the soundness of this view crucially depends on what counts as a greater, lesser, or equal increase in a person’s utility. In particular, prioritarianism cannot accommodate a normatively compelling measure of utility that is captured by the axioms of John von Neumann and Oskar Morgenstern’s expected utility theory. Nor can it accommodate a plausible and elegant generalization of this theory that has been offered in response to challenges to von Neumann and Morgenstern. This is, I think, a theoretically interesting and unexpected source of difficulty for prioritarianism, which I explore in this article.
According to the priority view, or prioritarianism, it matters more to benefit people the worse off they are. But how exactly should the priority view be defined? This article argues for a highly general characterization which essentially involves risk, but makes no use of evaluative measurements or the expected utility axioms. A representation theorem is provided, and when further assumptions are added, common accounts of the priority view are recovered. A defense of the key idea behind the priority view, the priority principle, is provided. But it is argued that the priority view fails on both ethical and conceptual grounds.
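For orientation (a common textbook formalization of the view, not the article's own characterization): prioritarianism is often identified with ranking outcomes by a concavely transformed sum of individual utilities,
\[
V(x) \;=\; \sum_{i=1}^{n} w\bigl(u_i(x)\bigr), \qquad w \text{ strictly increasing and strictly concave},
\]
so that a unit of utility counts for more the worse off its recipient is; the article argues for a more general, risk-involving characterization that does not presuppose such evaluative measurements.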
I prove the nonexistence of gods. The proof is based on three axioms: Ockham’s razor (OR), religiosity is endogenous in humans, and there are no miracles. OR is formulated operationally, to remove improper postulates, such that it yields not only a plausible argument but truth. The validity of the second and the third axiom is established empirically by inductive reasoning relying on a thorough analysis of the psychiatric literature and skeptical publications. With these axioms I prove that gods are not necessary for our universe. Applying OR then yields that gods do not exist. The implications of this article are enormous. Mankind’s understanding of the world is elevated to a higher level: a unified view of the world as nature, with mankind a part of it.
Contemporary Humeans treat laws of nature as statements of exceptionless regularities that function as the axioms of the best deductive system. Such ‘Best System Accounts’ marry realism about laws with a denial of necessary connections among events. I argue that Hume’s predecessor, George Berkeley, offers a more sophisticated conception of laws, equally consistent with the absence of powers or necessary connections among events in the natural world. On this view, laws are not statements of regularities but the most general rules God follows in producing the world. Pace most commentators, I argue that Berkeley’s view is neither instrumentalist nor reductionist. More important, the Berkeleyan Best System can solve some of the problems afflicting its Humean rivals, including the problems of theory choice and Nancy Cartwright’s ‘facticity’ dilemma. Some of these solutions are available in the contemporary context, without any appeal to God. Berkeley’s account deserves to be taken seriously in its own right.
We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and Jeffrey's rules can be characterized in terms of the same axioms: "responsiveness", which requires that revised beliefs incorporate what has been learnt, and "conservativeness", which requires that beliefs on which the learnt input is "silent" do not change. To illustrate the use of non-Bayesian belief revision in economic theory, we sketch a simple decision-theoretic application.
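For reference (standard formulations, not the paper's notation): on learning an event E with certainty, Bayes's rule sets the new credence to the old conditional credence, while Jeffrey's rule handles an uncertain input that reassigns probabilities \(q_i\) to the cells \(E_i\) of a partition:
\[
P_{\mathrm{new}}(A) = P(A \mid E), \qquad\qquad
P_{\mathrm{new}}(A) = \sum_{i} q_i \, P(A \mid E_i).
\]
Both rules leave the conditional credences \(P(\,\cdot \mid E_i)\) untouched, which is one way of seeing the shared "conservativeness" the paper appeals to.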
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously “Arrow’s Theorem”. Once the analogy between amalgamating evidence and amalgamating preferences is tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
A crucial part of the contemporary interest in logicism in the philosophy of mathematics resides in its idea that arithmetical knowledge may be based on logical knowledge. Here an implementation of this idea is considered that holds that knowledge of arithmetical principles may be based on two things: (i) knowledge of logical principles and (ii) knowledge that the arithmetical principles are representable in the logical principles. The notions of representation considered here are related to theory-based and structure-based notions of representation from contemporary mathematical logic. It is argued that the theory-based versions of such logicism are either too liberal (the plethora problem) or are committed to intuitively incorrect closure conditions (the consistency problem). Structure-based versions must on the other hand respond to a charge of begging the question (the circularity problem) or explain how one may have a knowledge of structure in advance of a knowledge of axioms (the signature problem). This discussion is significant because it gives us a better idea of what a notion of representation must look like if it is to aid in realizing some of the traditional epistemic aims of logicism in the philosophy of mathematics.
It is often alleged that, unlike typical axioms of mathematics, the Continuum Hypothesis (CH) is indeterminate. This position is normally defended on the ground that the CH is undecidable in a way that typical axioms are not. Call this kind of undecidability “absolute undecidability”. In this paper, I seek to understand what absolute undecidability could be such that one might hope to establish that (a) CH is absolutely undecidable, (b) typical axioms are not absolutely undecidable, and (c) if a mathematical hypothesis is absolutely undecidable, then it is indeterminate. I shall argue that on no understanding of absolute undecidability could one hope to establish all of (a)–(c). However, I will identify one understanding of absolute undecidability on which one might hope to establish both (a) and (c) to the exclusion of (b). This suggests that a new style of mathematical antirealism deserves attention—one that does not depend on familiar epistemological or ontological concerns. The key idea behind this view is that typical mathematical hypotheses are indeterminate because they are relevantly similar to CH.
I show that the act-type theories of Soames and Hanks entail that every sentence with alternative analyses (including every atomic sentence with a polyadic predicate) is ambiguous, many of them massively so. I assume that act types directed toward distinct objects are themselves distinct, plus some standard semantic axioms, and infer that act-type theorists are committed to saying that ‘Mary loves John’ expresses both the act type of predicating [loving John] of Mary and that of predicating [being loved by Mary] of John. Since the two properties are distinct, so are the act types. Hence, the sentence expresses two propositions. I also discuss a non-standard “pluralist” act-type theory, as well as some retreat positions, which all come with considerable problems. Finally, I extrapolate to a general constraint on theories of structured propositions, and find that Jeffrey King’s theory has the same unacceptable consequence as the act-type theory.
The aim of this paper is to show that (elementary) topology may be useful for dealing with problems of epistemology and metaphysics. More precisely, I want to show that the introduction of topological structures may elucidate the role of the spatial structures (in a broad sense) that underlie logic and cognition. In some detail I’ll deal with “Cassirer’s problem”, which may be characterized as an early forerunner of Goodman’s “grue-bleen” problem. On a larger scale, topology turns out to be useful in elaborating the approach of conceptual spaces, which in the last twenty years or so has found quite a few applications in cognitive science, psychology, and linguistics. In particular, topology may help distinguish “natural” from “not-so-natural” concepts, a classical problem that has up to now withstood all efforts to solve (or dissolve) it by purely logical methods. Finally, in order to show that a topological perspective may also offer a fresh look at classical metaphysical problems, it is shown that Leibniz’s famous principle of the identity of indiscernibles is closely related to some well-known topological separation axioms. More precisely, the topological perspective gives rise in a natural way to some novel variations of Leibniz’s principle.
Need considerations play an important role in empirically informed theories of distributive justice. We propose a concept of need-based justice that is related to social participation and provide an ethical measurement of need-based justice. The β-ε-index satisfies the need-principle, monotonicity, sensitivity, transfer and several »technical« axioms. A numerical example is given.
In economics, thought experiments are frequently justified by the difficulty of conducting controlled experiments. They serve several functions, such as establishing causal facts, isolating tendencies, and allowing inferences from models to reality. In this paper, I argue that thought experiments served a further function in economics: facilitating the quantitative definition and measurement of the theoretical concept of utility, thereby bridging the gap between theory and statistical data. I support my argument by a case study, the “hypothetical experiments” of the Norwegian economist Ragnar Frisch (1895-1973). Frisch aimed to eliminate introspection and a subjective concept of utility from economic reasoning. At the same time, he sought behavioral foundations for economic theory that enabled quantitative reasoning. By using thought experiments to justify his set of choice axioms and facilitating the operationalization of utility, Frisch circumvented the problem of observing utility via actual experiments without eliminating the concept of utility from economic theory altogether. As such, these experiments helped Frisch to empirically support the theory’s most important results, such as the laws of demand and supply, without the input of new empirical findings. I suggest that Frisch’s experiments fulfill the main characteristics of thought experiments.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.