This paper proposes a new dialetheic logic, a Dialetheic Logic with Exclusive Assumptions and Conclusions (DLEAC), which includes classical logic as a particular case. In DLEAC, exclusivity is expressed via the speech acts of assuming and concluding. In the paper we adopt the semantics of the logic of paradox extended with a generalized notion of model, and we modify its proof theory by refining the notions of assumption and conclusion. The paper starts with an explanation of the adopted philosophical perspective; then we propose the DLEAC logic. Finally, we show how DLEAC supports the dialetheic solution of the liar paradox.
The aim of the paper is to analyze Priest's dialetheic solution to Curry's paradox. It has been shown that a solution refuting ABS, accepting MPP, and consequently refuting CP meets some difficulties. Here I concentrate on one difficulty: the validity of MPP is obtained only by using FA in the metalanguage, a rule that is invalid for a dialetheist.
This article argues that Nietzsche's transvaluation project refers not to a mere inversion or negation of a set of values but, instead, to a different conception of what a value is and how it functions. Traditional values function within a standard logical framework and claim legitimacy and bindingness based on exogenous authority with absolute extension. Nietzsche regards this framework as unnecessarily reductive in its attempted exclusion of contradiction and real opposition among competing values and proposes a nonstandard, dialetheic model of valuation.
In this paper we first develop a Dialetheic Logic with Exclusive Assumptions and Conclusions, DLEAC. We adopt the semantics of the logic of paradox (LP) extended with a notion of model suitable for DLEAC, and we modify its proof theory by refining the notions of assumption and conclusion, which are understood as speech acts. We introduce a new paradox – the rejectability paradox – first informally, then formally. We then provide its derivation in an extension of DLEAC containing the rejectability predicate.
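The logic of paradox (LP) invoked in these abstracts has a standard three-valued semantics. As a minimal illustrative sketch (not code from the paper; the standard LP tables are assumed), the key feature is that the "both" value is designated, so a contradiction can be satisfiable:

```python
# Priest's Logic of Paradox (LP): values t (true), b (both), f (false).
# Designated values (preserved by valid inference) are t and b.
# Standard ordering for the strong-Kleene-style tables: f < b < t.
ORDER = {"f": 0, "b": 1, "t": 2}

def neg(a):
    return {"t": "f", "b": "b", "f": "t"}[a]

def conj(a, b):
    return min(a, b, key=ORDER.get)   # conjunction = meet in the ordering

def disj(a, b):
    return max(a, b, key=ORDER.get)   # disjunction = join in the ordering

def designated(a):
    return a in ("t", "b")

# A liar-like sentence receives value b: both it and its negation are
# designated, so 'p and not-p' is designated in LP.
p = "b"
print(designated(conj(p, neg(p))))  # True: the contradiction is satisfiable
```

Since explosion fails on these tables (b is designated but can entail f-valued conclusions in classical reasoning), LP-based systems such as DLEAC can tolerate dialetheias without trivializing.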
The programmatic statement put forward in von Wright's last works on deontic logic introduces the perspective of logical pragmatics, which has been formally explicated here and extended so as to include the role of the norm-recipient as well as the role of the norm-giver. Using the translation function from the language of deontic logic to the language of the set-theoretical approach, a connection has been established between the deontic postulates, on one side, and the perfection properties of the norm-set and the counter-set, on the other side. In the study of the conditions of rational norm-related activities it has been shown that diverse dynamic second-order norms related to the consistency of the norm-system hold: the norm-giver ought to restore "classical" consistency by revising an inconsistent system, and the norm-recipient ought to preserve an inconsistent system by revising its logic so that inconsistency does not imply destruction of the system. Priest's dialetheic deontic logic is suitable for this purpose, since it preserves the other perfection properties of the system.
In his famous work on vagueness, Russell named "fallacy of verbalism" the fallacy that consists in mistaking the properties of words for the properties of things. In this paper, I examine two (clusters of) mainstream paraconsistent logical theories – the non-adjunctive and relevant approaches – and show that, if they are given a strongly paraconsistent or dialetheic reading, the charge of committing the Russellian Fallacy can be raised against them in a sophisticated way, by appealing to the intuitive reading of their underlying semantics. The meaning of "intuitive reading" is clarified by exploiting a well-established distinction between pure and applied semantics. If the proposed arguments go through, the dialetheist or strong paraconsistentist faces the following Dilemma: either she must withdraw her claim to have exhibited true contradictions in a metaphysically robust sense – therefore, inconsistent objects and/or states of affairs that make those contradictions true; or she has to give up realism on truth, and embrace some form of anti-realistic (idealistic, or broadly constructivist) metaphysics. Sticking to the second horn of the Dilemma, though, appears to be promising: it could lead to a collapse of the very distinction, commonly held in the literature, between a weak and a strong form of paraconsistency – and this could be a welcome result for a dialetheist.
The perennial nature of some of philosophy’s deepest problems is a puzzle. Here, one problem, the realism–anti-realism debate, and one type of explanation for its longevity, are examined. It is argued that realism and anti-realism form a dialetheic pair: While they are in fact each other’s logical opposite, nevertheless, both are true (and both false). First, several reasons why one might think such a thing are presented. These reasons are merely the beginning, however. In the following sections, the dialetheic conclusion is directly argued for by showing how realism and anti-realism satisfy Priest’s “inclosure schema”. In the last section and the conclusion, the conscious mind’s role in creating realism and anti-realism is discussed. This role further supports the conclusion that realism and anti-realism form a dialetheic pair.
Mysticism claims of its logical scheme that it is Euclidean, that from its first axiom or principle the remainder of its doctrine follows, but it makes this claim in so many languages and in such a variety of obscure and self-contradictory ways that it is difficult to discern how this could be possible, and it is rarely considered a plausible claim in metaphysics. I believe it is plausible, and in this essay I try to explain why.
I here present and defend what I call the Triviality Theory of Truth, to be understood in analogy with Matti Eklund’s Inconsistency Theory of Truth. A specific formulation of the theory is defended and compared with alternatives found in the literature. A number of objections against the proposed notion of meaning-constitutivity are discussed and held inconclusive. The main focus, however, is on the problem, discussed at length by Gupta and Belnap, that speakers do not accept epistemically neutral conclusions of Curry derivations. I first argue that the facts about speakers’ reactions to such Curry derivations do not constitute a problem for the Triviality Theory specifically. Rather, they follow from independent, uncontroversial facts. I then propose a solution which coheres with the theory as I understand it. Finally, I consider a normative reading of their objection and offer a response.
Is there a notion of contradiction—let us call it, for dramatic effect, “absolute”—making all contradictions, so understood, unacceptable also for dialetheists? It is argued in this paper that there is, and that spelling it out brings some theoretical benefits. First, it gives us a foothold on undisputed ground in the methodologically difficult debate on dialetheism. Second, we can use it to express, without begging questions, the disagreement between dialetheists and their rivals on the nature of truth. Third, dialetheism has an operator allowing it, against the opinion of many critics, to rule things out and manifest disagreement: for unlike other proposed exclusion-expressing devices (for instance, the entailment of triviality), the operator used to formulate the notion of absolute contradiction appears to be immune both from crippling expressive limitations and from revenge paradoxes—pending a rigorous nontriviality proof for a formal dialetheic theory including it.
The logical interpretation of probability, or "objective Bayesianism'' – the theory that (some) probabilities are strictly logical degrees of partial implication – is defended. The main argument against it is that it requires the assignment of prior probabilities, and that any attempt to determine them by symmetry via a "principle of insufficient reason" inevitably leads to paradox. Three replies are advanced: that priors are imprecise or of little weight, so that disagreement about them does not matter, within limits; that it is possible to distinguish reasonable from unreasonable priors on logical grounds; and that in real cases disagreement about priors can usually be explained by differences in the background information. It is argued also that proponents of alternative conceptions of probability, such as frequentists, Bayesians and Popperians, are unable to avoid committing themselves to the basic principles of logical probability.
In the 1951 Gibbs lecture, Gödel asserted his famous dichotomy, where the notion of informal proof is at work. G. Priest developed an argument, grounded on the notion of naïve proof, to the effect that Gödel’s first incompleteness theorem suggests the presence of dialetheias. In this paper, we adopt a plausible ideal notion of naïve proof, in agreement with Gödel’s conception, superseding the criticisms against the usual notion of naïve proof used by real working mathematicians. We explore the connection between Gödel’s theorem and naïve proof so understood, both from a classical and a dialetheic perspective.
Logical realism is a view about the metaphysical status of logic. Common to most, if not all, of the views captured by the label ‘logical realism’ is that logical facts are mind- and language-independent. But that does not tell us anything about the nature of logical facts or about our epistemic access to them. The goal of this paper is to outline and systematize the different ways that logical realism could be entertained and to examine some of the challenges that these views face. It will be suggested that logical realism is best understood as a metaphysical view about the logical structure of the world, but this raises an important question: does logical realism collapse into standard metaphysical realism? It will be argued that this result can be accommodated, even if it cannot be altogether avoided.
Leibniz argues that there must be a fundamental level of simple substances because composites borrow their reality from their constituents and not all reality can be borrowed. I contend that the underlying logic of this ‘borrowed reality argument’ has been misunderstood, particularly the rationale for the key premise that not all reality can be borrowed. Contrary to what has been suggested, the rationale turns neither on the alleged viciousness of an unending regress of reality borrowers nor on the Principle of Sufficient Reason, but on the idea that composites are phenomena and thus can be real only insofar as they have a foundation in substances, from which they directly ‘borrow’ their reality. The claim that composites are phenomena rests in turn on Leibniz's conceptualism about relations. So understood, what initially looked like a disappointingly simple argument for simples turns out to be a rather rich and sophisticated one.
This article aims to introduce a new solution to the Logical Problem of the Trinity. This solution is provided by utilising a number of theses within the field of contemporary metaphysics in order to establish a conceptual basis for a novel account and model of the doctrine of the Trinity termed Monarchical Aspectivalism, which will provide the means for proposing an alternative reading of the Athanasian Creed that is free from any consistency problems.
This essay examines the philosophical significance of $\Omega$-logic in Zermelo-Fraenkel set theory with choice (ZFC). The categorical duality between coalgebra and algebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The modal profile of $\Omega$-logical validity can then be countenanced within a coalgebraic logic, and $\Omega$-logical validity can be defined via deterministic automata. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic and modal profiles of $\Omega$-logical validity correspond to those of second-order logical consequence, $\Omega$-logical validity is genuinely logical. Second, the foregoing provides a modal account of the interpretation of mathematical vocabulary.
A natural suggestion and increasingly popular account of how to revise our logical beliefs treats revision of logic analogously to the revision of scientific theories. I investigate this approach and argue that simple applications of abductive methodology to logic result in revision-cycles, developing a detailed case study of an actual dispute with this property. This is problematic if we take abductive methodology to provide justification for revising our logical framework. I then generalize the case study, pointing to similarities with more recent and popular heterodox logics such as naïve logics of truth. I use this discussion to motivate a constraint—logical partisanhood—on the uses of such methodology: roughly, both the proposed alternative and our actual background logic must be able to agree that moving to the alternative logic is no worse than staying put.
This paper clarifies the relationship between the Triviality Results for the conditional and the Restrictor Theory of the conditional. On the understanding of Triviality proposed here, it is implausible—pace many proponents of the Restrictor Theory—that Triviality rests on a syntactic error. As argued here, Triviality arises from simply mistaking the feature a claim has when that claim is logically unacceptable for the feature a claim has when that claim is unsatisfiable. Triviality rests on a semantic confusion—one which some semantic theories, but not others, are prone to making. On the interpretation proposed here, Triviality Results thus play a theoretically constructive role in the project of natural language semantics.
According to certain normative theories in epistemology, rationality requires us to be logically omniscient. Yet this prescription clashes with our ordinary judgments of rationality. How should we resolve this tension? In this paper, I focus particularly on the logical omniscience requirement in Bayesian epistemology. Building on a key insight by Hacking (1967: 311–325), I develop a version of Bayesianism that permits logical ignorance. This includes: an account of the synchronic norms that govern a logically ignorant individual at any given time; an account of how we reduce our logical ignorance by learning logical facts and how we should update our credences in response to such evidence; and an account of when logical ignorance is irrational and when it isn’t. At the end, I explain why the requirement of logical omniscience remains true of ideal agents with no computational, processing, or storage limitations.
Logical Indefinites. Jack Woods - 2014 - Logique et Analyse (Special Issue edited by Julien Murzi and Massimiliano Carrara) 227: 277-307.
I argue that we can and should extend Tarski's model-theoretic criterion of logicality to cover indefinite expressions like Hilbert's ɛ operator, Russell's indefinite description operator η, and abstraction operators like 'the number of'. I draw on this extension to discuss the logical status of both abstraction operators and abstraction principles.
Philosophy of biology is often said to have emerged in the last third of the twentieth century. Prior to this time, it has been alleged that the only authors who engaged philosophically with the life sciences were either logical empiricists who sought to impose the explanatory ideals of the physical sciences onto biology, or vitalists who invoked mystical agencies in an attempt to ward off the threat of physicochemical reduction. These schools paid little attention to actual biological science, and as a result philosophy of biology languished in a state of futility for much of the twentieth century. The situation, we are told, only began to change in the late 1960s and early 1970s, when a new generation of researchers began to focus on problems internal to biology, leading to the consolidation of the discipline. In this paper we challenge this widely accepted narrative of the history of philosophy of biology. We do so by arguing that the most important tradition within early twentieth-century philosophy of biology was neither logical empiricism nor vitalism, but the organicist movement that flourished between the First and Second World Wars. We show that the organicist corpus is thematically and methodologically continuous with the contemporary literature in order to discredit the view that early work in the philosophy of biology was unproductive, and we emphasize the desirability of integrating the historical and contemporary conversations into a single, unified discourse.
In spite of its significance for everyday and philosophical discourse, the explanatory connective has not received much treatment in the philosophy of logic. The present paper develops a logic for this connective, based on systematic connections between it and the truth-functional connectives.
A wide family of many-valued logics—for instance, those based on the weak Kleene algebra—includes a non-classical truth-value that is ‘contaminating’ in the sense that whenever the value is assigned to a formula φ, any complex formula in which φ appears is assigned that value as well. In such systems, the contaminating value enjoys a wide range of interpretations, suggesting scenarios in which more than one of these interpretations are called for. This calls for an evaluation of systems with multiple contaminating values. In this paper, we consider the countably infinite family of multiple-conclusion consequence relations in which classical logic is enriched with one or more contaminating values whose behavior is determined by a linear ordering between them. We consider some motivations and applications for such systems and provide general characterizations for all consequence relations in this family. Finally, we provide sequent calculi for a pair of four-valued logics including two linearly ordered contaminating values before defining two-sided sequent calculi corresponding to each of the infinite family of many-valued logics studied in this paper.
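The ‘contaminating’ behavior described in this abstract can be sketched directly. The following is an illustrative toy implementation (assuming the usual weak Kleene tables with a single contaminating value; it is not code from the paper):

```python
# Weak Kleene three-valued logic: classical True/False plus a single
# contaminating value E. Whenever E occurs in a formula, the whole
# formula evaluates to E.
E = "e"

def neg(a):
    if a == E:
        return E
    return not a

def conj(a, b):
    # If either operand is contaminated, so is the compound.
    if a == E or b == E:
        return E
    return a and b

def disj(a, b):
    if a == E or b == E:
        return E
    return a or b

print(conj(True, E))   # e
print(disj(True, E))   # e  (contrast: strong Kleene would give True here)
print(neg(E))          # e
```

The paper's systems generalize this picture: with several contaminating values, a linear ordering among them decides which value ‘wins’ when more than one occurs in the same formula.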
We argue that the extant evidence for Stoic logic provides all the elements required for a variable-free theory of multiple generality, including a number of remarkably modern features that straddle logic and semantics, such as the understanding of one- and two-place predicates as functions, the canonical formulation of universals as quantified conditionals, a straightforward relation between elements of propositional and first-order logic, and the roles of anaphora and rigid order in the regimented sentences that express multiply general propositions. We consider and reinterpret some ancient texts that have been neglected in the context of Stoic universal and existential propositions and offer new explanations of some puzzling features in Stoic logic. Our results confirm that Stoic logic surpasses Aristotle’s with regard to multiple generality, and are a reminder that focusing on multiple generality through the lens of Frege-inspired variable-binding quantifier theory may hamper our understanding and appreciation of pre-Fregean theories of multiple generality.
What is a logical constant? The question is addressed in the tradition of Tarski's definition of logical operations as operations which are invariant under permutation. The paper introduces a general setting in which invariance criteria for logical operations can be compared and argues for invariance under potential isomorphism as the most natural characterization of logical operations.
We explore the view that Frege's puzzle is a source of straightforward counterexamples to Leibniz's law. Taking this seriously requires us to revise the classical logic of quantifiers and identity; we work out the options, in the context of higher-order logic. The logics we arrive at provide the resources for a straightforward semantics of attitude reports that is consistent with the Millian thesis that the meaning of a name is just the thing it stands for. We provide models to show that some of these logics are non-degenerate.
Logic arguably plays a role in the normativity of reasoning. In particular, there are plausible norms of belief/disbelief whose antecedents are constituted by claims about what follows from what. But is logic also relevant to the normativity of agnostic attitudes? The question here is whether logical entailment also puts constraints on what kinds of things one can suspend judgment about. In this paper I address that question and I give a positive answer to it. In particular, I advance two logical norms of agnosticism, where the first one allows us to assess situations in which the subject is agnostic about the conclusion of a valid argument and the second one allows us to assess situations in which the subject is agnostic about one of the premises of a valid argument.
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the ‘logicality of language’, accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is ‘blind’ to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical—indeed quite exotic—kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-à-vis its open class terms and employs a deductive system that is basically classical.
The result of combining classical quantificational logic with modal logic proves necessitism – the claim that necessarily everything is necessarily identical to something. This problem is reflected in the purely quantificational theory by theorems such as ∃x t=x; it is a theorem, for example, that something is identical to Timothy Williamson. The standard way to avoid these consequences is to weaken the theory of quantification to a certain kind of free logic. However, it has often been noted that in order to specify the truth conditions of certain sentences involving constants or variables that don’t denote, one has to apparently quantify over things that are not identical to anything. In this paper I defend a contingentist, non-Meinongian metaphysics within a positive free logic. I argue that although certain names and free variables do not actually refer to anything, in each case there might have been something they actually refer to, allowing one to interpret the contingentist claims without quantifying over mere possibilia.
Many philosophers take purportedly logical cases of ground to be obvious cases, and indeed such cases have been used to motivate the existence of and importance of ground. I argue against this. I do so by motivating two kinds of semantic determination relations. Intuitions of logical ground track these semantic relations. Moreover, our knowledge of semantics for first order logic can explain why we have such intuitions. And, I argue, neither semantic relation can be a species of ground even on a quite broad conception of what ground is. Hence, without a positive argument for taking so-called ‘logical ground’ to be something distinct from a semantic determination relation, we should cease treating logical cases as cases of ground.
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts—all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives—explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion—which is, I argue, substantially correct—that imperatives could not have propositions, or truth conditions, as their meanings.
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names" - a special kind of propositional variables ranging over singleton sets of worlds. The obtained language ℒ_c proves to have great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ(⍯) of ℒ, where ⍯ is an additional modality with the semantics x ⊧ ⍯φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further we consider deductive systems in ℒ_c. Strong completeness of the normal ℒ_c logics is proved with respect to models in which all worlds are named. Every ℒ_c-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning transfer of properties ([in]completeness, filtration, finite model property etc.) from ℒ to ℒ_c are discussed. Finally, further perspectives for names in a multimodal environment are briefly sketched.
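The universal modality's semantics quoted in this abstract is world-independent: ■φ ignores accessibility entirely and quantifies over all worlds. A toy model check (illustrative only; the world names and valuation below are invented for the example) might look like:

```python
# A toy model for the universal modality ■:
# x ⊧ ■φ iff φ holds at every world, regardless of accessibility.
worlds = {"w1", "w2", "w3"}
valuation = {"p": {"w1", "w2", "w3"}, "q": {"w1"}}

def holds(world, prop):
    """Atomic satisfaction: prop is true at world."""
    return world in valuation[prop]

def box_universal(prop):
    # ■prop is world-independent: it is true everywhere or false everywhere.
    return all(holds(w, prop) for w in worlds)

print(box_universal("p"))  # True: p holds at every world
print(box_universal("q"))  # False: q fails at w2 and w3
```

The difference modality ⍯ would instead quantify over every world other than the point of evaluation, which is why the two enrichments can simulate one another with respect to modal definability.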
We present a framework for epistemic logic, modeling the logical aspects of System 1 and System 2 cognitive processes, as per dual process theories of reasoning. The framework combines non-normal worlds semantics with the techniques of Dynamic Epistemic Logic. It models non-logically-omniscient, but moderately rational agents: their System 1 makes fast sense of incoming information by integrating it on the basis of their background knowledge and beliefs. Their System 2 allows them to slowly, step-wise unpack some of the logical consequences of such knowledge and beliefs, by paying a cognitive cost. The framework is applied to three instances of limited rationality, widely discussed in cognitive psychology: Stereotypical Thinking, the Framing Effect, and the Anchoring Effect.
In this article, I outline a logic of design of a system as a specific kind of conceptual logic of the design of the model of a system, that is, the blueprint that provides information about the system to be created. In section two, I introduce the method of levels of abstraction as a modelling tool borrowed from computer science. In section three, I use this method to clarify two main conceptual logics of information inherited from modernity: Kant’s transcendental logic of conditions of possibility of a system, and Hegel’s dialectical logic of conditions of in/stability of a system. Both conceptual logics of information analyse structural properties of given systems. Strictly speaking, neither is a conceptual logic of information about the conditions of feasibility of a system, that is, neither is a logic of information as a logic of design. So, in section four, I outline this third conceptual logic of information and then interpret the conceptual logic of design as a logic of requirements, by introducing the relation of “sufficientisation”. In the conclusion, I argue that the logic of requirements is exactly what we need in order to make sense of, and buttress, a constructionist approach to knowledge.
In this paper I argue that pluralism at the level of logical systems requires a certain monism at the meta-logical level, and so, in a sense, there cannot be pluralism all the way down. The adequate alternative logical systems bottom out in a shared basic meta-logic, and as such, logical pluralism is limited. I argue that the content of this basic meta-logic must include the analogue of the logical rules Modus Ponens and Universal Instantiation. I show this through a detailed analysis of the ‘adoption problem’, which manifests something special about MP and UI. It appears that MP and UI underwrite the very nature of a logical rule of inference, due to all rules of inference being conditional and universal in their structure. As such, all logical rules presuppose MP and UI, making MP and UI self-governing, basic, unadoptable, and required in the meta-logic for the adequacy of any logical system.
In explaining the notion of a fundamental property or relation, metaphysicians will often draw an analogy with languages. The fundamental properties and relations stand to reality as the primitive predicates and relations stand to a language: the smallest set of vocabulary God would need in order to write the “book of the world.” This paper attempts to make good on this metaphor. To that end, a modality is introduced that, put informally, stands to propositions as logical truth stands to sentences. The resulting theory, formulated in higher-order logic, also vindicates the Humean idea that fundamental properties and relations are freely recombinable and a variant of the structural idea that propositions can be decomposed into their fundamental constituents via logical operations. Indeed, it is seen that, although these ideas are seemingly distinct, they are not independent, and fall out of a natural and general theory about the granularity of reality.
One logic or many? I say—many. Or rather, I say there is one logic for each way of specifying the class of all possible circumstances, or models, i.e., all ways of interpreting a given language. But because there is no unique way of doing this, I say there is no unique logic except in a relative sense. Indeed, given any two competing logical theories T1 and T2 (in the same language) one could always consider their common core, T, and settle on that theory. So, given any language L, one could settle on the minimal logic T0 corresponding to the common core shared by all competitors. That would be a way of resisting relativism, as long as one is willing to redraw the bounds of logic accordingly. However, such a minimal theory T0 may be empty if the syntax of L contains no special ingredients the interpretation of which is independent of the specification of the relevant L-models. And generally—I argue—this is indeed the case.
In the paper, an original formal-logical conception of the syntactic and semantic (intensional and extensional) senses of expressions of any language L is outlined. The syntax and the bi-level, intensional and extensional, semantics of language L are characterized categorially: in the spirit of some of Husserl’s ideas of pure grammar, the Leśniewski–Ajdukiewicz theory of syntactic/semantic categories, and, in accordance with Frege’s ontological canons, Bocheński’s famous motto that syntax mirrors ontology, as well as some ideas of Suszko: language should be a linguistic scheme of ontological reality and simultaneously a tool of its cognition. In the logical conception of language L, its expressions should satisfy some general conditions of language adequacy. The adequacy ensures their unambiguous syntactic and semantic senses and their mutual syntactic and semantic compatibility, a correspondence guaranteed by the acceptance of a postulate of categorial compatibility of the syntactic and semantic categories of expressions of L. From this postulate, three principles of compositionality follow: one syntactic and two semantic, already known to Frege. They are treated as conditions of the homomorphism of the partial algebra of L into the algebraic models of L: syntactic, intensional, and extensional. In the paper, they are applied to some expressions with quantifiers. The language adequacy connected with the logical senses described in the logical conception of language L is, of course, an idealization, but only expressions with high degrees of precision of their senses, after due justification, may become theorems of science.
In this essay I consider Kaplan’s challenge to Frege’s so-called dictum: “Logic (and perhaps even truth) is immune to epithetical color”. I show that if it is to challenge anything, it rather challenges the view (attributable to Frege) that logic is immune to pejorative colour. This granted, I show that Kaplan’s inference-based challenge can be set even assuming that the pejorative doesn’t make any non-trivial truth-conditional (descriptive) contribution. This goes against the general tendency to consider the truth-conditionally inert logically irrelevant. But I take it that Kaplan is right and take his examples to show that truth-conditional inertness need not entail inferential inertness. I end up assessing the Kaplan-Frege “debate” as giving edge to the former to the extent that clarity is achieved through Kaplanian inferences on what should be considered part of the explanandum.
Buddhist philosophers have developed a rich tradition of logic. The material that forms this tradition, however, is hardly discussed or even known. This article presents some of that material in a manner that is accessible to contemporary logicians and philosophers of logic, and it sets agendas for a global philosophy of logic.
I argue against inferentialism about logic. First, I argue against an analogy between logic and chess, before considering a more basic objection to stipulating inference rules as a way of establishing the meaning of logical constants. The objection, the Mushroom Omelette Objection, is that stipulative acts are partly constituted by logical notions, and therefore cannot be used to explain logical thought. I then argue that the same problem also attaches to following existing conventional rules, since either those rules have logical contents, or following those conventional rules is done for logical reasons. Lastly, I compare this argument with other arguments found in Quine’s early work, and consider two attempts to reply to Quine.
This essay aims to provide a modal logic for rational intuition. Similarly to treatments of the property of knowledge in epistemic logic, I argue that rational intuition can be codified by a modal operator governed by the axioms of a dynamic provability logic, which embeds GL within the modal $\mu$-calculus. Via correspondence results between modal logic and the bisimulation-invariant fragment of second-order logic, a precise translation can then be provided between the notion of 'intuition-of', i.e., the cognitive phenomenal properties of thoughts, and the modal operators regimenting the notion of 'intuition-that'. I argue that intuition-that can further be shown to entrain conceptual elucidation, by way of figuring as a dynamic-interpretational modality which induces the reinterpretation of both domains of quantification and the intensions and hyperintensions of mathematical concepts that are formalizable in monadic first- and second-order formal languages. Hyperintensionality is countenanced via a topic-sensitive epistemic two-dimensional truthmaker semantics.
Prior to Kripke's seminal work on the semantics of modal logic, McKinsey offered an alternative interpretation of the necessity operator, inspired by the Bolzano-Tarski notion of logical truth. According to this interpretation, `it is necessary that A' is true just in case every sentence with the same logical form as A is true. In our paper, we investigate this interpretation of the modal operator, resolving some technical questions, and relating it to the logical interpretation of modality and some views in modal metaphysics. In particular, we present a hitherto unpublished solution to problems 41 and 42 from Friedman's 102 problems, which uses a different method of proof from the solution presented in the paper of Tadeusz Prucnal.
Although formal thought disorder (FTD) has long been a clinical label in the assessment of some psychiatric disorders, in particular of schizophrenia, it remains a source of controversy, mostly because it is hard to say what exactly the “formal” in FTD refers to. We see anomalous processing of terminological knowledge, a core construct of human knowledge in general, behind FTD symptoms, and we approach this anomaly from a strictly formal perspective. More specifically, we present here a symbolic computational model of storage in, and activation of, a human semantic network, or semantic memory, whose core element is logical form; this is normalized by description logic (DL), namely by CL, a DL-based language – Conception Language – designed to formalize conceptualization from the viewpoint of individual cognitive agency. In this model, disruptions in the rule-based implementation of the logical form account for the apparently semantic anomalies symptomatic of FTD, which are detected by means of a CL-based algorithmic assessment.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
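The “two-draw” reading of classical logical entropy can be made concrete in a short sketch. For a partition whose blocks have probabilities p_B, the logical entropy is h = 1 − Σ p_B², the probability that two independent draws fall in distinct blocks; the comparison with Shannon entropy and the function names below are illustrative additions, not taken from the paper:

```python
from math import log2

def logical_entropy(probs):
    """h(pi) = 1 - sum of p_B^2: the probability that two independent
    draws land in distinct blocks of the partition."""
    return 1 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """Shannon entropy H(pi) = -sum of p_B * log2(p_B), for comparison."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A uniform partition with 4 equiprobable blocks:
# logical entropy is 1 - 1/4 = 0.75, Shannon entropy is log2(4) = 2.0.
uniform4 = [0.25, 0.25, 0.25, 0.25]
print(logical_entropy(uniform4))  # 0.75
print(shannon_entropy(uniform4))  # 2.0
```

On the indiscrete partition (a single block of probability 1) both quantities vanish, since two draws can never be distinguished, matching the paper’s interpretation of information as distinctions.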
In this manuscript, published here for the first time, Tarski explores the concept of logical notion. He draws on Klein's Erlanger Programm to locate the logical notions of ordinary geometry as those invariant under all transformations of space. Generalizing, he explicates the concept of logical notion of an arbitrary discipline.