The paper discusses the origin of dark matter and dark energy from the concepts of time and totality in the final analysis. Though both seem rather philosophical, they are postulated axiomatically and interpreted physically, and the corresponding philosophical transcendentalism serves heuristically. The exposition means to outline the "forest for the trees", yet in a rigorous mathematical way, to be explicated in detail in a future paper. The "two deductions" are two successive stages of the single conclusion mentioned above. The concept of "transcendental invariance", meaning an ontological and physical interpretation of the mathematical equivalence of the axiom of choice and the well-ordering "theorem", is utilized again. The time arrow is then a corollary of that transcendental invariance, and in turn it implies the conservation of quantum information as the Noether correlate of the linear "increase of time" after the time arrow. Quantum information conservation implies a few fundamental corollaries, such as the "conservation of energy conservation" in quantum mechanics (for reasons quite different from those in classical mechanics and physics) and the "absence of hidden variables" (versus Einstein's conjecture) in it. However, the paper concentrates only on the inference of another corollary from quantum information conservation: that dark matter and dark energy are due to entanglement, and thus, in the final analysis, to the conservation of quantum information, yet observed experimentally only on the "cognitive screen" of "Mach's principle" in Einstein's general relativity, which excludes any source of the gravitational field other than mass and energy. Then, if quantum information by itself generates a certain nonzero gravitational field, it will be depicted on the same screen as certain masses and energies distributed in space-time, and most presumably observable as the dark energy and dark matter that predominate in the universe, about 96% of its energy and matter, quite unexpectedly for physics and the scientific worldview nowadays. Besides the cognitive screen of general relativity, entanglement is necessarily available on one more "cognitive screen" (namely, that of quantum mechanics), which is furthermore "flat". Most probably, that projection is confinement: a mysterious interaction added ad hoc alongside the three fundamental ones of the Standard Model, and even conceptually inconsistent with them, as far as it needs to distinguish local space from global space and is definable only as a relation between them (similar to entanglement). So, entanglement is able to link the gravity of general relativity to the confinement of the Standard Model as its projections on the "cognitive screens" of those two fundamental physical theories.
There is a double impulse at the center of the discussion of deductive reasoning. One apparently leads to abstraction and arbitrary domains, while the other leads to concreteness and content dependence. The aim of this research is to design, apply, and validate an assessment instrument that allows us to corroborate whether deductive reasoning handles logical rules or contents. The study sample consisted of 80 participants (aged 18-77). The test consists of 60 items categorized by: formality, integrability, complexity, and modality. The results demonstrate the reliability of the deductive reasoning test, with a Cronbach's alpha of .775 and a validation index of .850 against the nonverbal intelligence index (INV) of the RIAS. We conclude that the deductive reasoner does not possess a precise body of general deductive rules, but instead uses collections of increasingly less abstract heuristic rules.
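For reference, the reliability coefficient reported above is the standard Cronbach statistic; the formula is textbook material, not taken from the paper (here k = 60 items):

```latex
% k = number of test items (60 here), sigma_i^2 = variance of item i,
% sigma_X^2 = variance of the total score; the reported value is .775.
\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{X}^{2}}\right)
\]
```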
Chapin reviewed this 1972 ZEITSCHRIFT paper that proves the completeness theorem for the logic of variable-binding-term operators created by Corcoran and his student John Herring in the 1971 LOGIQUE ET ANALYSE paper in which the theorem was conjectured. This leveraging proof extends completeness of ordinary first-order logic to the extension with vbtos. Newton da Costa independently proved the same theorem at about the same time using a Henkin-type proof. This 1972 paper builds on the 1971 "Notes on a Semantic Analysis of Variable Binding Term Operators" (co-author John Herring), Logique et Analyse 55, 646–57. MR0307874 (46 #6989). A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague 1964 proposed using vbtos to formalize definite descriptions "the x: x+x=2", set abstracts {x: F}, minimization in recursive function theory "the least x: x+x>2", etc. However, they gave no semantics for vbtos. Hatcher 1968 gave a semantics, but one that has flaws described in the 1971 paper and admitted by Hatcher. In 1971 we give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture, proved in this paper with Hatcher's help, was proved independently at about the same time by Newton da Costa.
A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y.

Kalish-Montague proposed using vbtos to formalize definite descriptions, set abstracts {x: F}, minimization in recursive function theory, etc. However, they gave no semantics for vbtos. Hatcher gave a semantics, but one that has flaws. We give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture was later proved independently by the authors and by Newton da Costa.

The expression (vy:F) is called a variable bound term (vbt). In case F has only y free, (vy:F) has the syntactic properties of an individual constant; and under a suitable interpretation of the language, (vy:F) denotes an individual. By a semantic analysis of vbtos we mean a proposal for amending the standard notions of (1) "an interpretation of a first-order language" and (2) "the denotation of a term under an interpretation and an assignment", such that (1') an interpretation of a first-order language associates a set-theoretic structure with each vbto and (2') under any interpretation and assignment each vbt denotes an individual.
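One natural way to flesh out clauses (1') and (2'), consistent with the description above but offered here only as a sketch (the specific set-theoretic choice is an assumption, not a quotation from the 1971 paper):

```latex
% Sketch: each vbto v is interpreted by a function from subsets of the
% domain D to elements of D; a vbt then denotes the value of that
% function at the extension of F relative to the bound variable y.
\[
(1')\quad I(v)\colon \mathcal{P}(D) \to D
\qquad
(2')\quad \mathrm{den}_{I,s}\big((vy{:}F)\big) = I(v)\big(\{\, d \in D : I, s[y \mapsto d] \models F \,\}\big)
\]
% Since y is re-assigned in the evaluation, the free variables of
% (vy:F) are exactly those of F excluding y, as required.
```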
When evaluating theories of causation, intuitions should not play a decisive role, not even intuitions in flawlessly designed thought experiments. Indeed, no coherent theory of causation can respect the typical person's intuitions in redundancy (pre-emption) thought experiments without disrespecting their intuitions in threat-and-saviour (switching / short-circuit) thought experiments. I provide a deductively sound argument for these claims. Amazingly, this argument assumes absolutely nothing about the nature of causation. I also provide a second argument, whose conclusion is even stronger: the typical person's causal intuitions are thoroughly unreliable. This argument proceeds by raising the neglected question: in what respects is information about intermediate and enabling variables relevant to reliable causal judgment?
We are much better equipped to let the facts reveal themselves to us instead of blinding ourselves to them or stubbornly trying to force them into preconceived molds. We no longer embarrass ourselves in front of our students, for example, by insisting that “Some Xs are Y” means the same as “Some X is Y”, and lamely adding “for purposes of logic” whenever there is pushback. Logic teaching in this century can exploit the new spirit of objectivity, humility, clarity, observationalism, contextualism, and pluralism. Besides the new spirit there have been quiet developments in logic and its history and philosophy that could radically improve logic teaching. One rather conspicuous example is that the process of refining logical terminology has been productive. Future logic students will no longer be burdened by obscure terminology and they will be able to read, think, talk, and write about logic in a more careful and more rewarding manner. Closely related is increased use and study of variable-enhanced natural language as in “Every proposition x that implies some proposition y that is false also implies some proposition z that is true”. Another welcome development is the culmination of the slow demise of logicism. No longer is the teacher blocked from using examples from arithmetic and algebra fearing that the students had been indoctrinated into thinking that every mathematical truth was a tautology and that every mathematical falsehood was a contradiction. A fifth welcome development is the separation of laws of logic from so-called logical truths, i.e., tautologies. Now we can teach the logical independence of the laws of excluded middle and non-contradiction without fear that students had been indoctrinated into thinking that every logical law was a tautology and that every falsehood of logic was a contradiction. This separation permits the logic teacher to apply logic in the clarification of laws of logic. This lecture expands the above points, which apply equally well in first, second, and third courses, i.e. in “critical thinking”, “deductive logic”, and “symbolic logic”.
Neuroscience has studied deductive reasoning over the last 20 years under the assumption that deductive inferences are not only de jure but also de facto distinct from other forms of inference. The objective of this research is to verify whether logically valid deductions leave any cerebral electrical trait that is distinct from the trait left by non-valid deductions. 23 subjects with an average age of 20.35 years were registered with MEG and placed into a two-condition paradigm (100 trials for each condition), each of which presented exactly the same relational complexity (same variables and content) but distinct logical complexity. Both conditions show the same electromagnetic components (P3, N4) in the early temporal window (250–525 ms) and P6 in the late temporal window (500–775 ms). The significant activity in both valid and invalid conditions is found in sensors from medial prefrontal regions, probably corresponding to the ACC or to the medial prefrontal cortex. The amplitude and intensity of valid deductions is significantly lower in both temporal windows (p = 0.0003). The reaction time was 54.37% slower in the valid condition. Validity leaves a minimal but measurable hypoactive electrical trait in brain processing. The minor electrical demand is attributable to the recursive and automatable character of valid deductions, suggesting a physical indicator of computational deductive properties. It is hypothesized that all valid deductions are recursive and hypoactive.
In the 17th century, Hobbes stated that we reason by addition and subtraction. Historians of logic note that Hobbes thought of reasoning as “a ‘species of computation’” but point out that “his writing contains in fact no attempt to work out such a project.” Though Leibniz mentions the plus/minus character of the positive and negative copulas, neither he nor Hobbes says anything about a plus/minus character of other common logical words that drive our deductive judgments, words like ‘some’, ‘all’, ‘if’, and ‘and’, each of which actually turns out to have an oppositive character that allows us, “in our silent reasoning,” to ignore its literal meaning and to reckon with it as one reckons with a plus or a minus operator in elementary algebra or arithmetic. These ‘logical constants’ of natural language figure crucially in our everyday reasoning. Because Hobbes and Leibniz did not identify them as the plus and minus words we reason with, their insight into what goes on in ‘ratiocination’ did not provide a guide for a research program that could develop a +/- logic that actually describes how we reason deductively. I will argue that such a +/- logic provides a way back from modern predicate logic (the logic of quantifiers and bound variables that is now ‘standard logic’) to an Aristotelian term logic of natural language that had been the millennial standard logic.
In this article I deal with some paradoxes and errors caused by the improper usage of logical and philosophical terms appearing in the arguments for the existence of God and in other philosophical issues. I point out first some paradoxes coming from improper usage of the propositional calculus as an instrument for the analysis of a natural language. This language actually uses not simple sentences but rather propositional functions, their logical connections, and some replacements for variables in them. We still have to deal with the so-called paradox of material implication. The second paragraph provides formal and metatheoretical criticism of Charles Sanders Peirce's theory of deduction, induction and abduction. I argue that what Peirce and his followers call abduction is actually deduction or some reasoning that cannot be described in terms of the logic used by them. Both the syllogistic and the inferential theory of abduction generate some paradoxes and contradictions. In the last paragraph some paradoxes and contradictions resulting from Jan Łukasiewicz's theory of causation are also presented. The central issue of the article is the erroneous usage of the implication: in logical paraphrases of a natural language, in descriptions of scientific reasoning, and in descriptions of causality. However, my objective is not to solve all the problems mentioned above but rather to open a discussion of them.
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names" - a special kind of propositional variables ranging over singleton sets of worlds. The obtained language ℒ_c proves to have great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ(⍯) of ℒ, where ⍯ is an additional modality with the semantics x ⊧ ⍯φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further we consider deductive systems in ℒ_c. Strong completeness of the normal ℒ_c logics is proved with respect to models in which all worlds are named. Every ℒ_c-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning transfer of properties ([in]completeness, filtration, finite model property etc.) from ℒ to ℒ_c are discussed. Finally, further perspectives for names in multimodal environments are briefly sketched.
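The two semantic clauses quoted above are simple enough to transcribe directly. The following minimal sketch (the model encoding and function names are mine, not the paper's) checks formulas of the enriched language in a finite model, and illustrates why ■ is definable from ⍯: x ⊧ ■φ iff x ⊧ φ ∧ ⍯φ.

```python
# A minimal model checker for the two modalities described above.
# Formulas are tuples; 'U' stands for the universal modality and
# 'D' for the difference-style modality.

def holds(model, world, phi):
    worlds, rel, val = model  # worlds: set; rel: set of pairs; val: atom -> set of worlds
    op = phi[0]
    if op == 'atom':                    # a "name" is just an atom whose
        return world in val[phi[1]]     # valuation is a singleton set
    if op == 'not':
        return not holds(model, world, phi[1])
    if op == 'and':
        return holds(model, world, phi[1]) and holds(model, world, phi[2])
    if op == 'box':                     # ordinary relational modality
        return all(holds(model, y, phi[1]) for (x, y) in rel if x == world)
    if op == 'U':                       # x |= U(phi)  iff  for all y: y |= phi
        return all(holds(model, y, phi[1]) for y in worlds)
    if op == 'D':                       # x |= D(phi)  iff  for all y != x: y |= phi
        return all(holds(model, y, phi[1]) for y in worlds if y != world)
    raise ValueError(f"unknown operator: {op}")

# U is definable from D: U(phi) is equivalent to (phi and D(phi)).
model = ({1, 2}, {(1, 2)}, {'p': {1, 2}, 'c': {2}})  # 'c' behaves as a name
assert holds(model, 1, ('U', ('atom', 'p')))
assert holds(model, 1, ('and', ('atom', 'p'), ('D', ('atom', 'p'))))
```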
This paper presents rules of inference for a binary quantifier I for the formalisation of sentences containing definite descriptions within intuitionist positive free logic. I binds one variable and forms a formula from two formulas. Ix[F, G] means ‘The F is G’. The system is shown to have desirable proof-theoretic properties: it is proved that deductions in it can be brought into normal form. The discussion is rounded up by comparisons between the approach to the formalisation of definite descriptions recommended here and the more usual approach that uses a term-forming operator ι, where ιxF means ‘the F’.
We introduce and study hierarchies of extensions of the propositional modal and temporal languages with pairs of new syntactic devices - point of reference and reference pointer - which enable semantic references to be made within a formula. We propose three different but equivalent semantics for the extended languages, and discuss and compare their expressiveness. The languages with reference pointers are shown to have great expressive power (especially when their frugal syntax is taken into account), perspicuous semantics, and simple deductive systems. For instance, Kamp's and Stavi's temporal operators, as well as nominals (names, clock variables), are definable in them. Universal validity in these languages is proved undecidable. The basic modal and temporal logics with reference pointers are uniformly axiomatized and a strong completeness theorem is proved for them and extended to some classes of their extensions.
Behaviorism was a peculiarly American phenomenon. As a school of psychology it was founded by John B. Watson (1878-1958) and grew into the neobehaviorisms of the 1920s, 30s and 40s. Philosophers were involved from the start, prefiguring the movement and endeavoring to define or redefine its tenets. Behaviorism expressed the naturalistic bent in American thought, which came in response to the prevailing philosophical idealism and was inspired by developments in natural science itself. There were several versions of naturalism in American philosophy, and also several behaviorisms. Most behaviorists paid homage to Darwinian functionalism; all forswore introspection and made learned changes in behavior the primary subject matter and explanatory domain of psychology. They differed in their descriptions of behavior, modes of explanation, and attitudes toward mentalistic concepts. Watson was a strict materialist who wanted to eliminate all mentalistic talk from psychology. Edward Chace Tolman (1886-1959) regarded mind as a biological function of the organism. He permitted mentalistic terms such as 'purpose' in behavioral description, and posited intervening processes that included 'representations' of the environment, while requiring that such processes be studied only as expressed in behavior. Clark L. Hull (1884-1952) developed a hypothetical-deductive version of behaviorism, akin to Tolman's functionalism in positing intervening variables but without his cognitivist constructs. B. F. Skinner (1904-90) rejected intervening variables and developed his own account of the behavior of the whole organism, based on the laws of operant conditioning. The naturalism in American philosophy of the early twentieth century showed respect for the natural sciences, especially biology and psychology. John Dewey (1896, 1911), George Santayana (1905, 1920), and F. J. E. Woodbridge (1909, 1913) expressed this attitude. It animated the neorealism of E. B. Holt and Ralph Barton Perry, who gave special attention to psychology, and the evolutionary naturalism and critical realism of Roy Wood Sellars. This naturalism differed from Watson's in regarding mind as part of nature from a Darwinian and functionalist perspective, and in treating behavior as the product of mental functioning. It fed Tolman's version of behaviorism. It was not materialistic or physical-reductionist. Only later, with Quine and logical empiricism, was behaviorism seen as essentially physicalistic.
Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.
This paper will focus on a philosophically significant construction whose semantics brings together two important notions in Kit Fine’s philosophy, the notion of truthmaking and the notion of a variable embodiment, or its extension, namely what I call a ‘variable object’. This is the construction of definite NPs like 'the number of people that can fit into the bus', 'the book John needs to write', and 'the gifted mathematician John claims to be'. Such NPs are analysed as standing for variable objects, which are part of the 'shallow', construction-driven ontology of natural language, yet are real.
The thesis of this paper is that we can justify induction deductively relative to one end, and deduction inductively relative to a different end. I will begin by presenting a contemporary variant of Hume's argument for the thesis that we cannot justify the principle of induction. Then I will criticize the responses the resulting problem of induction has received by Carnap and Goodman, as well as praise Reichenbach's approach. Some of these authors compare induction to deduction. Haack compares deduction to induction, and I will critically discuss her argument for the thesis that we cannot justify the principles of deduction next. In concluding I will defend the thesis that we can justify induction deductively relative to one end, and deduction inductively relative to a different end, and that we can do so in a non-circular way. Along the way I will show how we can understand deductive and inductive logic as normative theories, and I will briefly sketch an argument to the effect that there are only hypothetical, but no categorical imperatives.
The present article illustrates a conflict between the claim that rational belief sets are closed under deductive consequences, and a very inclusive claim about the factors that are sufficient to determine whether it is rational to believe respective propositions. Inasmuch as it is implausible to hold that the factors listed here are insufficient to determine whether it is rational to believe respective propositions, we have good reason to deny that rational belief sets are closed under deductive consequences.
In section 1, I develop epistemic communism, my view of the function of epistemically evaluative terms such as ‘rational’. The function is to support the coordination of our belief-forming rules, which in turn supports the reliable acquisition of beliefs through testimony. This view is motivated by the existence of valid inferences that we hesitate to call rational. I defend the view against the worry that it fails to account for a function of evaluations within first-personal deliberation. In the rest of the paper, I then argue, on the basis of epistemic communism, for a view about rationality itself. I set up the argument in section 2 by saying what a theory of rational deduction is supposed to do. I claim that such a theory would provide a necessary, sufficient, and explanatorily unifying condition for being a rational rule for inferring deductive consequences. I argue in section 3 that, given epistemic communism and the conventionality that it entails, there is no such theory. Nothing explains why certain rules for deductive reasoning are rational.
Current theories of grammar handle both extraction and anaphorization by introducing variables into syntactic representations. Combinatory categorial grammar eliminates variables corresponding to gaps. Using the combinator W, the paper extends this approach to anaphors, which appear to act as overt bound variables. [Slightly extended version in Bartsch et al 1989.].
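The combinator W is the duplicator, defined by W f x = f x x. A toy sketch (the example sentence and names are illustrative, not from the paper) of how it lets a reflexive pronoun be interpreted without an overt bound variable:

```python
# W duplicates an argument: W(f)(x) = f(x)(x). Applied to a curried
# transitive verb, it feeds the subject in twice, which is one way to
# see how anaphors can act as "overt bound variables" eliminable in
# combinatory terms.

def W(f):
    return lambda x: f(x)(x)

admires = lambda subject: lambda obj: f"{subject} admires {obj}"
john = "John"

print(admires(john)("Mary"))   # John admires Mary
print(W(admires)(john))        # John admires John  ('himself')
```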
We present a sound and complete Fitch-style natural deduction system for an S5 modal logic containing an actuality operator, a diagonal necessity operator, and a diagonal possibility operator. The logic is two-dimensional, where we evaluate sentences with respect to both an actual world (first dimension) and a world of evaluation (second dimension). The diagonal necessity operator behaves as a quantifier over every point on the diagonal between actual worlds and worlds of evaluation, while the diagonal possibility quantifies over some point on the diagonal. Thus, they are just like the epistemic operators for apriority and its dual. We take this extension of Fitch’s familiar derivation system to be a very natural one, since the new rules and labeled lines hereby introduced preserve the structure of Fitch’s own rules for the modal case.
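The semantics just described can be displayed compactly. The pair notation below is a standard two-dimensional presentation, assumed here for illustration rather than quoted from the paper:

```latex
% Sentences are evaluated at pairs (w, v): w an actual world (first
% dimension), v a world of evaluation (second dimension).
\begin{align*}
(w, v) &\models @\varphi &&\text{iff } (w, w) \models \varphi
    &&\text{(actuality)}\\
(w, v) &\models \blacksquare\varphi &&\text{iff for all } u:\; (u, u) \models \varphi
    &&\text{(diagonal necessity)}\\
(w, v) &\models \blacklozenge\varphi &&\text{iff for some } u:\; (u, u) \models \varphi
    &&\text{(diagonal possibility)}
\end{align*}
```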
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
The study aimed at identifying personal variables and their effect on promoting job creation in the Gaza Strip through business incubators. The researchers used the descriptive analytical approach to achieve the study objectives. The study population consisted of 92 pilot projects benefiting from the three business incubators in the Gaza Strip (Palestinian Information Technology Incubator, UCAS Technology Incubator, and Business and Technology Incubator). The study reached a number of results, the most important of which is the existence of statistically significant differences in entrepreneurship attributable to age (most entrepreneurs are between 22 and 30), gender (in favor of males), business incubator, academic qualification (in favor of information technology and engineering specialties), and years of experience. Based on the findings, the researchers recommend focusing on university students by guiding them towards entrepreneurship and helping new graduates to start entrepreneurial ventures; guiding students to scientific disciplines that help them in entrepreneurship after graduation, whether starting a small business or self-employment; and supporting females in entrepreneurship, as most of the entrepreneurs are male, in addition to stimulating males as well.
James Van Cleve has argued that Kant’s Transcendental Deduction of the categories shows, at most, that we must apply the categories to experience. And this falls short of Kant’s aim, which is to show that they must so apply. In this discussion I argue that once we have noted the differences between the first and second editions of the Deduction, this objection is less telling. But Van Cleve’s objection can help illuminate the structure of the B Deduction, and it suggests an interesting reason why the rewriting might have been thought necessary.
This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
It is tempting to think that multi-premise closure creates a special class of paradoxes having to do with the accumulation of risks, and that these paradoxes could be escaped by rejecting the principle, while still retaining single-premise closure. I argue that single-premise deduction is also susceptible to risks. I show that what I take to be the strongest argument for rejecting multi-premise closure is also an argument for rejecting single-premise closure. Because of the symmetry between the principles, they come as a package: either both will have to be rejected or both will have to be revised.
In this paper, I consider a family of three-valued regular logics: the well-known strong and weak S.C. Kleene's logics and two intermediate logics, where one was discovered by M. Fitting and the other one by E. Komendantskaya. All these systems were originally presented semantically and based on the theory of recursion. However, their proof theory is still not fully developed. Thus, natural deduction systems have been built only for strong Kleene's logic, both with one (A. Urquhart, G. Priest, A. Tamminga) and two designated values (G. Priest, B. Kooi, A. Tamminga). The purpose of this paper is to provide natural deduction systems for weak and intermediate regular logics, both with one and two designated values.
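For orientation, here is the textbook contrast between the two best-known members of the family; the encoding is mine, and the two intermediate logics (Fitting's and Komendantskaya's) are not specified here:

```python
# Truth values: 'T' (true), 'F' (false), 'N' (undefined/neither).

def strong_and(a, b):
    # Strong Kleene conjunction: a definite 'F' settles the result
    # even when the other conjunct is undefined.
    if a == 'F' or b == 'F':
        return 'F'
    if a == 'N' or b == 'N':
        return 'N'
    return 'T'

def weak_and(a, b):
    # Weak Kleene conjunction: undefinedness is contagious; any 'N'
    # input makes the whole conjunction undefined.
    if a == 'N' or b == 'N':
        return 'N'
    return 'T' if a == 'T' and b == 'T' else 'F'

# Up to symmetry, the one input pair on which the two conjunctions disagree:
assert strong_and('F', 'N') == 'F'
assert weak_and('F', 'N') == 'N'
```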
The aim of this paper is to evaluate the critical success factors and investigate the benefits that might be gained once Electronic Customer Relationship Management is implemented at an HEI, from the employee perspective. The study was conducted at Al Quds Open University in Palestine, and data were collected from 300 employees through a questionnaire consisting of four variables. A number of statistical tools were used for hypothesis testing and data analysis, including the Spearman correlation coefficient for validity, reliability correlation using Cronbach's alpha, and frequency and descriptive analysis. The overall findings of the current study show that all the features were important to staff and were critical success factors; at the same time, the websites were providing all the features discussed by the theory, and staff showed their willingness to use those features if provided. It was also discovered that implementing Electronic Customer Relationship Management can support staff retention where services were provided efficiently, though they needed to be improved. Research limitations: the survey findings were based on QOU employees in Palestine; UAE and KSA branches were not included in the study.
Harold Hodes in [1] introduces an extension of first-order modal logic featuring a backtracking operator, and provides a possible worlds semantics, according to which the operator is a kind of device for ‘world travel’; he does not provide a proof theory. In this paper, I provide a natural deduction system for modal logic featuring this operator, and argue that the system can be motivated in terms of a reading of the backtracking operator whereby it serves to indicate modal scope. I prove soundness and completeness theorems with respect to Hodes’ semantics, as well as semantics with fewer restrictions on the accessibility relation.
For deductive reasoning to be justified, it must be guaranteed to preserve truth from premises to conclusion; and for it to be useful to us, it must be capable of informing us of something. How can we capture this notion of information content, whilst respecting the fact that the content of the premises, if true, already secures the truth of the conclusion? This is the problem I address here. I begin by considering and rejecting several accounts of informational content. I then develop an account on which informational contents are indeterminate in their membership. This allows there to be cases in which it is indeterminate whether a given deduction is informative. Nevertheless, on the picture I present, there are determinate cases of informative (and determinate cases of uninformative) inferences. I argue that the model I offer is the best way for an account of content to respect the meaning of the logical constants and the inference rules associated with them without collapsing into a classical picture of content, unable to account for informative deductive inferences.
This paper describes a cubic water tank equipped with a movable partition receiving various amounts of liquid used to represent joint probability distributions. This device is applied to the investigation of deductive inferences under uncertainty. The analogy is exploited to determine by qualitative reasoning the limits in probability of the conclusion of twenty basic deductive arguments (such as Modus Ponens, And-introduction, Contraposition, etc.) often used as benchmark problems by the various theoretical approaches to reasoning under uncertainty. The probability bounds imposed by the premises on the conclusion are derived on the basis of a few trivial principles such as "a part of the tank cannot contain more liquid than its capacity allows", or "if a part is empty, the other part contains all the liquid". This stems from the equivalence between the physical constraints imposed by the capacity of the tank and its subdivisions on the volumes of liquid, and the axioms and rules of probability. The device materializes de Finetti's coherence approach to probability. It also suggests a physical counterpart of Dutch book arguments to assess individuals' rationality in probability judgments in the sense that individuals whose degrees of belief in a conclusion are out of the bounds of coherence intervals would commit themselves to executing physically impossible tasks.
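As an illustration of the kind of result the device delivers, consider the benchmark case of Modus Ponens. The bounds below are the standard coherence interval of probability logic; the function is a sketch of mine, not code from the paper:

```python
# For Modus Ponens with P(A) = a and P(B|A) = b, the premises constrain
# the conclusion to  a*b <= P(B) <= a*b + (1 - a).
# In tank terms: a*b is the liquid in the A-compartment that is also B,
# and the slack (1 - a) is the not-A compartment, which may turn out to
# be entirely B.

def modus_ponens_bounds(a, b):
    """P(A) = a, P(B|A) = b  ->  (lower, upper) coherence bounds on P(B)."""
    lower = a * b
    upper = a * b + (1 - a)
    return lower, upper

print(modus_ponens_bounds(0.9, 0.8))  # (0.72, 0.82)
```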
Quine insisted that the satisfaction of an open modalised formula by an object depends on how that object is described. Kripke's ‘objectual’ interpretation of quantified modal logic, whereby variables are rigid, is commonly thought to avoid these Quinean worries. Yet there remain residual Quinean worries for epistemic modality. Theorists have recently been toying with assignment-shifting treatments of epistemic contexts. On such views an epistemic operator ends up binding all the variables in its scope. One might worry that this yields the undesirable result that any attempt to ‘quantify in’ to an epistemic environment is blocked. If quantifying into the relevant constructions is vacuous, then such views would seem hopelessly misguided and empirically inadequate. But a famous alternative to Kripke's semantics, namely Lewis' counterpart semantics, also faces this worry since it also treats the boxes and diamonds as assignment-shifting devices. As I'll demonstrate, the mere fact that a variable is bound is no obstacle to binding it. This provides a helpful lesson for those modelling de re epistemic contexts with assignment sensitivity, and perhaps leads the way toward the proper treatment of binding in both metaphysical and epistemic contexts: Kripke for metaphysical modality, Lewis for epistemic modality.
In the transcendental deduction, the central argument of the Critique of Pure Reason, Kant seeks to secure the objective validity of our basic categories of thought. He distinguishes objective and subjective sides of this argument. The latter side, the subjective deduction, is normally understood as an investigation of our cognitive faculties. It is identified with Kant’s account of a threefold synthesis involved in our cognition of objects of experience, and it is said to precede and ground Kant’s proof of the validity of the categories in the objective deduction. I challenge this standard reading of the subjective deduction, arguing, first, that there is little textual evidence for it, and, second, that it encourages a problematic conception of how the deduction works. In its place, I present a new reading of the subjective deduction. Rather than being a broad investigation of our cognitive faculties, it should be seen as addressing a specific worry that arises in the course of the objective deduction. The latter establishes the need for a necessary connection between our capacities for thinking and being given objects, but Kant acknowledges that his readers might struggle to comprehend how these seemingly independent capacities are coordinated. Even worse, they might well believe that in asserting this necessary connection, Kant’s position amounts to an implausible subjective idealism. The subjective deduction is meant to allay these concerns by showing that they rest on a misunderstanding of the relation between these faculties. This new reading of the subjective deduction offers a better fit with Kant’s text. It also has broader implications, for it reveals the more philosophically plausible account of our relation to the world as thinkers that Kant is defending - an account that is largely obscured by the standard reading of the subjective deduction.
One of the strongest motivations for conceptualist readings of Kant is the belief that the Transcendental Deduction is incompatible with nonconceptualism. In this article, I argue that this belief is simply false: the Deduction and nonconceptualism are compatible at both an exegetical and a philosophical level. Placing particular emphasis on the case of non-human animals, I discuss in detail how and why my reading diverges from those of Ginsborg, Allais, Gomes and others. I suggest ultimately that it is only by embracing nonconceptualism that we can fully recognise the delicate calibration of the trap which the Critique sets for Hume.
Variables is a project at the intersection of the philosophies of language and logic. Frege, in the Begriffsschrift, crystalized the modern notion of formal logic through the first fully successful characterization of the behaviour of quantifiers. In Variables, I suggest that the logical tradition we have inherited from Frege is importantly flawed, and that Frege's move from treating quantifiers as noun phrases bearing word-world connection to sentential operators in the guise of second-order predicates leaves us both philosophically and technically wanting.
Fixed-rate versions of rule-consequentialism and rule-utilitarianism evaluate rules in terms of the expected net value of one particular level of social acceptance, but one far enough below 100% social acceptance to make salient the complexities created by partial compliance. Variable-rate versions of rule-consequentialism and rule-utilitarianism instead evaluate rules in terms of their expected net value at all different levels of social acceptance. Brad Hooker has advocated a fixed-rate version. Michael Ridge has argued that the variable-rate version is better. The debate continues here. Of particular interest is the difference between the implications of Hooker's and Ridge's rules about doing good for others.
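In schematic form (the notation and the weighting function are mine, not Hooker's or Ridge's exact formulations):

```latex
% v(R, r): expected net value of rule R at social-acceptance level r.
% Fixed-rate evaluates at one stipulated level r* (well below 100%);
% variable-rate weighs all levels by some distribution p(r).
\[
V_{\mathrm{fixed}}(R) = v(R, r^{*})
\qquad
V_{\mathrm{variable}}(R) = \sum_{r} p(r)\, v(R, r)
\]
```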
The definitions of ‘deduction’ found in virtually every introductory logic textbook would encourage us to believe that the inductive/deductive distinction is a distinction among kinds of arguments and that the extension of ‘deduction’ is a determinate class of arguments. In this paper, we argue that this approach is mistaken. Specifically, we defend the claim that typical definitions of ‘deduction’ operative in attempts to get at the induction/deduction distinction are either too narrow or insufficiently precise. We conclude by presenting a deflationary understanding of the inductive/deductive distinction; in our view, its content is nothing over and above the answers to two fundamental sorts of questions central to critical thinking.
Deductive inference is usually regarded as being “tautological” or “analytical”: the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of “depth” or “informativeness” of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure “intelim logic”, which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is “analytic” in a particularly strict sense, in that it rules out any use of “virtual information”, which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
Kit Fine has reawakened a puzzle about variables with a long history in analytic philosophy, labeling it “the antinomy of the variable”. Fine suggests that the antinomy demands a reconceptualization of the role of variables in mathematics, natural language semantics, and first-order logic. The difficulty arises because: (i) the variables ‘x’ and ‘y’ cannot be synonymous, since they make different contributions when they jointly occur within a sentence, but (ii) there is a strong temptation to say that distinct variables ‘x’ and ‘y’ are synonymous, since sentences differing by the total, proper substitution of ‘x’ for ‘y’ always agree in meaning. We offer a precise interpretation of the challenge posed by (i) and (ii). We then develop some neglected passages of Tarski to show that his semantics for variables has the resources to resolve the antinomy without abandoning standard compositional semantics.
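The shape of such a resolution within standard assignment semantics can be put in two lines (offered as orientation; not the authors' exact formulation):

```latex
% Relative to a fixed assignment g, distinct variables differ in
% semantic value, which underwrites (i):
\[
\mathrm{val}_g(x) = g(x) \neq g(y) = \mathrm{val}_g(y) \quad\text{for suitable } g;
\]
% but truth of a closed sentence quantifies over all assignments, so a
% total, proper substitution of one variable for another never changes
% truth conditions, which underwrites (ii):
\[
\models \exists x\,\varphi(x) \quad\text{iff}\quad \models \exists y\,\varphi(y)
\]
```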
Variable-Value axiologies propose solutions to the challenges of population ethics. These views avoid Parfit's Repugnant Conclusion, while satisfying some weak instances of the Mere Addition principle (for example, at small population sizes). We apply calibration methods to Variable-Value views while assuming: first, some very weak instances of Mere Addition, and, second, some plausible empirical assumptions about the size and welfare of the intertemporal world population. We find that Variable-Value views imply conclusions that should seem repugnant to anyone who opposes Total Utilitarianism due to the Repugnant Conclusion. So, any wish to avoid repugnant conclusions is not a good reason to choose a Variable-Value view. More broadly, these calibrations teach us something about the effort to avoid the Repugnant Conclusion. Our results join a recent literature arguing that prior efforts to avoid the Repugnant Conclusion hinge on inessential features of the formalization of repugnance. Some of this effort may therefore be misplaced.
Deductive reasoning is the kind of reasoning in which, roughly, the truth of the input propositions (the premises) logically guarantees the truth of the output proposition (the conclusion), provided that no mistake has been made in the reasoning. The premises may be propositions that the reasoner believes or assumptions that the reasoner is exploring. Deductive reasoning contrasts with inductive reasoning, the kind of reasoning in which the truth of the premises need not guarantee the truth of the conclusion.
How is moral knowledge possible? This paper defends the anti-Humean thesis that we can acquire moral knowledge by deduction from wholly non-moral premises. According to Hume’s Law, as it has become known, we cannot deduce an ‘ought’ from an ‘is’, since it is “altogether inconceivable how this new relation can be a deduction from others, which are entirely different from it” (Hume, 1739, 3.1.1). This paper explores the prospects for a deductive theory of moral knowledge that rejects Hume’s Law.
The deduction of categories in the 1781 edition of the Critique of Pure Reason (A Deduction) has “two sides”—the “objective deduction” and the “subjective deduction”. Kant seems ambivalent about the latter deduction. I treat it as a significant episode of Kant’s thinking about categories that extended from the early 1770s to around 1790. It contains his most detailed answer to the question about the origin of categories that he formulated in the 1772 letter to Marcus Herz. The answer is that categories are generated a priori through a kind of intellectual “epigenesis”. This account leaves unexplained why precisely such and such categories should be generated. While this observation caused Kant to worry about the hypothetical status of the subjective deduction in 1781, he would come to acquiesce in the recognition that the ground of the possibility of categories is itself inscrutable. I call this his “methodological skepticism”.
This essay presents deductive arguments to an introductory-level audience via a discussion of Aristotle's three types of rhetoric, the goals of and differences between deductive and non-deductive arguments, and the major features of deductive arguments (e.g., validity and soundness).
Juhani Yli-Vakkuri has argued that the Twin Earth thought experiments offered in favour of semantic externalism can be replaced by a straightforward deductive argument from premisses widely accepted by both internalists and externalists alike. The deductive argument depends, however, on premisses that, on standard formulations of internalism, cannot be satisfied by a single belief simultaneously. It does not, therefore, constitute a proof of externalism. The aim of this article is to explain why.
The purpose of this paper was to examine institutional variables and the supervision of security in secondary schools in Cross River State. The study specifically sought to determine whether there was a significant influence of school population, school type, and school location on the supervision of security in public secondary schools in Cross River State. Three null hypotheses were formulated accordingly to guide the study. 360 students and 120 teachers, a total of 480 respondents, constituted the sample for the study. The instrument used for data collection was a questionnaire, while an independent t-test was used to analyze the data and test the hypotheses at the .05 level of significance using Microsoft Excel 2013. The results revealed that school population, school type, and school location all have an influence on the supervision of security in public secondary schools of Cross River State. It was also revealed that lowly populated, mixed-gender, and urban public secondary schools were more efficient in the supervision of security than their highly populated, single-gender, and rural counterparts. Based on the findings of this study, conclusions were drawn and recommendations were made.
This work gives an extended presentation of the treatment of variable-binding operators adumbrated in [3:1993d]. Illustrative examples include elementary languages with quantifiers and lambda-equipped categorial languages. Some remarks are also offered to illustrate the philosophical import of the resulting picture. Particularly, a certain conception of logic emerges from the account: the view that logics are true theories in the model-theoretic sense, i.e. the result of selecting a certain class of models as the only “admissible” interpretation structures (for a given language).
It is commonly held that Kant ventured to derive morality from freedom in Groundwork III. It is also believed that he reversed this strategy in the second Critique, attempting to derive freedom from morality instead. In this paper, I set out to challenge these familiar assumptions: Kant’s argument in Groundwork III rests on a moral conception of the intelligible world, one that plays a similar role as the ‘fact of reason’ in the second Critique. Accordingly, I argue, there is no reversal in the proof-structure of Kant’s two works.
It is often assumed that Fichte's aim in Part I of the System of Ethics is to provide a deduction of the moral law, the very thing that Kant – after years of unsuccessful attempts – deemed impossible. On this familiar reading, what Kant eventually viewed as an underivable 'fact' (Factum), the authority of the moral law, is what Fichte traces to its highest ground in what he calls the principle of the 'I'. However, scholars have largely overlooked a passage in the System of Ethics where Fichte explicitly invokes Kant's doctrine of the fact of reason with approval, claiming that consciousness of the moral law grounds our belief in freedom (GA I/5:65). On the reading I defend, Fichte's invocation of the Factum is consistent with the structure of Part I when we distinguish (a) the feeling of moral compulsion from (b) the moral law itself.
According to non-conceptualist interpretations, Kant held that the application of concepts is not necessary for perceptual experience. Some have motivated non-conceptualism by noting the affinities between Kant's account of perception and contemporary relational theories of perception. In this paper I argue (i) that non-conceptualism cannot provide an account of the Transcendental Deduction and thus ought to be rejected; and (ii) that this has no bearing on the issue of whether Kant endorsed a relational account of perceptual experience.