The two principal models of design in methodological circles in architecture—analysis/synthesis and conjecture/analysis—have their roots in philosophy of science, in different conceptions of scientific method. This paper explores the philosophical origins of these models and the reasons for rejecting analysis/synthesis in favour of conjecture/analysis, the latter being derived from Karl Popper’s view of scientific method. I discuss a fundamental problem with Popper’s view, however, and indicate a framework by which conjecture/analysis can avoid it.
Intro to Part 1:

Most people disliked mathematics when they were at school, and they were absolutely correct to do so. This is because maths as we know it is severely incomplete. No matter how elaborate and complicated mathematical equations become, in today's world they're based on 1+1=2. This certainly conforms to the world our physical senses perceive and to the world scientific instruments detect. It has been of immeasurable value to all knowledge throughout history and has elevated science to the lofty status it enjoys. Science is now striving towards Unification, where the subatomic realm, all matter, energy, forces, space and time will be seen as entangled parts of one universe. While 1+1=2 has been vital in getting humanity to this point, it's time to suppress our attachments to the past and realize that whereas 1+1 will always equal 2, it's also capable of equalling the 1 which represents unification.

Intro to Part 2:

b) Division by zero is accepted, in Newtonian maths, to be impossible. But we can regard division by zero as division by nothing, i.e. division that has no effect. In this case, 1 divided by 0 is 1. However, to a physicist there is no such thing as nothing (even empty space contains energy). What could the something called 0 actually be? It could be a binary digit. If we use the base of ten (for simplicity) and attach one and zero to it as exponents, we get 10^1 divided by 10^0 = 10^1. If we then cancel 10 from each factor in the expression, we get 1 divided by 0 = 1. At the start of the paragraph, this was referred to as division by nothing. Then 0 was called a binary digit and division by nothing became division by something. The 1 that the division equals is the unified field of space-time. Division by 0 is impossible in Newtonian maths because the result can be infinity. But the word “infinity” can, as the last section of this book shows, apply to the unified field of spacetime. So division by zero is not impossible, because it results in the universe, which is obviously possible: a possibility that has always been, and always will be, realized.

Intro to Part 3:

If quantum entanglement had existed in the entire universe forever, everything would be everywhere and everywhen. Space, time and 5th-dimensional hyperspace would not be restricted to certain parts of the Möbius Universe but would exist in every particle. Past, present and future would not exist as the distinct periods which everyday life assumes. All instants of all periods would exist eternally, permitting time travel to any point in the past and to any point in the future. Entanglement may be created by simply zipping along at close to the speed of light (“Quantum entanglement of moving bodies” by Robert M. Gingrich and Christoph Adami, Physical Review Letters 89, 270402, issue of 30 December 2002), which might be achieved, according to this book, by warping space so it’s either a fraction of the 90 degrees allowing instantaneous travel or almost at 270 degrees to space as we know it.
Determinism is established in quantum mechanics by tracing the probabilities in the Born rules back to the absolute (overall) phase constants of the wave functions and recognizing these phase constants as pseudorandom numbers. The reduction process (collapse) is independent of measurement. It occurs when two wavepackets overlap in ordinary space and satisfy a certain criterion, which depends on the phase constants of both wavepackets. Reduction means contraction of the wavepackets to the place of overlap. The measurement apparatus fans out the incoming wavepacket into spatially separated eigenpackets of the chosen observable. When one of these eigenpackets, together with a wavepacket located in the apparatus, satisfies the criterion, the reduction associates the place of contraction with an eigenvalue of the observable. The theory is nonlocal and contextual.
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
In this paper, I propose to look closely at certain crucial aspects of the logic of Rawls' argument in Political Liberalism and related subsequent writings. Rawls' argument builds on the notion of comprehensiveness, whereby a doctrine encompasses the full spectrum of the life of its adherents. In order to show the mutual conflict and irreconcilability of comprehensive doctrines, Rawls needs to emphasise the comprehensiveness of doctrines, as their irreconcilability to a large extent emanates from that comprehensiveness. On the other hand, in order to show the possibility and plausibility of the political liberal solution he needs to emphasise that most of these doctrines are reasonable: i.e., they are willing to cede a portion of their authority to political liberalism for the right reasons. Yet, if they are willing to cede a portion of their authority to a political conception they cannot be as comprehensive as we initially thought they were. All these elements highlight the tension in the argument itself. I suggest that many of these tensions can be removed by making Rawls' account more flexible. In this context I propose certain amendments to Rawls' account, which may overcome some of the tensions mentioned above.
I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
Religious conservatives in the U.S. have frequently opposed public-health measures designed to combat STDs among minors, such as sex education, condom distribution, and HPV vaccination. Using Rawls’s method of conjecture, I will clear up what I take to be a misunderstanding on the part of religious conservatives: even if we grant their premises regarding the nature and source of sexual norms, the wide-ranging authority of parents to enforce these norms against their minor children, and the potential sexual-disinhibition effects of the above public-health measures, their opposition to at least one of these measures, HPV vaccination, cannot be justified. In fact, their comprehensive doctrines, when properly interpreted, should lead them to back this measure and thereby draw closer to a policy consensus with other citizens regarding teenage sexual health.
An isomorphism is built between the separable complex Hilbert space (quantum mechanics) and Minkowski space (special relativity) by mediation of quantum information (i.e. qubit by qubit). That isomorphism can be interpreted physically as the invariance between a reference frame within a system and its unambiguous counterpart out of the system. The same idea can be applied to Poincaré’s conjecture (proved by G. Perelman), hinting at another way of proving it, more concise and physically meaningful. Mathematically, the isomorphism means the invariance to choice, the axiom of choice, well-ordering, and the well-ordering “theorem” (or “principle”), and can be defined generally as “information invariance”.
Guessing the outcome of iterations of even the simplest arithmetical functions can be an extremely hazardous experience. No less hard, if possible at all, might be proving the veracity of even a "sure" guess concerning iterations: this is the case of the famous 3x+1 conjecture. Our purpose here is to study and conceptualize some intuitive insights related to the ultimate (un)solvability of this conjecture.
Jerry Fodor deemed informational encapsulation ‘the essence’ of a system’s modularity and argued that human perceptual processing comprises modular systems, thus construed. Nowadays, his conclusion is widely challenged. Often, this is because experimental work is seen to somehow demonstrate the cognitive penetrability of perceptual processing, where this is assumed to conflict with the informational encapsulation of perceptual systems. Here, I deny the conflict, proposing that cognitive penetration need not have any straightforward bearing on (a) the conjecture that perceptual processing is composed of nothing but informationally encapsulated modules, (b) the conjecture that each and every perceptual computation is performed by an informationally encapsulated module, and (c) the consequences perceptual encapsulation was traditionally expected to have for a perception-cognition border, the epistemology of perception and cognitive science. With these points in view, I propose that particularly plausible cases of cognitive penetration would actually seem to evince the encapsulation of perceptual systems rather than refute/problematize this conjecture.
We discuss a well-known puzzle about the lexicalization of logical operators in natural language, in particular connectives and quantifiers. Of the many logically possible functions of the relevant type, only few appear in the lexicon of natural languages: the connectives in English, for example, are only 'and', 'or', and perhaps 'nor' (expressing negated disjunction). The logically possible 'nand' (negated conjunction) is not expressed by a lexical entry of English, or of any natural language. The explanation we propose is based on the “dynamic” behaviour of connectives and quantifiers: we define update potentials for logical operators, under the assumption that the logical structure of a sentence p defines what type of update p contributes to context, together with the speech act performed (assertion or denial). We conjecture that the adequacy of update potentials determines the limits of lexicalizability for logical operators in natural language.
Some philosophers have argued that truth is a norm of judgement and have provided a variety of formulations of this general thesis. In this paper, I shall side with these philosophers and assume that truth is a norm of judgement. What I am primarily interested in here are two core questions concerning the judgement-truth norm: (i) what are the normative relationships between truth and judgement? And (ii) do these relationships vary or are they constant? I argue for a pluralist picture—what I call Normative Alethic Pluralism (NAP)—according to which (i) there is more than one correct judgement-truth norm and (ii) the normative relationships between truth and judgement vary in relation to the subject matter of the judgement. By means of a comparative analysis of disagreement in three areas of the evaluative domain—refined aesthetics, basic taste and morality—I show that there is an important variability in the normative significance of disagreement—I call this the variability conjecture. By presenting a variation of Lynch’s scope problem for alethic monism, I argue that a monistic approach to the normative function of truth is unable to vindicate the conjecture. I then argue that normative alethic pluralism provides us with a promising model to account for it.
In this paper the logic of broad necessity is explored. Definitions of what it means for one modality to be broader than another are formulated, and it is proven, in the context of higher-order logic, that there is a broadest necessity, settling one of the central questions of this investigation. It is shown, moreover, that it is possible to give a reductive analysis of this necessity in extensional language. This relates more generally to a conjecture that it is not possible to define intensional connectives from extensional notions. This conjecture is formulated precisely in higher-order logic, and concrete cases in which it fails are examined. The paper ends with a discussion of the logic of broad necessity. It is shown that the logic of broad necessity is a normal modal logic between S4 and Triv, and that it is consistent with a natural axiomatic system of higher-order logic that it is exactly S4. Some philosophical reasons to think that the logic of broad necessity does not include the S5 principle are given.
A number of Bayesians claim that, if one has no evidence relevant to a proposition P, then one's credence in P should be spread over the interval [0, 1]. Against this, I argue: first, that it is inconsistent with plausible claims about comparative levels of confidence; second, that it precludes inductive learning in certain cases. Two motivations for the view are considered and rejected. A discussion of alternatives leads to the conjecture that there is an in-principle limitation on formal representations of belief: they cannot be both fully accurate and maximally specific.
The aim of this article is to investigate the roles of commutative diagrams (CDs) in a specific mathematical domain, and to unveil the reasons underlying their effectiveness as a mathematical notation; this will be done through a case study. It will be shown that CDs do not depict spatial relations, but represent mathematical structures. CDs will be interpreted as a hybrid notation that goes beyond the traditional bipartition of mathematical representations into diagrammatic and linguistic. It will be argued that one of the reasons why CDs form a good notation is that they are highly mathematically tractable: experts can obtain valid results by ‘calculating’ with CDs. These calculations take the form of ‘diagram chases’. In order to draw inferences, experts move algebraic elements around the diagrams. It will be argued that these diagrams are dynamic. It is thanks to their dynamicity that CDs can externalize the relevant reasoning and allow experts to draw conclusions directly by manipulating them. Lastly, it will be shown that CDs play essential roles in the context of proof as well as in other phases of the mathematical enterprise, such as discovery and conjecture formation.
Set-theoretic pluralism is an increasingly influential position in the philosophy of set theory (Balaguer [1998], Linsky and Zalta [1995], Hamkins [2012]). There is considerable room for debate about how best to formulate set-theoretic pluralism, and even about whether the view is coherent. But there is widespread agreement as to what there is to recommend the view (given that it can be formulated coherently). Unlike set-theoretic universalism, set-theoretic pluralism affords an answer to Benacerraf’s epistemological challenge. The purpose of this paper is to determine what Benacerraf’s challenge could be such that this view is warranted. I argue that it could not be any of the challenges with which it has been traditionally identified by its advocates, like those of Benacerraf and Field. Not only are none of these challenges easier for the pluralist to meet; none satisfies a key constraint that has been placed on Benacerraf’s challenge. However, I argue that Benacerraf’s challenge could be the challenge to show that our set-theoretic beliefs are safe – i.e., to show that we could not have easily had false ones. Whether the pluralist is, in fact, better positioned to show that our set-theoretic beliefs are safe turns on a broadly empirical conjecture which is outstanding. If this conjecture proves to be false, then it is unclear what the epistemological argument for set-theoretic pluralism is supposed to be.
In this paper, we draw on developmental findings to provide a nuanced understanding of background emotions, particularly those in depression. We demonstrate how they reflect our basic proximity (feeling of interpersonal connectedness) to others and defend both a phenomenological and a functional claim. First, we substantiate a conjecture by Fonagy & Target (International Journal of Psychoanalysis 88(4):917–937, 2007) that an important phenomenological aspect of depression is the experiential recreation of the infantile loss of proximity to significant others. Second, we argue that proximity has a particular cognitive function that allows individuals to morph into a cohesive dyadic system able to carry out distributed emotion regulation. We show that elevated levels of psychological suffering connected to depressive background emotions may be explained not only in terms of a psychological loss, but also as the felt inability to enter into dyadic regulatory relations with others—an experiential constraint that decreases the individual’s ability to adapt to demanding situations.
We reconsider the pragmatic interpretation of intuitionistic logic [21], regarded as a logic of assertions and their justifications, and its relations with classical logic. We recall an extension of this approach to a logic dealing with assertions and obligations, related by a notion of causal implication [14, 45]. We focus on the extension to co-intuitionistic logic, seen as a logic of hypotheses [8, 9, 13], and on polarized bi-intuitionistic logic as a logic of assertions and conjectures: looking at the S4 modal translation, we give a definition of a system AHL of bi-intuitionistic logic that correctly represents the duality between intuitionistic and co-intuitionistic logic, correcting a mistake in previous work [7, 10]. A computational interpretation of co-intuitionism as a distributed calculus of coroutines is then used to give an operational interpretation of subtraction. Work on linear co-intuitionism is then recalled, a linear calculus of co-intuitionistic coroutines is defined, and a probabilistic interpretation of linear co-intuitionism is given as in [9]. We also remark that by extending the language of intuitionistic logic we can express the notion of expectation, an assertion that in all situations the truth of p is possible, and that in a logic of expectations the law of double negation holds. Similarly, extending co-intuitionistic logic, we can express the notion of conjecture that p, defined as a hypothesis that in some situation the truth of p is epistemically necessary.
This paper elaborates a new solution to the lottery paradox, according to which the paradox arises only when we lump together two distinct states of being confident that p under one general label of ‘belief that p’. The two-state conjecture is defended on the basis of some recent work on gradable adjectives. The conjecture is supported by independent considerations from the impossibility of constructing the lottery paradox both for risk-tolerating states such as being afraid, hoping or hypothesizing, and for risk-averse, certainty-like states. The new proposal is compared to views within the increasingly popular debate opposing dualists to reductionists with respect to the relation between belief and degrees of belief.
In this paper we investigate, with a case study from chemistry, under what conditions a simulation can serve as a surrogate for an experiment. The case study concerns a simulation of H2-formation in outer space. We find that in this case the simulation can act as a surrogate for an experiment, because there exists comprehensive theoretical background knowledge in the form of quantum mechanics about the range of phenomena to which the investigated process belongs, and because the particular modelling assumptions made can be justified. If these requirements are met then direct empirical validation may even be dispensable. We conjecture that this is not the case in the absence of comprehensive theoretical background knowledge.
What sort of entities are electrons, photons and atoms, given their wave-like and particle-like properties? Is nature fundamentally deterministic or probabilistic? Orthodox quantum theory (OQT) evades answering these two basic questions by being a theory about the results of performing measurements on quantum systems. But this evasion results in OQT being a seriously defective theory. A rival, somewhat ignored strategy is to conjecture that the quantum domain is fundamentally probabilistic. This means quantum entities, interacting with one another probabilistically, must differ radically from the entities of deterministic classical physics, the classical wave or particle. It becomes possible to conceive of quantum entities as a new kind of fundamentally probabilistic entity, the “propensiton”, neither wave nor particle. A fully micro-realistic, testable rival to OQT results.
Husserl’s Logical Grammar is intended to explain how complex expressions can be constructed out of simple ones so that their meaning turns out to be determined by the meanings of their constituent parts and the way they are put together. Meanings are thus understood as structured contents and classified into formal categories to the effect that the logical properties of expressions reflect their grammatical properties. As long as linguistic meaning reduces to the intentional content of pre-linguistic representations, however, it is not trivial to account for how semantics relates to syntax in this context. In this paper, I analyze Husserl’s Logical Grammar as a system of recursive rules operating on representations and suggest that the syntactic form of representations contributes to their semantics because it carries information about semantic role. I further discuss Husserl’s syntactic account of the unity of propositions and argue that, on this account, logical form supervenes on syntactic form. In the last section I draw some implications for the phenomenology of thought and conjecture that the structural features it displays are likely to convey the syntactic structures of an underlying language-like representational system.
In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom.

Consequently, this account extends Hartry Field’s program outlined in Science Without Numbers (1980), responds to David Malament’s long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer’s (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about “wave function realism.” In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation).

Along the way, I axiomatize the quantum phase structure as what I shall call a “periodic difference structure” and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure.
This paper introduces a model for evidence denial that explains this behavior as a manifestation of rationality; it is based on the contention that social values (measurable as utilities) often underwrite these sorts of responses. Moreover, it is contended that the value associated with group membership in particular can override epistemic reason when the expected utility of a belief or belief system is great. However, it appears that such unreasonable believers can still reverse this sort of dogmatism and change their beliefs in a way that is epistemically rational. The conjecture made here is that we should expect this to happen only when the expected utility of the beliefs in question dips below a threshold where the utility value of continued dogmatism and the associated group membership is no longer sufficient to motivate defusing the counter-evidence that tells against such epistemically irrational beliefs.
Samuel Scheffler defends “The Afterlife Conjecture”: the view that the continued existence of humanity after our deaths—“the afterlife”—lies in the background of our valuing; were we to lose confidence in it, many of the projects we engage in would lose their meaning. The Afterlife Conjecture, in his view, also brings out the limits of our egoism, showing that we care more about yet unborn strangers than about personal survival. But why does the afterlife itself matter to us? Examination of Scheffler’s second argument helps answer this question, thereby undermining his argument. Our concern for the afterlife involves bootstrapping: we care more about the afterlife than about personal survival precisely because the latter has such salient limits that our lives are structured by adaptation to mortality, and it is only because the afterlife does provide a measure of personal survival that it can give meaning to our projects.
In Truth and Objectivity, Crispin Wright argues that because truth is a distinctively normative property, it cannot be as metaphysically insubstantive as deflationists claim.1 This argument has been taken, together with the scope problem,2 as one of the main motivations for alethic pluralism.3 We offer a reconstruction of Wright’s Inflationary Argument (henceforth IA) aimed at highlighting what steps are required to establish its inflationary conclusion. We argue that if a certain metaphysical and epistemological view of a given subject matter is accepted, a local counterexample to IA can be constructed. We focus on the domain of basic taste and we develop two variants of a subjectivist and relativist metaphysics and epistemology that seem palatable in that domain. Although we undertake no commitment to this being the right metaphysical cum epistemological package for basic taste, we contend that if the metaphysics and the epistemology of basic taste are understood along these lines, they call for a truth property whose nature is not distinctively normative—contra what IA predicts. This result shows that the success of IA requires certain substantial metaphysical and epistemological principles and that, consequently, a proper assessment of IA cannot avoid taking a stance on the metaphysics and the epistemology of the domain where it is claimed to be successful. Although we conjecture that IA might succeed in other domains, in this paper we do not take a stand on this issue. We conclude by briefly discussing the significance of this result for the debate on alethic pluralism.
Timothy Williamson has shown that the B axiom for 'definitely' (α → Δ¬Δ¬α) guarantees that if a sentence is second-order vague in a Kripke model, it is nth order vague for every n. More recently, Anna Mahtani has argued that Williamson's epistemicist theory of vagueness does not support the B axiom, and conjectured that if we consider models in which the “radius of accessibility” varies between different points, we will be able to find sentences that are nth-order vague but (n+1)th-order precise, for any n. This paper bolsters Mahtani's argument, shows her conjecture to be true, and shows that imposing certain further natural constraints on "variable radius" models does not change the situation.
One type of deflationism about metaphysical modality suggests that it can be analysed strictly in terms of linguistic or conceptual content and that there is nothing particularly metaphysical about modality. Scott Soames is explicitly opposed to this trend. However, a detailed study of Soames’s own account of modality reveals that it has striking similarities with the deflationary account. In this paper I will compare Soames’s account of a posteriori necessities concerning natural kinds with the deflationary one, specifically Alan Sidelle’s account, and suggest that Soames’s account is vulnerable to the deflationist’s critique. Furthermore, I conjecture that both the deflationary account and Soames’s account fail to fully explicate the metaphysical content of a posteriori necessities. Although I will focus on Soames, my argument may have more general implications towards the prospects of providing a meaning-based account of metaphysical modality.
I focus on a key argument for global external world scepticism resting on the underdetermination thesis: the argument according to which we cannot know any proposition about our physical environment because sense evidence for it equally justifies some sceptical alternative (e.g. the Cartesian demon conjecture). I contend that the underdetermination argument can go through only if the controversial thesis that conceivability is per se a source of evidence for metaphysical possibility is true. I also suggest a reason to doubt that conceivability is per se a source of evidence for metaphysical possibility, and thus to doubt the underdetermination argument.
Modal logic is one of philosophy’s many children. As a mature adult it has moved out of the parental home and is nowadays straying far from its parent. But the ties are still there: philosophy is important to modal logic, modal logic is important for philosophy. Or, at least, this is a thesis we try to defend in this chapter. Limitations of space have ruled out any attempt at writing a survey of all the work going on in our field—a (...) book would be needed for that. Instead, we have tried to select material that is of interest in its own right or exemplifies noteworthy features in interesting ways. Here are some themes that have guided us throughout the writing: • The back-and-forth between philosophy and modal logic. There has been a good deal of give-and-take in the past. Carnap tried to use his modal logic to throw light on old philosophical questions, thereby inspiring others to continue his work and still others to criticise it. He certainly provoked Quine, who in his turn provided—and continues to provide—a healthy challenge to modal logicians. And Kripke’s and David Lewis’s philosophies are connected, in interesting ways, with their modal logic. Analytic philosophy would have been a lot different without modal logic! • The interpretation problem. The problem of providing a certain modal logic with an intuitive interpretation should not be conflated with the problem of providing a formal system with a model-theoretic semantics. An intuitively appealing model-theoretic semantics may be an important step towards solving the interpretation problem, but only a step. One may compare this situation with that in probability theory, where definitions of concepts like ‘outcome space’ and ‘random variable’ are orthogonal to questions about “interpretations” of the concept of probability. • The value of formalisation. Modal logic sets standards of precision, which are a challenge to—and sometimes a model for—philosophy. 
Classical philosophical questions can be sharpened and seen from a new perspective when formulated in a framework of modal logic. On the other hand, representing old questions in a formal garb has its dangers, such as simplification and distortion. • Why modal logic rather than classical (first or higher order) logic? The idioms of modal logic—today there are many!—seem better to correspond to human ways of thinking than ordinary extensional logic. (Cf. Chomsky’s conjecture that the NP + VP pattern is wired into the human brain.) In his An Essay in Modal Logic (1951) von Wright distinguished between four kinds of modalities: alethic (modes of truth: necessity, possibility and impossibility), epistemic (modes of being known: known to be true, known to be false, undecided), deontic (modes of obligation: obligatory, permitted, forbidden) and existential (modes of existence: universality, existence, emptiness). The existential modalities are not usually counted as modalities, but the other three categories are exemplified in the three sections into which this chapter is divided. Section 1 is devoted to alethic modal logic and reviews some main themes at the heart of philosophical modal logic. Sections 2 and 3 deal with topics in epistemic logic and deontic logic, respectively, and are meant to illustrate two different uses that modal logic or indeed any logic can have: it may be applied to already existing (non-logical) theory, or it can be used to develop new theory.
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He substantially extended Hilbert's program: he extended Gentzen's sequent calculus to higher-order logic, conjectured that cut-elimination holds for it (Takeuti's conjecture), and obtained several stunning results in the 1950–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese in which he expressed his philosophical thoughts. In particular, he used several keywords such as "active intuition" and "self-reflection" from Nishida's philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti's philosophy of mathematics. In particular, after briefly reviewing Takeuti's proof-theoretic results, we describe some key elements in Takeuti's texts. By explaining these texts, we point out the connection between Takeuti's proof theory and Nishida's philosophy and explain the future goals of our project.
Ned Markosian has recently defended a new theory of composition, which he calls regionalism: some material objects xx compose something if and only if there is a material object located at the fusion of the locations of xx. Markosian argues that regionalism follows from what he calls the subregion theory of parthood. Korman and Carmichael agree. We provide countermodels to show that regionalism does not follow from the subregion theory, even together with fourteen potentially implicit background principles. We then show that regionalism does follow from five of those background principles together with two additional principles connecting parthood and location. While the additional principles are not uncontroversial, our conjecture is that many will find them attractive. We conclude by noting that one of the additional principles fills a previously unnoticed gap in the formal theory of location presented in Parsons.
In the writings of Daniel Dennett and Donald Davidson we find something like the following bold conjecture: it is an a priori truth that there is no gap between our best judgements of a subject's beliefs and desires and the truth about the subject's beliefs and desires. Under ideal conditions a subject's belief-box and desire-box become transparent.
Cosmological speculation about the ultimate nature of the universe, being necessary for science to be possible at all, must be regarded as a part of scientific knowledge itself, however epistemologically unsound it may be in other respects. The best such speculation available is that the universe is comprehensible in some way or other and, more specifically, in the light of the immense apparent success of modern natural science, that it is physically comprehensible. But both these speculations may be false; in order to take this possibility into account, we need to adopt a hierarchy of increasingly contentless cosmological conjectures until we arrive at the conjecture that the universe is such that it is possible for us to acquire some knowledge of something, a conjecture which we are justified in accepting as knowledge since doing so cannot harm the pursuit of knowledge in any circumstances whatsoever. As a result of adopting such a hierarchy of increasingly contentless cosmological conjectures in this way, we maximize our chances of adopting conjectures that promote the growth of knowledge, and minimize our chances of taking some cosmological assumption for granted that is false and impedes the growth of knowledge. The hope is that as we increase our knowledge about the world we improve (lower level) cosmological assumptions implicit in our methods, and thus in turn improve our methods. As a result of improving our knowledge we improve our knowledge about how to improve knowledge. Science adapts its own nature to what it learns about the nature of the universe, thus increasing its capacity to make progress in knowledge about the world. This aim-oriented empiricist conception of science solves outstanding problems in the philosophy of science such as the problems of induction, simplicity and verisimilitude.
Bayesians often assume, suppose, or conjecture that for any reasonable explication of the notion of simplicity a prior can be designed that will enforce a preference for hypotheses simpler in just that sense. But it is shown here that there are simplicity-driven approaches to curve-fitting problems that cannot be captured within the orthodox Bayesian framework.
Most mental disorders affect only a small segment of the population. On the reasonable assumption that minds or brains are prone to occasional malfunction, these disorders do not seem to pose distinctive explanatory problems. Depression, however, because it is so prevalent and costly, poses a conundrum that some try to explain by characterizing it as an adaptation—a trait that exists because it performed fitness-enhancing functions in ancestral populations. Heretofore, proposed evolutionary explanations of depression did not focus on thought processes; instead, they emphasized that it facilitates navigation of adverse social circumstances or promotes immune response to infectious agents. According to a new hypothesis, the “analytical rumination hypothesis” (ARH), however, depression’s crucial adaptive trait is rumination—negative, intrusive thought. ARH holds that (i) social dilemmas trigger depressed mood; (ii) depressed mood induces changes in body systems that facilitate ruminative analysis aimed at solving dilemmas; and (iii) depressive rumination is a fitness-enhancing trait that was selected for in evolutionary time. Jointly, (i)–(iii) imply that we should not think of rumination as a disorder; instead, it is a trade-off, an eminently rational one. In the same way that fever solves a problem—coordination of the immune system in response to infection—so too does depressive rumination solve a problem, a social dilemma, albeit at the cost of inducing anhedonia and other maladies. But proponents of ARH argue that the cost is worthwhile, something that should be endured “until the problem is solved.” First, we argue that there are two distinct types of rumination, brooding and pondering; the former is associated with a disposition for depression, not the latter. But only the latter has the problem-solving capabilities that ARH requires.
Second, recent brain imaging studies of depression reveal resting state hypoactivity in lateral regions and hyperactivity in paralimbic regions; this asymmetric pattern correlates with heightened levels of brooding, self-focused rumination. In other words, on the personal level, patients are trapped within self, isolated from the external world and suffused with negative affect; on the subpersonal level, this pattern is reflected by an asymmetric pattern of lateral vs. paralimbic resting state activity. Third, we proceed to conjecture that rational responses (e.g., pondering) to social dilemmas are those that strike a balance between internal and external considerations in the process of belief formation. Fourth, because the asymmetric resting state activity blocks those who suffer from depression from accessing and processing potentially positive stimuli from the external world, the capacity for rational, analytic response—hence, problem-solving—is constrained. Fifth, it follows that, although there might be conditions for which suffering should be endured rather than pharmacologically alleviated, depression is not one of those. Indeed, in view of the effects of the asymmetric resting state pattern, it is unlikely that depressive rumination would have been useful even for ancestral populations.
Chapin reviewed this 1972 ZEITSCHRIFT paper that proves the completeness theorem for the logic of variable-binding-term operators created by Corcoran and his student John Herring in the 1971 LOGIQUE ET ANALYSE paper in which the theorem was conjectured. This leveraging proof extends completeness of ordinary first-order logic to the extension with vbtos. Newton da Costa independently proved the same theorem at about the same time using a Henkin-type proof. This 1972 paper builds on the 1971 “Notes on a Semantic Analysis of Variable Binding Term Operators” (co-author John Herring), Logique et Analyse 55, 646–57. MR0307874 (46 #6989). A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague 1964 proposed using vbtos to formalize definite descriptions “the x: x+x=2”, set abstracts {x: F}, minimization in recursive function theory “the least x: x+x>2”, etc. However, they gave no semantics for vbtos. Hatcher 1968 gave a semantics, but one that has flaws described in the 1971 paper and admitted by Hatcher. In 1971 we give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture, proved in this paper with Hatcher’s help, was proved independently at about the same time by Newton da Costa.
This article offers an explanation of perhaps Wittgenstein’s strangest and least intuitive thesis – the semantical mutation thesis – according to which one can never answer a mathematical conjecture because the new proof alters the very meanings of the terms involved in the original question. Instead of basing our justification on the distinction between mere calculation and proofs of isolated propositions, characteristic of Wittgenstein’s intermediary period, we generalize it to include conjectures involving effective procedures as well.
The existence of object-dependent thoughts has been doubted on the grounds that reference to such thoughts is unnecessary or 'redundant' in the psychological explanation of intentional action. This paper argues to the contrary that reference to object-dependent thoughts is necessary to the proper psychological explanation of intentional action upon objects. Section I sets out the argument for the alleged explanatory redundancy of object-dependent thoughts; an argument which turns on the coherence of an alternative 'dual-component' model of explanation. Section II rebuts this argument by showing the dual-component model to be incoherent precisely because of its exclusion of object-dependent thoughts. Section III concludes with a conjecture about the further possible significance of object-dependent thoughts for the prediction of action.
While epistemic democrats have claimed that majority rule recruits the wisdom of the crowd to identify correct answers to political problems, the conjecture remains abstract. This article illustrates how majority rule leverages the epistemic capacity of the electorate to practically enhance the instrumental value of elections. To do so, we identify a set of sufficient conditions that effect such a majority rule mechanism, even when the decision in question is multidimensional. We then look to the case of sociotropic economic voting in US presidential elections to provide empirical tractability for these conditions. We find that absent such an epistemic capacity a number of presidential elections might well have been decided differently. By generating clear conditions for the plausibility of claims made by epistemic democrats, and demonstrating their correspondence to empirical data, this article strengthens the broader instrumental grounds recommending democracy.
This paper is divided into four parts. In the first part we introduce the method of internal critique of philosophical theories by examination of their external consistency with scientific theories. In the second part, two metaphysical and one epistemological postulate of Wittgenstein's Tractatus are made explicit and formally expressed. In the third part we examine whether the Tractarian metaphysical and epistemological postulates (the independence of simple states of affairs, the unique mode of their composition, the possibility of complete empirical knowledge) are externally consistent with the theory of quantum mechanics. The result of the inquiry is negative: the Tractarian postulates ought to be revised. Relying on this result, we approach the question of the empirical character of logic in the fourth part. The description of theoretical transformations of the notion of disjunction, in its ontological, epistemological, and logical senses, is a common element in all parts of the text. The conjecture on the existence of different types of disjunctive connectives in the language of quantum mechanics concludes the paper.
Critical examination of Alchourrón and Bulygin’s set-theoretic definition of normative system shows that deductive closure is not an inevitable property. Following von Wright’s conjecture that axioms of standard deontic logic describe perfection-properties of a norm-set, a translation algorithm from the modal to the set-theoretic language is introduced. The translations reveal that the plausibility of metanormative principles rests on different grounds. Using a methodological approach that distinguishes the actor roles in a norm-governed interaction, it is shown that metanormative principles are directed second-order obligations and, in particular, that the requirement related to deductive closure is directed to the norm-applier role rather than to the norm-giver role. The approach has been applied to the case of pure derogation, yielding a new result, namely, that an independence property is a perfection-property of a norm-set in view of possible derogation. This paper polemically touches upon several points raised by Kristan in his recent paper.
Donald Davidson’s “Meaning and Truth” revolutionized our conception of how truth and meaning are related (Davidson). In that famous article, Davidson put forward the bold conjecture that meanings are satisfaction conditions, and that a Tarskian theory of truth for a language is a theory of meaning for that language. In “Meaning and Truth,” Davidson proposed only that a Tarskian truth theory is a theory of meaning. But in “Theories of Meaning and Learnable Languages,” he argued that the finite base of a Tarskian theory, together with the now familiar combinatorics, would explain how a language with unbounded expressive capacity could be learned with finite means (Davidson). This certainly seems to imply that learning a language is, in part at least, learning a Tarskian truth theory for it, or, at least, learning what is specified by such a theory. Davidson was cagey about committing to the view that meanings actually are satisfaction conditions, but subsequent followers had no such scruples. We can sum this up in a trio of claims: Davidson’s Conjecture (1) A theory of meaning for L is a truth-conditional semantics for L. (2) To know the meaning of an expression in L is to know a satisfaction condition for that expression. (3) Meanings are satisfaction conditions. For the most part, it will not matter in what follows which of these claims is at stake. I will simply take the three to be different ways of formulating what I will call Davidson’s Conjecture (or sometimes just The Conjecture). Davidson’s Conjecture was a very bold conjecture. I think we are now in a...
Crisis prevention plans are usually evaluated based on their effects in terms of preventing or limiting organizational crisis. In this survey-based study, the focus was instead on how such plans influence employees’ reactions in terms of risk perception and well-being. Five different organizations were addressed in the study. Hypothesis 1 tested the assumption that leadership crisis preparation would lead to lower perceived risk among the employees. Hypothesis 2 tested the conjecture that it would also lead to a higher degree of well-being. Both hypotheses were supported. The results and their implications are discussed.
Management involves change. The aim of this paper is to introduce a threefold classification of change with the purpose of making clear how the third type, creational change, is distinctive compared to the other two types. Four types of management situation are introduced, based on the type of change involved in the managed domain and in the management system. The role of creational change in management is discussed and a number of guidelines or suggestions relevant to this sort of management are outlined. One feature of the notion of creational change is the conjecture that such change is not amenable to scientific investigation and understanding. Once creational change has produced whatever it does produce then the product may be amenable to scientific investigation and understanding, but the actual unique and open process of its production will not be. One of the aims of this paper is to heighten our general awareness of creational change as different from other sorts of change.
Locke and Leibniz are often classified as proponents of compatibilist theories of human freedom, since both maintain that freedom is consistent with determinism and that the difference between being and not being free turns on how one is determined. However, we will argue in this paper that their versions of compatibilism are essentially different and that they have significantly distinct commitments to compatibilism. To this end, we will first analyze the definitions and examples of freedom and necessity that Locke and Leibniz present in sections 8-13 of chapter 21 of the Essay concerning Human Understanding and the Nouveaux essais respectively, and then conjecture how Locke and Leibniz would have continued the discussion, if they had had the opportunity to engage in an exchange of opinions. In this way, we believe, one will be in a position to understand why Leibniz thinks that Locke’s discussion of freedom “est un des plus prolixes et des plus subtils de son ouvrage” (“is one of the most prolix and most subtle of his work”).
This paper looks at an argument strategy for assessing the epistemic closure principle. This is the principle that says knowledge is closed under known entailment; or (roughly) if S knows p and S knows that p entails q, then S knows that q. The strategy in question looks to the individual conditions on knowledge to see if they are closed. According to one conjecture, if all the individual conditions are closed, then so too is knowledge. I give a deductive argument for this conjecture. According to a second conjecture, if one (or more) condition is not closed, then neither is knowledge. I give an inductive argument for this conjecture. In sum, I defend the strategy by defending the claim that knowledge is closed if, and only if, all the conditions on knowledge are closed. After making my case, I look at what this means for the debate over whether knowledge is closed.
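The closure principle stated roughly in this abstract can be rendered schematically; the following is a minimal sketch in standard epistemic-logic notation (the operator K_S for "S knows that" is our gloss, not notation from the paper itself):

```latex
% Epistemic closure under known entailment (schematic):
% if S knows p, and S knows that p entails q, then S knows q.
\[
  \bigl(K_S\,p \;\wedge\; K_S(p \rightarrow q)\bigr) \;\rightarrow\; K_S\,q
\]
```

On the paper's strategy, one would ask of each individual condition C on knowledge (belief, truth, justification, etc.) whether the analogous schema holds with C in place of K.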
A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague proposed using vbtos to formalize definite descriptions, set abstracts {x: F}, minimization in recursive function theory, etc. However, they gave no semantics for vbtos. Hatcher gave a semantics, but one that has flaws. We give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture was later proved independently by the authors and by Newton da Costa. The expression (vy:F) is called a variable bound term (vbt). In case F has only y free, (vy:F) has the syntactic properties of an individual constant; and under a suitable interpretation of the language (vy:F) denotes an individual. By a semantic analysis of vbtos we mean a proposal for amending the standard notions of (1) "an interpretation of a first-order language" and (2) "the denotation of a term under an interpretation and an assignment", such that (1') an interpretation of a first-order language associates a set-theoretic structure with each vbto and (2') under any interpretation and assignment each vbt denotes an individual.
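The term-formation pattern described here can be illustrated with the definite-description operator, one of the Kalish-Montague examples; the rendering below is an illustrative sketch, not notation from the paper:

```latex
% A vbto combines with a variable y and a formula F (with y free)
% to form a term (vy:F). With the description operator iota as the vbto:
\[
  \iota y\,(y + y = 2)
  \qquad \text{``the $y$ such that $y+y=2$''}
\]
% Under the intended interpretation this vbt has no free variables,
% behaves syntactically like an individual constant, and denotes 1.
```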
Karl Popper is famous for having proposed that science advances by a process of conjecture and refutation. He is also famous for defending the open society against what he saw as its arch enemies – Plato and Marx. Popper’s contributions to thought are of profound importance, but they are not the last word on the subject. They need to be improved. My concern in this book is to spell out what is of greatest importance in Popper’s work, what its failings are, how it needs to be improved to overcome these failings, and what implications emerge as a result. The book consists of a collection of essays which dramatically develop Karl Popper’s views about natural and social science, and how we should go about trying to solve social problems. Criticism of Popper’s falsificationist philosophy of natural science leads to a conception of science that I call aim-oriented empiricism. This makes explicit metaphysical theses concerning the comprehensibility and knowability of the universe that are an implicit part of scientific knowledge – implicit in the way science excludes all theories that are not explanatory, even those that are more successful empirically than accepted theories. Aim-oriented empiricism has major implications, not just for the academic discipline of philosophy of science, but for science itself. Popper generalized his philosophy of science of falsificationism to arrive at a new conception of rationality – critical rationalism – the key methodological idea of Popper’s profound critical exploration of political and social issues in his The Open Society and Its Enemies, and The Poverty of Historicism. This path of Popper, from scientific method to rationality and social and political issues is followed here, but the starting point is aim-oriented empiricism rather than falsificationism. Aim-oriented empiricism is generalized to form a conception of rationality I call aim-oriented rationalism.
This has far-reaching implications for political and social issues, for the nature of social inquiry and the humanities, and indeed for academic inquiry as a whole. The strategies for tackling social problems that arise from aim-oriented rationalism improve on Popper’s recommended strategies of piecemeal social engineering and critical rationalism, associated with Popper’s conception of the open society. This book thus sets out to develop Popper’s philosophy in new and fruitful directions. The theme of the book, in short, is to discover what can be learned from scientific progress about how to achieve social progress towards a better world.