A Bayesian measure of evidence for precise hypotheses is presented. The intention is to give a Bayesian alternative to significance tests or, equivalently, to p-values. A set is defined in the parameter space and its posterior probability, its credibility, is evaluated. This set is the “Highest Posterior Density Region” that is “tangent” to the set that defines the null hypothesis. Our measure of evidence is the complement of the credibility of the “tangent” region.
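The construction can be sketched numerically. Below is a minimal, hypothetical illustration (not taken from the paper) using a Beta posterior for a proportion and a point null H0: θ = θ0. The “tangent” HPD region collects every parameter value whose posterior density strictly exceeds the density at θ0, and the evidence in favour of H0 is one minus that region’s posterior mass.

```python
import math

def beta_logpdf(theta, a, b):
    # Log density of a Beta(a, b) distribution at theta
    return ((a - 1) * math.log(theta) + (b - 1) * math.log(1 - theta)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))

def fbst_evidence(a, b, theta0, n_grid=20_000):
    """Evidence for H0: theta = theta0 under a Beta(a, b) posterior.

    The 'tangent' set T is {theta : p(theta|x) > p(theta0|x)}; the measure
    of evidence is 1 minus the posterior probability of T, estimated here
    by a simple Riemann sum over a grid on (0, 1).
    """
    log_p0 = beta_logpdf(theta0, a, b)
    mass_tangent = 0.0
    for i in range(1, n_grid):
        theta = i / n_grid
        if beta_logpdf(theta, a, b) > log_p0:
            mass_tangent += math.exp(beta_logpdf(theta, a, b)) / n_grid
    return 1.0 - mass_tangent

# Hypothetical data: 60 successes, 40 failures, uniform prior -> Beta(61, 41)
ev_half = fbst_evidence(61, 41, 0.5)  # point null away from the posterior mode
ev_mode = fbst_evidence(61, 41, 0.6)  # point null at the posterior mode
# ev_mode is near 1 (tangent region is empty at the mode); ev_half is small
```

The grid size and the Beta example are illustrative choices; any posterior density for which the supremum over the null set can be computed would do.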
This essay develops an interpretation of Leibniz’ theory of motion that strives to integrate his metaphysics of force with his doctrine of the equivalence of hypotheses, but which also supports a realist, as opposed to a fully idealist, interpretation of his natural philosophy. Overall, the modern approaches to Leibniz’ physics that rely on a fixed spacetime backdrop, classical mechanical constructions, or absolute speed will be shown to be deficient, and a more adequate interpretation will be advanced that draws inspiration from an invariantist conception of reality and recent non-classical theories of physics.
In this article I attempt to show conclusively that the apparent intrinsic difference between causing collateral damage and directly attacking innocents is an illusion. I show how eleven morally irrelevant alterations can transform an apparently permissible case of harming as a side-effect into an apparently impermissible case of harming as a means. The alterations are as obviously irrelevant as the victims’ skin colour, and consistently treating them as relevant would have unacceptable implications for choices between more and less harmful ways of securing greater goods. This shows not only that the principles philosophers have proposed for distinguishing between these cases cannot withstand scrutiny, but also that we can be sure that there are no relevant differences yet to be discovered. I conclude by considering reasons to think that there are deontological constraints against harming, but that they apply just as forcefully against collateral harms as they do against intended harms.
Background: how the mind functions is a subject of continuing scientific discussion. A simplistic approach says that, since no convincing way has been found to model subjective experience, mind cannot exist. A second holds that, since mind cannot be described by classical physics, it must be described by quantum physics. Another perspective concerns the mind's hypothesized ability to interact with the world of quanta: it should be responsible for the reduction of quantum wave packets; physics producing 'Objective Reduction' is postulated to form the basis for mind-matter interactions. This presentation describes results derived from a new approach to these problems. It is based on well-established biology involving physics not previously applied to the fields of mind or consciousness studies: that of critical feedback instability.

Methods: 'self-organized criticality' in complexity biology places system loci of control at critical instabilities, whose physical properties, including information properties, are presented. Their elucidation shows that they can model hitherto unexplained properties of experience.

Results: all results depend on physical properties of critical instabilities. First, at least one feedback or feed-forward loop must have feedback gain g = 1: information flows round the loop impress perfect images of system states back on themselves; they represent processes of perfect self-observation. This annihilates system quanta: system excitations are instability fluctuations, which cannot be quantized. Major results follow:

1. Information vectors representing criticality states must include at least one attached information loop denoting self-observation.
2. Such loop structures are attributed a function, 'registering the state's own existence', explaining:
   a. subjective 'awareness of one's own presence';
   b. how content-free states of awareness can be remembered (Jon Shear);
   c. subjective experience of time duration (Immanuel Kant);
   d. the 'witness' property of experience, often mentioned by athletes 'in the zone';
   e. the natural association between consciousness and intelligence.

This novel, physically and biologically sound approach seems to satisfactorily model subjectivity. Further significant results follow:

1. Registration of external information in excited states of systems at criticality reduces external wave packets: the new model exhibits 'Objective Reduction' of wave packets.
2. High internal coherence (postulated by Domash & Penrose), leading to: a. non-separable information vector bundles; b. non-reductive states (Chalmers's criterion for experience).
3. Information that is: a. encoded in coherence negentropy; b. non-digitizable, and therefore c. computationally without digital equivalent (posited by Penrose).

Discussion and Conclusions: instability physics implies anharmonic motion, preventing excitation quantization, and is totally different from the quantum physics of simple harmonic motion at stability. Instability excitations are different from anything hitherto conceived in information science. They can model aspects of mind never previously treated, including genuine subjectivity, objective reduction of wave packets, and, inter alia, all the properties given above.
This paper is the first part of a three-part project, ‘How the principle of energy conservation evolved between 1842 and 1870: the view of a participant’. This paper aims to show how the new ideas of Mayer and Joule were received, what constituted the new theory in the period under study, and how it was supported experimentally. A connection was found between the new theory and thermodynamics which benefited both of them. Some considerations are offered about the desirability of taking a historical approach to teaching energy and its conservation.
In the texts of the middle years (roughly, the 1680s and 90s), Leibniz appears to endorse two incompatible approaches to motion, one a realist approach, the other a phenomenalist approach. I argue that once we attend to certain nuances in his account we can see that in fact he has only one, coherent approach to motion during this period. I conclude by considering whether the view of motion I want to impute to Leibniz during his middle years ranks as a kind of realism or rather as some kind of phenomenalism or idealism.
The open-domain Frame Problem is the problem of determining what features of an open task environment need to be updated following an action. Here we prove that the open-domain Frame Problem is equivalent to the Halting Problem and is therefore undecidable. We discuss two other open-domain problems closely related to the Frame Problem, the system identification problem and the symbol-grounding problem, and show that they are similarly undecidable. We then reformulate the Frame Problem as a quantum decision problem, and show that it is undecidable by any finite quantum computer.
This study examined practicum exercise and the attitudes of pre-service educational administrators in Cross River State. Pre-service administrators’ attitudes were assessed in the areas of self-discipline, time management, and record keeping. Three null hypotheses were formulated to give direction to the study. The study adopted a quasi-experimental research design: pre-service administrators with practicum experience formed the experimental group, while those without practicum experience formed the control group. Cluster and simple random sampling techniques were adopted in selecting 60 final-year students and 60 year-three students (or their equivalent) out of a population of 220 final-year and 208 year-three students, from the NUC and CES programmes respectively. The instrument used for data collection was the Practicum Exercise, Study Habits, and Record-Keeping Abilities Questionnaire (PESDTMARKAQ). An independent t-test was used to test the null hypotheses at the .05 level of significance, using the Microsoft Excel 2013 data analysis tool pack. The results showed that practicum exercise had no effect on pre-service administrators’ self-discipline and record-keeping attitudes, but did affect their time-management attitudes. Based on these findings, it was recommended, among others, that pre-service administrators with practicum experience should make efforts to develop their self-discipline by enacting and obeying personal policies favorable to their academic growth and progress.
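The analysis method named here, the independent-samples t-test, can be sketched in a few lines. This is a generic illustration with made-up attitude scores, not the study's data; the critical value for the decision would be read from a t table for the reported degrees of freedom.

```python
import math

def independent_t(sample_a, sample_b):
    """Pooled-variance independent-samples t statistic (equal variances assumed)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (nb - 1)
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    t = (mean_a - mean_b) / math.sqrt(pooled * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Hypothetical scores for an experimental and a control group
t_stat, df = independent_t([14, 16, 15, 18, 17], [13, 15, 14, 16, 12])
# H0 of no group difference is rejected at the .05 level (two-tailed)
# when |t_stat| exceeds the tabulated critical value for df
```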
In recent essays John Bishop proposes a model of religious faith. I argue that his so-called doxastic venture model of theistic faith is self-defeating, for the following reason: a venture suggests a process with an outcome; by definition, a venture into Christian faith denies itself an outcome, in virtue of the transcendent character of its claims, for what is claimed cannot be settled. Taking instruction from logical positivism, I stress the nonsensical character of religious claims while attacking Bishop's model. However, I wish to avail myself of this same model to describe a state of belief among certain parties which does not refer to transcendent matters, in order to show that a doxastic venture is indeed a valid description of a state of belief, and that pursuing this model throws into relief the transformative nature of belief, along with its essentially scientific status. It is my ambition to show, by turning Bishop's model against itself, that a state of religious belief is precisely logically equivalent to a condition of agnosticism. I ask whether we are justified in believing in belief.
A practical viewpoint links reality, representation, and language to calculation via the concept of the Turing (1936) machine, the mathematical model of our computers. Given the Gödel incompleteness theorems (1931) and the unsolvability of the so-called halting problem (Turing 1936; Church 1936) for a single classical Turing machine, one of the simplest hypotheses is to suggest completeness for a pair of them. That is consistent with the provability of completeness by means of two independent Peano arithmetics, discussed in Section I. Many modifications of Turing machines, including quantum ones, are surveyed in Section II with respect to the halting problem and completeness, and the model of two independent Turing machines seems to generalize them. That pair can then be postulated as the formal definition of reality, which is therefore complete, unlike either machine standalone, which remains incomplete without its complementary counterpart. Representation is formally defined as a one-to-one mapping between the two Turing machines, and the set of all such mappings can be considered as “language”, therefore including metaphors as mappings different from representation. Section III investigates that formal relation of “reality”, “representation”, and “language” modeled by (at least two) Turing machines. The independence of (two) Turing machines is interpreted by means of game theory, and especially the Nash equilibrium, in Section IV. Choice, and information as the quantity of choices, are involved. That approach seems to be equivalent to one based on set theory and the concept of actual infinity in mathematics, while allowing practical implementations.
Historically, Nelson Goodman’s paradox involving the predicates ‘grue’ and ‘bleen’ has been taken to furnish a serious blow to Carl Hempel’s theory of confirmation in particular and to purely formal theories of confirmation in general. In this paper, I argue that Goodman’s paradox is no more serious a threat to Hempel’s theory of confirmation than is Hempel’s own paradox of the ravens. I proceed by developing a suggestion from R. D. Rosenkrantz into an argument for the conclusion that these paradoxes are, in fact, equivalent. My argument, if successful, is of both historical and philosophical interest. Goodman himself maintained that Hempel’s theory of confirmation was capable of handling the paradox of the ravens. And Hempel eventually conceded that Goodman’s paradox showed that there could be no adequate, purely syntactical theory of confirmation. The conclusion of my argument entails, by contrast, that Hempel’s theory of confirmation is incapable of handling Goodman’s paradox if and only if it is incapable of handling the paradox of the ravens. It also entails that for any adequate solution to one of these paradoxes, there is a corresponding and equally adequate solution to the other.
A principle according to which any scientific theory can be mathematized is investigated. Such a theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure in a constructive way. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and the conception of “bracketing reality” is then modelled in order to generalize Peano arithmetic in its relation to set theory in the foundations of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic obtained by replacing the axiom of induction with that of transfinite induction. A comparison with Mach’s doctrine is used to reveal the fundamental and philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself, in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic described above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed. An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
The possibility of empirical test is discussed with respect to three issues: (1) What is the ontological relationship between consciousness and the brain/physical world? (2) What physical characteristics are associated with the mind/brain interface? (3) Can consciousness act on the brain independently of any brain process?
Chuang-tzu’s concept of life and death inherits and develops Confucian and Taoist thought, establishing the ontological foundation of "life-body", distinguishing the transcendental concept of the "dead heart" from the empirical concept of the "dead body", and finally proposing the thought of the "equivalence of life and death". The logical reasoning behind Chuang-tzu’s "equivalence of life and death" starts by constructing the equal status of "life" and "death" through ontological argument. Life and death are then reduced to a natural phenomenon, in order to dispel their mystery. By emphasizing the social connotations of life and death, the difference between them is removed, and finally the thought experiment of "Chuang-tzu dreaming of the butterfly" deepens the idea of the "equivalence of life and death". The ideological characteristics of Chuang-tzu’s concept of life and death are mainly reflected in the aspects of ontology, epistemology, and ethical practice.
A principle according to which any scientific theory can be mathematized is investigated; social science, the liberal arts, history, and philosophy are meant first of all. A theory of this kind is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure in a constructive way. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. The main statement is formulated as follows: any scientific theory admits an isomorphism to some mathematical structure in a constructive way. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and the conception of “bracketing reality” is then modelled in order to generalize Peano arithmetic in its relation to set theory in the foundations of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic obtained by replacing the axiom of induction with that of transfinite induction. The sketch of the proof is organized in five steps: a generalization of epoché; involving transfinite induction in the transition between Peano arithmetic and set theory; discussing the finiteness of Peano arithmetic; applying transfinite induction to Peano arithmetic; and discussing an arithmetical model of reality. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself, in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic described above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. The present paper follows a pathway grounded in Husserl’s phenomenology and “bracketing reality” to achieve the generalized arithmetic necessary for the principle to be founded in an alternative ontology, in which there is no reality external to mathematics: reality is included within mathematics. This latter mathematics is able to found itself, and can be called Hilbert mathematics in honour of Hilbert’s program for self-founding mathematics on the basis of arithmetic. The principle of universal mathematizability is consistent with Hilbert mathematics, but not with Gödel mathematics. Consequently, its validity or rejection would resolve the problem of which mathematics refers to our being; and vice versa: a choice between them, made for other reasons, would confirm or refute the principle with respect to being. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. The Schrödinger equation in quantum mechanics is involved to illustrate that ontology. Thus the problem of which of the two mathematics is more relevant to our being is discussed again, in a new way. A few directions for future work are: a rigorous formal proof of the principle as an independent axiom; the further development of an information ontology consistent with both kinds of mathematics, but much more natural for Hilbert mathematics; the development of the information interpretation of quantum mechanics as a mathematical interpretation for information ontology and thus Hilbert mathematics; and the description of consciousness in terms of information ontology.
Fisher criticised the Neyman-Pearson approach to hypothesis testing by arguing that it relies on the assumption of “repeated sampling from the same population.” The present article considers the responses to this criticism provided by Pearson and Neyman. Pearson interpreted alpha levels in relation to imaginary replications of the original test. This interpretation is appropriate when test users are sure that their replications will be equivalent to one another. However, by definition, scientific researchers do not possess sufficient knowledge about the relevant and irrelevant aspects of their tests and populations to be sure that their replications will be equivalent to one another. Pearson also interpreted the alpha level as a personal rule that guides researchers’ behavior during hypothesis testing. However, this interpretation fails to acknowledge that the same researcher may use different alpha levels in different testing situations. Addressing this problem, Neyman proposed that the average alpha level adopted by a particular researcher can be viewed as an indicator of that researcher’s typical Type I error rate. Researchers’ average alpha levels may be informative from a metascientific perspective. However, they are not useful from a scientific perspective. Scientists are more concerned with the error rates of specific tests of specific hypotheses, rather than the error rates of their colleagues. It is concluded that neither Neyman nor Pearson adequately rebutted Fisher’s “repeated sampling” criticism. Fisher’s significance testing approach is briefly considered as an alternative to the Neyman-Pearson approach.
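Pearson's long-run reading of the alpha level can be illustrated by simulation. The sketch below is a generic illustration, not from the article: when the null hypothesis is true and the imagined replications really are equivalent, a two-sided z-test run at alpha = .05 rejects in roughly 5% of repeated samples.

```python
import random

def type_i_error_rate(n_trials=4000, n=30, crit=1.96, seed=0):
    """Fraction of repeated samples in which a two-sided z-test of
    H0: mu = 0 (with sigma = 1 known) rejects, when H0 is actually true."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_trials):
        xbar = sum(rng.gauss(0.0, 1.0) for _ in range(n)) / n
        z = xbar * n ** 0.5  # the standard error of xbar is 1 / sqrt(n)
        if abs(z) > crit:
            rejections += 1
    return rejections / n_trials

rate = type_i_error_rate()
# rate comes out close to the nominal alpha of 0.05, which is exactly the
# "repeated sampling from the same population" picture Fisher objected to
```

The simulation enforces equivalence by construction (every replication draws from the same population), which is the assumption Fisher argued real researchers cannot guarantee.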
Our evidence can be about different subject matters. In fact, necessarily equivalent pieces of evidence can be about different subject matters. Does the hyperintensionality of ‘aboutness’ engender any hyperintensionality at the level of rational credence? In this paper, I present a case which seems to suggest that the answer is ‘yes’. In particular, I argue that our intuitive notions of independent evidence and inadmissible evidence are sensitive to aboutness in a hyperintensional way. We are thus left with a paradox. While there is strong reason to think that rational credence cannot make such hyperintensional distinctions, our intuitive judgements about certain cases seem to demand that it does.
We approach the topic of solution equivalence of propositional problems from the perspective of non-constructive procedural theory of problems based on Transparent Intensional Logic (TIL). The answer we put forward is that two solutions are equivalent if and only if they have equivalent solution concepts. Solution concepts can be understood as a generalization of the notion of proof objects from the Curry-Howard isomorphism.
This paper argues for a particular view about in what metaphysical equivalence consists: namely, that any two metaphysical theories are metaphysically equivalent if and only if those theories are strongly hyperintensionally equivalent. It is consistent with this characterisation that said theories are weakly hyperintensionally distinct, thus affording us the resources to model the content of propositional attitudes directed towards metaphysically equivalent theories in such a way that non-ideal agents can bear different propositional attitudes towards metaphysically equivalent theories.
Formal criteria of theoretical equivalence are mathematical mappings between specific sorts of mathematical objects, notably including those objects used in mathematical physics. Proponents of formal criteria claim that results involving these criteria have implications that extend beyond pure mathematics. For instance, they claim that formal criteria bear on the project of using our best mathematical physics as a guide to what the world is like, and also have deflationary implications for various debates in the metaphysics of physics. In this paper, I investigate whether there is a defensible view according to which formal criteria have significant non-mathematical implications, of these sorts or any other, reaching a chiefly negative verdict. Along the way, I discuss various foundational issues concerning how we use mathematical objects to describe the world when doing physics, and how this practice should inform metaphysics. I diagnose the prominence of formal criteria as stemming from contentious views on these foundational issues, and endeavor to motivate some alternative views in their stead.
We give a complete axiomatization of the identities of the basic game algebra valid with respect to the abstract game board semantics. We also show that the additional conditions of termination and determinacy of game boards do not introduce new valid identities. En route we introduce a simple translation of game terms into plain modal logic and thus translate, while preserving validity in both directions, game identities into modal formulae. The completeness proof is based on reducing game terms to a certain 'minimal canonical form' using only the axiomatic identities, and on showing that the equivalence of two minimal canonical terms can be established from these identities.
This paper introduces pragmatic hypotheses and relates this concept to the spiral of scientific evolution. Previous works determined a characterization of logically consistent statistical hypothesis tests and showed that the modal operators obtained from such tests can be represented in the hexagon of oppositions. However, despite the importance of precise hypotheses in science, they cannot be accepted by logically consistent tests. Here, we show that this dilemma can be overcome by the use of pragmatic versions of precise hypotheses. These pragmatic versions allow a level of imprecision in the hypothesis that is small relative to other experimental conditions. The introduction of pragmatic hypotheses allows the evolution of scientific theories based on statistical hypothesis testing to be interpreted using the narratological structure of hexagonal spirals, as defined by Pierre Gallais.
In this article I examine two mathematical definitions of observational equivalence, one proposed by Charlotte Werndl and based on manifest isomorphism, and the other based on Ornstein and Weiss’s ε-congruence. I argue, for two related reasons, that neither can function as a purely mathematical definition of observational equivalence. First, each definition permits of counterexamples; second, overcoming these counterexamples will introduce non-mathematical premises about the systems in question. Accordingly, the prospects for a broadly applicable and purely mathematical definition of observational equivalence are unpromising. Despite this critique, I suggest that Werndl’s proposals are valuable because they clarify the distinction between provable and unprovable elements in arguments for observational equivalence.
I investigate syntactic notions of theoretical equivalence between logical theories and a recent objection thereto. I show that this recent criticism of syntactic accounts, as extensionally inadequate, is unwarranted by developing an account which is plausibly extensionally adequate and more philosophically motivated. This is important for recent anti-exceptionalist treatments of logic since syntactic accounts require less theoretical baggage than semantic accounts.
The purpose of this essay is to shed some light on a certain type of sentence, which I call a borderline contradiction. A borderline contradiction is a sentence of the form Fa ∧ ¬Fa, for some vague predicate F and some borderline case a of F, or a sentence equivalent to such a sentence. For example, if Jackie is a borderline case of ‘rich’, then ‘Jackie is rich and Jackie isn’t rich’ is a borderline contradiction. Many theories of vague language have entailments about borderline contradictions; correctly describing the behavior of borderline contradictions is one of the many tasks facing anyone offering a theory of vague language. Here, I first briefly review claims made by various theorists about these borderline contradictions, attempting to draw out some predictions about the behavior of ordinary speakers. Second, I present an experiment intended to gather relevant data about the behavior of ordinary speakers. Finally, I discuss the experimental results in light of several different theories of vagueness, to see what explanations are available. My conclusions are necessarily tentative; I do not attempt to use the present experiment to demonstrate that any single theory is incontrovertibly true. Rather, I try to sketch the auxiliary hypotheses that would need to be conjoined to several extant theories of vague language to predict the present result, and offer some considerations regarding the plausibility of these various hypotheses. In the end, I conclude that two of the theories I consider are better positioned to account for the observed data than are the others. But the field of logically informed research on people’s actual responses to vague predicates is young; surely as more data come in we will learn a great deal more about which (if any) of these theories best accounts for the behavior of ordinary speakers.
This study examined school administrators’ counselling and conceptual skills for effective secondary school administration in Rivers State. Two research questions were answered and two hypotheses tested in the study. The population of the study was the 245 public secondary schools in Rivers State, with a teacher population of 8,196, from which 414 teachers (the equivalent of 5% of the population) were selected as the sample, using the stratified random sampling technique. The instrument for data collection was a validated 11-item instrument titled the ‘School Administrators’ Counselling and Conceptual Skills for Effective Secondary School Administration Scale’ (SACCSESSAS), with a reliability index of 0.87, designed by the researchers in a modified 4-point Likert scale format. Mean scores and standard deviations were used in answering the research questions, while z-test statistics were used in testing the hypotheses at the 0.05 level of significance. The findings of the study show that school administrators’ counselling skills enhance effective secondary school administration by presenting opportunities ranging from active listening to building rapport and empathy, and that administrators’ conceptual skills enhance school effectiveness by presenting opportunities for school members ranging from thinking creatively to understanding issues and solving problems. The study also found non-significant differences between the mean ratings of teachers with teaching and non-teaching qualifications on the ways school administrators’ counselling and conceptual skills, respectively, enhance effective secondary school administration. It was therefore concluded that school administrators’ counselling and conceptual skills are indispensable in running effective school administration, and it was recommended, as a way forward, that school administrators employ adequate counselling skills and keep conceptual skills in regular use to achieve school effectiveness.
Hypothesizing after the results are known, or HARKing, occurs when researchers check their research results and then add or remove hypotheses on the basis of those results without acknowledging this process in their research report (Kerr, 1998). In the present article, I discuss three forms of HARKing: (1) using current results to construct post hoc hypotheses that are then reported as if they were a priori hypotheses; (2) retrieving hypotheses from a post hoc literature search and reporting them as a priori hypotheses; and (3) failing to report a priori hypotheses that are unsupported by the current results. These three types of HARKing are often characterized as being bad for science and a potential cause of the current replication crisis. In the present article, I use insights from the philosophy of science to present a more nuanced view. Specifically, I identify the conditions under which each of these three types of HARKing is most and least likely to be bad for science. I conclude with a brief discussion about the ethics of each type of HARKing.
Is there some general reason to expect organisms that have beliefs to have false beliefs? And after you observe that an organism occasionally occupies a given neural state that you think encodes a perceptual belief, how do you evaluate hypotheses about the semantic content that that state has, where some of those hypotheses attribute beliefs that are sometimes false while others attribute beliefs that are always true? To address the first of these questions, we discuss evolution by natural selection and show how organisms that are risk-prone in the beliefs they form can be fitter than organisms that are risk-free. To address the second question, we discuss a problem that is widely recognized in statistics – the problem of over-fitting – and one influential device for addressing that problem, the Akaike Information Criterion (AIC). We then use AIC to solve epistemological versions of the disjunction and distality problems, which are two key problems concerning what it is for a belief state to have one semantic content rather than another.
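The role AIC plays in guarding against over-fitting comes from its defining trade-off, AIC = 2k − 2 ln L̂, where k is the number of adjustable parameters and L̂ the maximized likelihood. A minimal sketch, with hypothetical log-likelihood values chosen for illustration:

```python
def aic(max_log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln(L-hat).
    Lower is better; k penalizes extra adjustable parameters."""
    return 2 * k - 2 * max_log_likelihood

# Hypothetical comparison: a richer model fits slightly better
# (higher log-likelihood) but pays a larger complexity penalty.
simple_score = aic(-100.0, k=2)
complex_score = aic(-98.0, k=6)
preferred = "simple" if simple_score < complex_score else "complex"
```

Here the simple model scores 204 against the complex model’s 208, so AIC prefers the simpler model despite its slightly worse fit, which is exactly how it penalizes over-fitting.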
Moral skeptics maintain that we do not have moral knowledge. Traditionally they haven’t argued via skeptical hypotheses like those provided by perceptual skeptics about the external world, such as Descartes’ deceiving demon. But some believe this can be done by appealing to hypotheses like moral nihilism. Moreover, some claim that skeptical hypotheses have special force in the moral case. But I argue that skeptics have failed to specify an adequate skeptical scenario, which reveals a general lesson: such arguments are not a promising avenue for moral skeptics to take. They’re ultimately weaker when applied to morality compared to perception.
Why is there female under-representation among philosophy majors? We survey the hypotheses that have been proposed so far, grouping similar hypotheses together. We then propose a chronological taxonomy that distinguishes hypotheses according to the stage in undergraduates’ careers at which the hypotheses predict an increase in female under-representation. We then survey the empirical evidence for and against various hypotheses. We end by suggesting future avenues for research.
This paper is concerned with the relation between two notions: that of two solutions or models of a theory being related by a symmetry of the theory and that of solutions or models being physically equivalent. A number of authors have recently discussed this relation, some taking an optimistic view, on which there is a suitable concept of the symmetry of a theory relative to which these two notions coincide, others taking a pessimistic view, on which there is no such concept. The present paper arrives at a cautiously pessimistic conclusion.
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
For many years I have maintained that I learned to philosophize by translating Francisco Suárez’s Metaphysical Disputation V from Latin into English. This surely is a claim that must sound extraordinary to the members of this audience or even to most twentieth-century philosophers. Who reads Suárez these days? And what could I learn from a sixteenth-century scholastic writer that would help me in the twentieth century? I would certainly be surprised if one were to find any references to some of Suárez’s works in any of the works of major twentieth-century philosophers. One of the reasons for my claim is the great difficulty I had in figuring out what Suárez’s text means and how to render it understandable to English readers. Translating the text forced me to think in ways that were quite different from those in which I was used to thinking in Spanish, my native tongue, or English, my adoptive tongue. In fact, the translation I produced after having completed many drafts continued, and still continues to this day, to appear to me unsatisfactory, and that dissatisfaction was the key to understanding things I had understood very differently before. I hope to make clear why in what follows. The thesis that I defend is that semantic equivalence between texts of philosophy in different languages is difficult, if not impossible in some cases, to achieve and, therefore, that it is a mistake to restrict doing analytic philosophy to English, as Gustavo Rodríguez-Pereyra argues we should do in a recent article (2013).
This chapter explores the relationship between knowing-how and skill, as well as other success-in-action notions like dispositions and abilities. I offer a new view of knowledge-how which combines elements of both intellectualism and Ryleanism. According to this view, knowing how to perform an action is both a kind of knowing-that (in accord with intellectualism) and a complex multi-track dispositional state (in accord with Ryle’s view of knowing-how). I argue that this new view—what I call practical attitude intellectualism—offers an attractive set of solutions to various puzzles concerning the connections between knowing-how and abilities and skills to perform intentional actions.
It is commonly assumed by philosophers of science that logically equivalent theories are in fact theoretically equivalent. I argue that two theses, anti-exceptionalism about logic (which says, roughly, that logic is not a priori, that it is revisable, and that it is not special or set apart from other human inquiry) and logical realism (which says, roughly, that differences in logic reflect genuine metaphysical differences in the world), make trouble for both this commitment and the closely related commitment to theories being closed under logical consequence. I provide three arguments. The first two show that anti-exceptionalism about logic provides an epistemic challenge to both the closure and the equivalence claims; the third shows that logical realism provides a metaphysical challenge to both the closure and the equivalence claims. Along the way, I show that there are important methodological upshots for metaphysicians and philosophers of logic. In particular, there are lessons to be drawn about certain conceptions of naturalism as constraining the possibilities for metaphysics and the philosophy of logic.
Some theorists have recently raised doubts about much of the experimental evidence purporting to demonstrate the existence of unconscious perception. In our (2019) in this journal, we argued that some of these considerations are not decisive. Phillips (forthcoming a) replies thoughtfully to our paper, concluding that he is unconvinced by our arguments. Phillips maintains that the view that perception is invariably conscious remains, as he puts it, the “default” hypothesis both within the folk understanding and experimental study of perception. There is much to agree with in Phillips’ piece, but there remain some substantive points of disagreement, which we outline here.
Reformulating a scientific theory often leads to a significantly different way of understanding the world. Nevertheless, accounts of both theoretical equivalence and scientific understanding have neglected this important aspect of scientific theorizing. This essay provides a positive account of how reformulating theories changes our understanding. My account simultaneously addresses a serious challenge facing existing accounts of scientific understanding. These accounts have failed to characterize understanding in a way that goes beyond the epistemology of scientific explanation. By focusing on cases where we have differences in understanding without differences in explanation, I show that understanding cannot be reduced to explanation.
Theories are metaphysically equivalent just if there is no fact of the matter that could render one theory true and the other false. In this paper I argue that if we are judiciously to resolve disputes about whether theories are equivalent or not, we need to develop testable criteria that will give us epistemic access to the obtaining of the relation of metaphysical equivalence holding between those theories. I develop such ‘diagnostic’ criteria. I argue that correctly inter-translatable theories are metaphysically equivalent, and what we need are ways of determining whether a putative translation is correct or not. To that end I develop a number of tools we can employ to discern whether a translation is a correct one.
The notion of space is one of the most discussed concepts within classical physics. The works of Copernicus and Galileo, as well as Gassendi’s ideas, led Newton to regard space as a substance. This conception of space allows the notion of symmetry to be present, in an indirect or implied way, within the laws of physics, formed through the notions of equivalence and balance. The aim of this study is to identify symmetry, through such notions, in the study of the indistinguishability between the state of rest and the state of uniform translational motion known as the law of inertia. The philosophical approach is framed by the link between these notions and the problem of indistinguishability between states of different motion. We first discuss the conceptions of space of Telesio and Gassendi that led Newton to his absolutist conception of space. The conception of space and its relationship with symmetry in Newton forms the second part of our work. In the third part we present the link between Newtonian space and the notions of balance, symmetry, and equivalence. We then give our conclusions.
This article describes the Full Bayesian Significance Test for the problem of separate hypotheses. Numerical experiments are performed for the Gompertz vs. Weibull life span test.
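As defined in the first abstract above, the FBST evidence value is the complement of the credibility of the HPD region tangent to the null set, i.e. the posterior mass outside T = {θ : p(θ|x) > sup over H0 of p(θ|x)}. A minimal Monte Carlo sketch, using a toy normal posterior and sharp null rather than the paper’s Gompertz/Weibull setting:

```python
import numpy as np

def fbst_evalue(samples, log_post, log_post_sup_h0):
    """FBST e-value: 1 minus the posterior mass of the tangent set
    T = {theta : p(theta|x) > sup_{theta in H0} p(theta|x)},
    estimated from posterior samples."""
    in_tangent = log_post(samples) > log_post_sup_h0
    return 1.0 - in_tangent.mean()

# Toy posterior theta | x ~ N(1, 1); sharp null H0: theta = 0,
# so the supremum of the posterior density on H0 is its value at 0.
rng = np.random.default_rng(0)
log_post = lambda t: -0.5 * (t - 1.0) ** 2   # up to an additive constant
samples = rng.normal(1.0, 1.0, size=200_000)
ev = fbst_evalue(samples, log_post, log_post(0.0))
```

For this toy posterior the tangent set is the interval (0, 2), so the analytic e-value is 2(1 − Φ(1)) ≈ 0.317; a large e-value would indicate little evidence against the sharp null.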
This essay examines the relationship between some key elements of Peirce’s general theory of scientific inquiry (such as final causality, real possibility, methodological convergence, abductive reasoning, hypothesis formation, diagrammatic idealization) and some prominent issues discussed in the current philosophy of history, especially those pertaining to the role of generalizations in historical explanation. The claim is that, appropriately construed, Peirce’s recommendations with respect to rational inquiry in general can provide a reasonable basis for formulating a productive critical method for a responsible philosophy of history. The essay further seeks to reduce the tension between Peirce’s interest in epistemic convergence and the arguments that champion the value of historical distance and perspectival pluralism. On the account offered, the kind of methodological convergence envisioned by Peirce need not conflict necessarily with a responsibly construed historical pluralism. On the other hand, the critical perspective of an epistemically disciplined philosophical inquiry may prove indispensable in weeding out wishful but unrealistic ideological perspectives from the writing of history. Hence, the resulting proposal envisions the critique of historical imagination as one potentially viable modality for the pragmatist philosophy of history.
This paper corrects a mistake I saw students make but I have yet to see in print. The mistake is thinking that logically equivalent propositions have the same counterexamples—always. Of course, it is often the case that logically equivalent propositions have the same counterexamples: “every number that is prime is odd” has the same counterexamples as “every number that is not odd is not prime”. The set of numbers satisfying “prime but not odd” is the same as the set of numbers satisfying “not odd but not not-prime”. The mistake is thinking that every two logically-equivalent false universal propositions have the same counterexamples. Only false universal propositions have counterexamples. A counterexample for “every two logically-equivalent false universal propositions have the same counterexamples” is two logically-equivalent false universal propositions not having the same counterexamples. The following counterexample arose naturally in my sophomore deductive logic course in a discussion of inner and outer converses. “Every even number precedes every odd number” is counterexemplified only by even numbers, whereas its equivalent “Every odd number is preceded by every even number” is counterexemplified only by odd numbers. Please let me know if you see this mistake in print. Also let me know if you have seen these points discussed before. I learned them in my own course: talk about learning by teaching!
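The abstract’s central example can be checked mechanically over a finite range of numbers. The sketch below is an illustration of the point, not code from the paper; “precedes” is read as the strict less-than order:

```python
def precedes(a, b):
    return a < b

numbers = range(10)
evens = [n for n in numbers if n % 2 == 0]
odds = [n for n in numbers if n % 2 == 1]

# "Every even number precedes every odd number":
# a counterexample is an EVEN number that fails to precede some odd number.
cx_a = [e for e in evens if not all(precedes(e, o) for o in odds)]

# "Every odd number is preceded by every even number":
# a counterexample is an ODD number that some even number fails to precede.
cx_b = [o for o in odds if not all(precedes(e, o) for e in evens)]
```

Over 0–9 the first proposition is counterexemplified by {2, 4, 6, 8} and the second by {1, 3, 5, 7}: disjoint sets, even though the two propositions are logically equivalent, which is exactly the point of the paper.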
Answering a question formulated by Halbach (2009), I show that a disquotational truth theory, which takes as axioms all positive substitutions of the sentential T-schema, together with all instances of induction in the language with the truth predicate, is conservative over its syntactical base.
Those of us who take skepticism seriously typically have two relevant beliefs: (a) it’s plausible (even if false) that in order to know that I have hands I have to be able to epistemically neutralize, to some significant degree, some skeptical hypotheses, such as the brain-in-a-vat (BIV) one; and (b) it’s also plausible (even if false) that I can’t so neutralize those hypotheses. There is no reason for us to also think (c) that the BIV hypothesis, for instance, is plausible or probably true. In order to take skepticism seriously it’s sufficient to hold (a) and (b); one need not hold (c). Indeed, philosophers who accept (a) and (b) never endorse (c). Show me a philosopher who suspects that he is a brain in a vat and I’ll show you someone who is deranged! That’s one thing that bothers undergraduates in philosophy. They object: why on earth do some philosophers take the BIV hypothesis to pose any threat at all to our beliefs given that those very same philosophers think that there’s no real chance that the BIV hypothesis is true? Sure, the BIV hypothesis is formally inconsistent with my belief that I have hands, so if the former is true then my belief is false. But so what? Why should that bare inconsistency matter so much? Is this strange attitude amongst philosophers the result of some logic fetish infecting the philosophical community? It is sometimes said that the skeptical hypotheses are not only inconsistent with our beliefs but are explanatory of our experiences, which is supposed to make them more of a threat. But students aren’t fooled: although the skeptical hypotheses may attempt to explain why our experience is as it is, it’s the kind of attempt appropriate for science fiction movies that are all special effects and virtually no plot. No one with any sense of reality will take the evil demon hypothesis to be even tenuously explanatory.
The presence of symmetries in physical theories implies a pernicious form of underdetermination. In order to avoid this theoretical vice, philosophers often espouse a principle called Leibniz Equivalence, which states that symmetry-related models represent the same state of affairs. Moreover, philosophers have claimed that the existence of non-trivial symmetries motivates us to accept the Invariance Principle, which states that quantities that vary under a theory’s symmetries aren’t physically real. Leibniz Equivalence and the Invariance Principle are often seen as part of the same package. I argue that this is a mistake: Leibniz Equivalence and the Invariance Principle are orthogonal to each other. This means that it is possible to hold that symmetry-related models represent the same state of affairs whilst having a realist attitude towards variant quantities. Various arguments have been presented in favour of the Invariance Principle: a rejection of the Invariance Principle is inter alia supposed to cause indeterminism, undetectability or failure of reference. I respond that these arguments at best support Leibniz Equivalence.
In this paper I discuss the rule of inference proposed by Kuipers under the name of Inference to the Best Theory. In particular, I argue that the rule needs to be strengthened if it is to serve realist purposes. I further describe a method for testing, and perhaps eventually justifying, a suitably strengthened version of it.
Do theories of quantum mechanics and quantum gravity require spacetime to be a basic, ground level feature, or can spacetime be seen as an emergent element of these theories? While several commentators have raised serious doubts about the prospects of forgoing the standard spacetime backdrop, it will be argued that a defense of these emergent spacetime interpretations of quantum mechanics and quantum gravity hypotheses can be made, whether as an inference to the best explanation or using another strategy. Furthermore, the idea that space and time can arise from a quite different, non-spatiotemporal level of reality will be shown to have various historical precedents, especially in the seventeenth and eighteenth centuries, a realization that may help dispel some of the mystery associated with these types of hypotheses.
The History of scepticism from Erasmus to Spinoza is often called upon to support three theses: first, that Descartes had a dogmatic notion of systematic knowledge, and therefore of physics; second, that the hypothetical epistemology of physics which spread during the xviith century was the result of a general sceptical crisis; third, that this epistemology was more successful in England than in France. I reject these three theses: I point first to the tension in Descartes’ works between the ideal of a completely certain science and a physics replete with hypotheses; further, I argue that the use of hypotheses by mechanical philosophers cannot be separated from their conception of physics; finally I show that, at the end of the xviith century, physicists in France as well as in England spoke through hypotheses and I examine different ways of explaining this shared practice. Richard H. Popkin’s book serves therefore as a starting point for insights into the general problem: to what extent and for what reasons some propositions in physics have been presented as hypotheses in the xviith century?
ABSTRACT: In this paper I argue (i) that the hypothetical arguments about which the Stoic Chrysippus wrote numerous books (DL 7.196) are not to be confused with the so-called “hypothetical syllogisms” but are the same hypothetical arguments as those mentioned five times in Epictetus (e.g. Diss. 1.25.11-12); and (ii) that these hypothetical arguments are formed by replacing in a non-hypothetical argument one (or more) of the premisses by a Stoic “hypothesis” or supposition. Such “hypotheses” or suppositions differ from propositions in that they have a specific logical form and no truth-value. The reason for the introduction of a distinct class of hypothetical arguments can be found in the context of dialectical argumentation. The paper concludes with the discussion of some evidence for the use of Stoic hypothetical arguments in ancient texts.