A fundamental problem in science is how to make logical inferences from scientific data. Mere data do not suffice, since additional information is necessary to select a domain of models or hypotheses and thus determine the likelihood of each model or hypothesis. Thomas Bayes' theorem relates the data and prior information to the posterior probabilities associated with differing models or hypotheses, and is thus useful in identifying the roles played by the known data and the assumed prior information when making inferences. Scientists, philosophers, and theologians accumulate knowledge when analyzing different aspects of reality and search for particular hypotheses or models to fit their respective subject matters. Of course, a main goal is then to integrate all kinds of knowledge into an all-encompassing worldview that would describe the whole of reality. A generous description of the whole of reality would span, in order of complexity, from the purely physical to the supernatural. These two extreme aspects of reality are bridged by a nonphysical realm, which would include elements of life, man, consciousness, rationality, mental and mathematical abstractions, etc. An urgent problem in the theory of knowledge is what science is and what it is not. Albert Einstein's notion of science in terms of sense perception is refined by operationally defining the data that make up the subject matter of science. It is shown, for instance, that the theological considerations included in the prior information assumed by Isaac Newton are irrelevant in relating the data logically to the model or hypothesis. In addition, the concepts of naturalism, intelligent design, and evolutionary theory are critically analyzed. Finally, Eugene P. Wigner's suggestions concerning the nature of human consciousness, life, and the success of mathematics in the natural sciences are considered in the context of the creative power endowed in humans by God.
Medical diagnosis has traditionally been recognized as a privileged field of application for so-called probabilistic induction. Consequently, Bayes' theorem, which mathematically formalizes this form of inference, has been seen as the most adequate tool for quantifying the uncertainty surrounding a diagnosis, by providing probabilities of different diagnostic hypotheses given symptomatic or laboratory data. On the other side, it has also been remarked that differential diagnosis rather works by exclusion, e.g. by modus tollens, i.e. deductively. By drawing on a case history, this paper aims at clarifying some points on the issue. Namely: 1) medical diagnosis does not represent, strictly speaking, a form of induction, but a type of what in Peircean terms should be called 'abduction' (identifying a case as the token of a specific type); 2) in performing the single diagnostic steps, however, different inferential methods of both inductive and deductive nature are used: modus tollens, the hypothetico-deductive method, abduction; 3) Bayes' theorem is a probabilized form of abduction which uses mathematics in order to justify the degree of confidence that can be entertained in a hypothesis given the available evidence; 4) although theoretically irreconcilable, in practice both the hypothetico-deductive method and the Bayesian one are used in the same diagnosis with no serious compromise to its correctness; 5) medical diagnosis, especially differential diagnosis, also uses a kind of "probabilistic modus tollens", in which signs (symptoms or laboratory data) are taken as strong evidence that a given hypothesis is not true: the focus is not on confirming a hypothesis but on refuting it [Pr(¬H | E1, E2, …, En)]. Especially at the beginning of a complicated case, the odds are between the hypothesis that is potentially being excluded and a vague "other". This procedure has the advantage of providing a clue about what evidence to look for, and of eventually reducing the set of candidate hypotheses if conclusive negative evidence is found; 6) Bayes' theorem in the hypothesis-confirmation form can more faithfully, although idealistically, represent medical diagnosis once the diagnostic itinerary has come down to a reduced set of plausible hypotheses after a process of progressive elimination of candidates; 7) Bayes' theorem is nevertheless indispensable in cases of litigation, in order to assess a doctor's responsibility for medical error by taking into account the weight of the evidence at his disposal.
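The hypothesis-confirmation and refutation-focused readings of Bayes' theorem described above can be sketched numerically. The following minimal illustration uses purely hypothetical priors and likelihoods for three candidate diagnoses (none of the numbers are clinical); the "probabilistic modus tollens" is just the complement Pr(¬H | E) = 1 − Pr(H | E).

```python
# A minimal sketch of Bayes' theorem over a reduced set of diagnostic
# hypotheses. All priors and likelihoods are invented for illustration.

def posterior(priors, likelihoods):
    """Return Pr(H_i | E) for each hypothesis via Bayes' theorem."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(joint.values())          # Pr(E), by total probability
    return {h: j / total for h, j in joint.items()}

# Three candidate diagnoses with assumed prior probabilities ...
priors = {"H1": 0.6, "H2": 0.3, "H3": 0.1}
# ... and assumed likelihoods Pr(E | H_i) for one observed sign E.
likelihoods = {"H1": 0.05, "H2": 0.7, "H3": 0.4}

post = posterior(priors, likelihoods)
# The refutation-focused reading: Pr(not-H1 | E) is high, so the sign E
# serves as strong evidence against H1 rather than for any single rival.
print(round(1 - post["H1"], 3))
```

Note how the initially most probable hypothesis H1 is the one the evidence ends up telling against: the posterior work is done by the likelihoods, not the priors alone.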
I argue that when we use ‘probability’ language in epistemic contexts—e.g., when we ask how probable some hypothesis is, given the evidence available to us—we are talking about degrees of support, rather than degrees of belief. The epistemic probability of A given B is the mind-independent degree to which B supports A, not the degree to which someone with B as their evidence believes A, or the degree to which someone would or should believe A if they had B as their evidence. My central argument is that the degree-of-support interpretation lets us better model good reasoning in certain cases involving old evidence. Degree-of-belief interpretations make the wrong predictions not only about whether old evidence confirms new hypotheses, but about the values of the probabilities that enter into Bayes’ Theorem when we calculate the probability of hypotheses conditional on old evidence and new background information.
The paper argues that the two best-known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC), are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes’ theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies become to a reasoning that is not only informationally useful but also logically valid.
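The claim that AC is a degraded version of Bayes' theorem can be put in numbers. The sketch below, with purely illustrative probabilities, treats affirming the consequent (from "if p then q" and q, infer p) probabilistically: as Pr(q | ¬p) shrinks toward zero, the posterior Pr(p | q) approaches 1 and the "fallacy" approaches a valid inference.

```python
# Affirming the consequent: from "if p then q" and q, infer p.
# Bayes' theorem shows the inference is probabilistically strong
# whenever q is rarely true without p. All numbers are illustrative.

def prob_p_given_q(prior_p, q_given_p, q_given_not_p):
    """Pr(p | q) by Bayes' theorem."""
    num = q_given_p * prior_p
    return num / (num + q_given_not_p * (1 - prior_p))

prior_p = 0.5
q_given_p = 1.0  # the conditional "if p then q" taken as certain

# As Pr(q | not-p) shrinks, the 'fallacy' approaches a valid inference:
for q_given_not_p in (0.9, 0.5, 0.1, 0.0):
    print(q_given_not_p,
          round(prob_p_given_q(prior_p, q_given_p, q_given_not_p), 3))
```

At Pr(q | ¬p) = 0 the inference is deductively watertight, which is the limiting case the paper describes.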
Likelihoodism is the view that the degree of evidential support should be analysed and measured in terms of likelihoods alone. The paper considers and responds to a popular criticism that a likelihoodist framework is too restrictive to guide belief. First, I show that the most detailed and rigorous version of this criticism, as put forward by Gandenberger (2016), is unsuccessful. Second, I provide a positive argument that a broadly likelihoodist framework can accommodate guidance for comparative belief, even when objectively well-grounded prior probabilities are not available. As I show, the shift from non-relational to comparative probabilities opens up a new space for addressing the belief guidance problem for likelihoodism.
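The likelihoodist idea of comparative support without priors can be illustrated with the law of likelihood: evidence E favours H1 over H2 just in case Pr(E | H1) > Pr(E | H2), with the likelihood ratio measuring the degree of favouring. A minimal sketch, using an invented coin-bias example:

```python
# The law of likelihood in sketch form: no prior probabilities are
# needed to say which of two hypotheses the data favour, or by how much.
from math import comb

def binom_likelihood(k, n, p):
    """Pr(k successes in n trials | bias p), the binomial likelihood."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Observed: 8 heads in 10 tosses. Compare two hypotheses about the bias.
k, n = 8, 10
lr = binom_likelihood(k, n, 0.8) / binom_likelihood(k, n, 0.5)
print(round(lr, 2))  # likelihood ratio: the data favour p=0.8 over p=0.5
```

The comparison is strictly relational: the ratio says nothing about how probable either hypothesis is absolutely, which is exactly the restriction the belief-guidance criticism targets.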
This paper discusses the delicate relationship between traditional epistemology and the increasingly influential probabilistic (or ‘Bayesian’) approach to epistemology. The paper introduces some of the key ideas of probabilistic epistemology, including credences or degrees of belief, Bayes’ theorem, conditionalization, and the Dutch Book argument. The tension between traditional and probabilistic epistemology is brought out by considering the lottery and preface paradoxes as they relate to rational (binary) belief and credence respectively. It is then argued that this tension can be alleviated by rejecting the requirement that rational (binary) beliefs must be consistent and closed under logical entailment. Instead, it is suggested that this logical requirement applies to a different type of binary propositional attitude, viz. acceptance.
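The lottery paradox mentioned above is easy to put in numbers: in a fair 1000-ticket lottery, each proposition "ticket i loses" clears any plausible belief threshold, yet their conjunction has credence zero, so rational binary belief cannot be both threshold-governed and closed under conjunction. A minimal sketch (the threshold value is an arbitrary illustration):

```python
# The lottery paradox in numbers: high credence in each conjunct,
# zero credence in the conjunction.

n = 1000
threshold = 0.99
cred_ticket_i_loses = 1 - 1 / n   # 0.999, the same for every ticket i
cred_all_lose = 0.0               # some ticket must win, by stipulation

print(cred_ticket_i_loses >= threshold)  # each conjunct clears the bar
print(cred_all_lose >= threshold)        # their conjunction does not
```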
This is a series of lectures on formal decision theory held at the University of Bayreuth during the summer terms 2008 and 2009. It largely follows the book by Michael D. Resnik: Choices. An Introduction to Decision Theory, 5th ed., Minneapolis/London 2000, and covers the topics: decisions under ignorance and risk; probability calculus (Kolmogorov axioms, Bayes' theorem); philosophical interpretations of probability (R. von Mises, Ramsey-de Finetti); von Neumann-Morgenstern utility theory; introductory game theory; social choice theory (Sen's paradox of liberalism, Arrow's theorem).
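The first of those topics, choosing under risk by maximizing expected utility, can be sketched with a toy decision table in Resnik's acts-by-states style (all probabilities and utilities below are invented for illustration):

```python
# Expected utility over a small acts x states decision table.
# State probabilities and the utility table are purely illustrative.

state_probs = {"rain": 0.3, "shine": 0.7}
utilities = {
    "take umbrella":  {"rain": 5, "shine": 3},
    "leave umbrella": {"rain": 0, "shine": 6},
}

def expected_utility(act):
    """EU(act) = sum over states of Pr(state) * u(act, state)."""
    return sum(state_probs[s] * utilities[act][s] for s in state_probs)

best = max(utilities, key=expected_utility)
print(best, round(expected_utility(best), 2))
```

Under ignorance, by contrast, the state probabilities would be unavailable and rules like maximin would replace the expectation.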
This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and may indeed serve to correct, the informal understanding and applications of the RSA criteria concerning their conceptual dependence, their function as update-thresholds, and their status as obligatory rather than permissive norms, but also show how these formal and informal normative approaches can in fact align.
I present a solution to the epistemological or characterisation problem of induction. In part I, Bayesian Confirmation Theory (BCT) is discussed as a good contender for such a solution but with a fundamental explanatory gap (along with other well-discussed problems); useful assigned probabilities like priors require substantive degrees of belief about the world. I assert that one does not have such substantive information about the world. Consequently, an explanation is needed for how one can be licensed to act as if one has substantive information about the world when one does not. I sketch the outlines of a solution in part I, showing how it differs from others, with full details to follow in subsequent parts. The solution is pragmatic in sentiment (though it differs in specifics from arguments of, for example, William James); the conceptions we use to guide our actions are and should be at least partly determined by preferences. This is cashed out in a reformulation of decision theory motivated by a non-reductive formulation of hypotheses and logic. A distinction emerges between initial assumptions, which can be non-dogmatic, and effective assumptions, which can simultaneously be substantive. An explanation is provided for the plausibility arguments used to explain assigned probabilities in BCT.

In subsequent parts, logic is constructed from principles independent of language and mind. In particular, propositions are defined not to have form. Probabilities are logical and uniquely determined by assumptions. The problems considered fatal to logical probabilities (Goodman's 'grue' problem and the uniqueness-of-priors problem) are dissolved thanks to the particular formulation of logic used. Other problems, such as the zero-prior problem, are also solved.

A universal theory of (non-linguistic) meaning is developed. Problems with counterfactual conditionals are solved by developing concepts of abstractions and corresponding pictures that make up hypotheses. Spaces of hypotheses, and the version of Bayes' theorem that utilises them, emerge from first principles.

Theoretical virtues for hypotheses emerge from the theory. Explanatory force is explicated. The significance of effective assumptions is partly determined by combinatoric factors relating to the structure of hypotheses. I conjecture that this is the origin of simplicity.
According to a simple Bayesian argument from evil, the evil we observe is less likely given theism than given atheism, and therefore lowers the probability of theism. I consider the most common skeptical theist response to this argument, according to which our cognitive limitations make the probability of evil given theism inscrutable. I argue that if skeptical theists are right about this, then the probability of theism given evil is itself largely inscrutable, and that if this is so, we ought to be agnostic about whether God exists.
In a recent article, David Kyle Johnson has claimed to have provided a ‘refutation’ of skeptical theism. Johnson’s refutation raises several interesting issues. But in this note, I focus on only one—an implicit principle Johnson uses in his refutation to update probabilities after receiving new evidence. I argue that this principle is false. Consequently, Johnson’s refutation, as it currently stands, is undermined.
Several scholars, including Martin Hengel, R. Alan Culpepper, and Richard Bauckham, have argued that Papias had knowledge of the Gospel of John on the grounds that Papias’s prologue lists six of Jesus’s disciples in the same order that they are named in the Gospel of John: Andrew, Peter, Philip, Thomas, James, and John. In “A Note on Papias’s Knowledge of the Fourth Gospel” (JBL 129 [2010]: 793–794), Jake H. O’Connell presents a statistical analysis of this argument, according to which the probability of this correspondence occurring by chance is lower than 1%. O’Connell concludes that it is more than 99% probable that this correspondence is the result of Papias copying John, rather than chance. I show that O’Connell’s analysis contains multiple mistakes, both substantive and mathematical: it ignores relevant evidence, overstates the correspondence between John and Papias, wrongly assumes that if Papias did not know John he ordered the disciples randomly, and conflates the probability of A given B with the probability of B given A. In discussing these errors, I aim to inform both Johannine scholarship and the use of probabilistic methods in historical reasoning.
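The last error mentioned, conflating Pr(A | B) with Pr(B | A), is easy to exhibit numerically. The sketch below grants, purely for the sake of argument, the dubious assumption that an independent Papias would order the six names at random (probability 1/6! of matching John's order by chance), and shows that the posterior probability of copying still depends on the prior one brings to it:

```python
# Two different probabilities: Pr(order | chance) vs Pr(copying | order).
# The chance model and all priors below are illustrative assumptions.
from math import factorial

p_order_given_chance = 1 / factorial(6)  # 1/720, under random ordering
p_order_given_copying = 1.0              # copying guarantees the order

def p_copying_given_order(prior_copying):
    """Posterior probability of copying, given the matching order."""
    num = p_order_given_copying * prior_copying
    denom = num + p_order_given_chance * (1 - prior_copying)
    return num / denom

# A low likelihood under chance does not by itself fix the posterior:
for prior in (0.5, 0.1, 0.01):
    print(prior, round(p_copying_given_order(prior), 3))
```

Even under this charitable chance model the ">99% probable" conclusion only follows for particular priors, which is precisely the inversion the critique identifies.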
We use Bayesian tools to assess Law’s skeptical argument against the historicity of Jesus. We clarify and endorse his sub-argument for the conclusion that there is good reason to be skeptical about the miracle claims of the New Testament. However, we dispute Law’s contamination principle that he claims entails that we should be skeptical about the existence of Jesus. There are problems with Law’s defense of his principle, and we show, more importantly, that it is not supported by Bayesian considerations. Finally, we show that Law’s principle is false in the specific case of Jesus and thereby show, contrary to the main conclusion of Law’s argument, that biblical historians are entitled to remain confident that Jesus existed.
"A Little More Logical" is the perfect guide for anyone looking to improve their critical thinking and logical reasoning skills. With chapters on everything from logic basics to fallacies of weak induction to moral reasoning, this book covers all the essential concepts you need to become a more logical thinker. You'll learn about influential figures in the field of logic, such as Rudolf Carnap, Bertrand Russell, and Ada Lovelace, and how to apply your newfound knowledge to real-world situations. Whether you're looking to engage in debates with others, make better decisions in your personal and professional life, or simply want to improve your overall critical thinking skills, "A Little More Logical" has you covered. So why wait? Start learning and become a little more logical today!

"A Little More Logical" differs from typical logic textbooks in a number of ways. One key difference is its emphasis on engaging and relatable examples and case studies. Rather than simply presenting dry definitions and concepts, the book uses fables, stories, and real-world situations to illustrate key ideas and make them more relatable for readers.

Another unique aspect of "A Little More Logical" is its inclusion of "Minds that Mattered" sections, which highlight the contributions and insights of influential figures in the field of logic and critical thinking. These sections provide readers with a deeper understanding of the history and development of logical principles and offer valuable context for the concepts being discussed.

Additionally, "A Little More Logical" covers a wide range of topics beyond the basics of logic and argument evaluation. Chapters on moral reasoning, probability and inductive logic, scientific reasoning, conspiracy theories, statistical reasoning, and the history of formal logic offer a more comprehensive and well-rounded understanding of logic and critical thinking.

Overall, "A Little More Logical" stands out as a dynamic and engaging resource for anyone looking to improve their logical reasoning abilities. Its relatable examples, historical context, and broad coverage make it a valuable resource for anyone interested in mastering the principles of logic. This is a free, Creative-Commons-licensed book.
A group is often construed as one agent with its own probabilistic beliefs (credences), which are obtained by aggregating those of the individuals, for instance through averaging. In their celebrated “Groupthink”, Russell et al. (2015) require group credences to undergo Bayesian revision whenever new information is learnt, i.e., whenever individual credences undergo Bayesian revision based on this information. To obtain a fully Bayesian group, one should often extend this requirement to non-public or even private information (learnt by not all or just one individual), or to non-representable information (not representable by any event in the domain where credences are held). I propose a taxonomy of six types of ‘group Bayesianism’. They differ in the information for which Bayesian revision of group credences is required: public representable information, private representable information, public non-representable information, etc. Six corresponding theorems establish how individual credences must (not) be aggregated to ensure group Bayesianism of any type, respectively. Aggregating through standard averaging is never permitted; instead, different forms of geometric averaging must be used. One theorem—that for public representable information—is essentially Russell et al.’s central result (with minor corrections). Another theorem—that for public non-representable information—fills a gap in the theory of externally Bayesian opinion pooling.
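The contrast between standard averaging and geometric averaging can be illustrated for the public case: geometric pooling commutes with Bayesian revision (updating then pooling gives the same group credence as pooling then updating), while linear averaging does not. A minimal two-agent, two-hypothesis sketch with invented numbers:

```python
# Update-then-pool vs pool-then-update for linear and (equal-weight)
# geometric pooling. Credences and likelihoods are purely illustrative.

def bayes_update(cred, likelihoods):
    """Bayesian revision of a credence vector on shared likelihoods."""
    joint = [c * l for c, l in zip(cred, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

def linear_pool(c1, c2):
    return [(a + b) / 2 for a, b in zip(c1, c2)]

def geometric_pool(c1, c2):
    raw = [(a * b) ** 0.5 for a, b in zip(c1, c2)]  # geometric mean
    total = sum(raw)                                # then renormalize
    return [r / total for r in raw]

c1, c2 = [0.9, 0.1], [0.2, 0.8]   # two agents, two hypotheses
likelihoods = [0.3, 0.6]          # Pr(E | H_i), shared by both agents

for pool in (linear_pool, geometric_pool):
    update_then_pool = pool(bayes_update(c1, likelihoods),
                            bayes_update(c2, likelihoods))
    pool_then_update = bayes_update(pool(c1, c2), likelihoods)
    agrees = all(abs(a - b) < 1e-9
                 for a, b in zip(update_then_pool, pool_then_update))
    print(pool.__name__, agrees)
```

The renormalized geometric mean is the equal-weight case of the geometric pooling rules the theorems require; linear averaging fails the commutation test on the very first example.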
Racial profiling has come under intense public scrutiny especially since the rise of the Black Lives Matter movement. This article discusses two questions: whether racial profiling is sometimes rational, and whether it can be morally permissible. It is argued that under certain circumstances the affirmative answer to both questions is justified.
In the past couple of years there has been an increase in the rate at which children play computer or video games, compared with their use of the e-learning content introduced for their safety, and the impact of video-game addiction ranges from musculoskeletal issues to vision problems and obesity. This paper introduces an intelligent tutoring system for both parents and their children that enhances the gaming experience and teaches them about these health problems and how to solve them, through an easy user interface, so that children can be happy and excited about the information and their health.
Despite their popularity, relatively scant attention has been paid to the upshot of Bayesian and predictive processing models of cognition for views of overall cognitive architecture. Many of these models are hierarchical; they posit generative models at multiple distinct "levels," whose job is to predict the consequences of sensory input at lower levels. I articulate one possible position that could be implied by these models, namely, that there is a continuous hierarchy of perception, cognition, and action control comprising levels of generative models. I argue that this view is not entailed by a general Bayesian/predictive processing outlook. Bayesian approaches are compatible with distinct formats of mental representation. Focusing on Bayesian approaches to motor control, I argue that the junctures between different types of mental representation are places where the transitivity of hierarchical prediction may be broken, and I consider the upshot of this conclusion for broader discussions of cognitive architecture.
Representation theorems are often taken to provide the foundations for decision theory. First, they are taken to characterize degrees of belief and utilities. Second, they are taken to justify two fundamental rules of rationality: that we should have probabilistic degrees of belief and that we should act as expected utility maximizers. We argue that representation theorems cannot serve either of these foundational purposes, and that recent attempts to defend the foundational importance of representation theorems are unsuccessful. As a result, we should reject these claims, and lay the foundations of decision theory on firmer ground.
Jury theorems are mathematical theorems about the ability of collectives to make correct decisions. Several jury theorems carry the optimistic message that, in suitable circumstances, ‘crowds are wise’: many individuals together (using, for instance, majority voting) tend to make good decisions, outperforming fewer or just one individual. Jury theorems form the technical core of epistemic arguments for democracy, and provide probabilistic tools for reasoning about the epistemic quality of collective decisions. The popularity of jury theorems spans across various disciplines such as economics, political science, philosophy, and computer science. This entry reviews and critically assesses a variety of jury theorems. It first discusses Condorcet's initial jury theorem, and then progressively introduces jury theorems with more appropriate premises and conclusions. It explains the philosophical foundations, and relates jury theorems to diversity, deliberation, shared evidence, shared perspectives, and other phenomena. It finally connects jury theorems to their historical background and to democratic theory, social epistemology, and social choice theory. (shrink)
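Condorcet's initial jury theorem can be computed exactly from the binomial distribution: with independent voters of equal competence p > 1/2, the probability that a majority votes correctly increases with group size. A minimal sketch (the competence value 0.6 is an arbitrary illustration, and odd group sizes avoid ties):

```python
# Condorcet's classic jury theorem, computed exactly: the probability
# that a strict majority of n independent voters (competence p) is right.
from math import comb

def majority_correct(n, p):
    """Pr(more than n/2 of n independent voters are correct)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 3))
```

The growth with n is exactly the asymptotic 'infallibility' conclusion the critical literature traces back to the independence and equal-competence premises.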
Plagiarism detection is the process of finding similarities between electronic documents. This process is increasingly needed because of the large number of documents available on the internet and the ease of copying and pasting text from relevant documents with simple Ctrl+C and Ctrl+V commands. The proposed solution is to investigate and develop an easy, fast, multi-language plagiarism detector that checks a document with a single click. This is done with the support of an intelligent system that can learn, change, and adapt to the input document, perform a fast cross-search of its content against both a local repository and an online repository, and link the content of the file with matching content wherever it is found. The supported document types are Word, plain text and, in some cases, PDF files (where the text can be extracted from them); this is made possible by using the DLL file that Microsoft's Word application provides on the OS. Using the DLL frees us from the details of extracting text from files, lets us use the file in our Delphi project, and allows our method to read the file word by word to guarantee the best working scenarios for the calculation. As a result, this process helps raise document quality, enhances the writer's experience of his work, and protects the copyright of the original author by providing a free, easy and fast alternative plagiarism-detection tool for the institutions concerned.
We give a review and critique of jury theorems from a social-epistemology perspective, covering Condorcet’s (1785) classic theorem and several later refinements and departures. We assess the plausibility of the conclusions and premises featuring in jury theorems and evaluate the potential of such theorems to serve as formal arguments for the ‘wisdom of crowds’. In particular, we argue (i) that there is a fundamental tension between voters’ independence and voters’ competence, hence between the two premises of most jury theorems; (ii) that the (asymptotic) conclusion that ‘huge groups are infallible’, reached by many jury theorems, is an artifact of unjustified premises; and (iii) that the (nonasymptotic) conclusion that ‘larger groups are more reliable’, also reached by many jury theorems, is not an artifact and should be regarded as the more adequate formal rendition of the ‘wisdom of crowds’.
If the national economy is a living organism, then the financial system is the mechanism that creates, supplies, and circulates blood to every cell and organ. Either a shortage or a surplus gives rise to problems that must be solved. Given Vietnam's strong process of economic transition and international integration, it is necessary to continuously monitor the market and predict its signals and fluctuations accurately and in good time, so as to build sensible policies for regulating the financial market and the whole economy, thereby ensuring sustainable growth in the right direction. In Vietnam's financial system there are currently seven warning signs that deserve the attention of researchers and policy makers.
The text is a continuation of the article of the same name published in the previous issue of Philosophical Alternatives. The philosophical interpretations of the Kochen-Specker theorem (1967) are considered. Einstein's principle regarding the "consubstantiality of inertia and gravity" (1918) allows of a parallel between descriptions of a physical micro-entity in relation to the macro-apparatus on the one hand, and of physical macro-entities in relation to the astronomical mega-entities on the other. The Bohmian interpretation (1952) of quantum mechanics proposes that all quantum systems be interpreted as dissipative ones and that the theorem be thus understood. The conclusion is that the continual representation of a system, by force or (gravitational) field between parts interacting by means of it, is equivalent to the mutual entanglement of those parts if the representation is discrete. Gravity (force field) and entanglement are two different images, correspondingly continual and discrete, of a single common essence. General relativity can be interpreted as a superluminal generalization of special relativity. There exists in science and philosophy a postulate of an alleged obligatory difference between a model and reality. It can also be deduced by interpreting a corollary of the theorem. On the other hand, quantum mechanics, on the basis of this theorem and of von Neumann's (1932), introduces the option that a model be entirely identified with the modeled reality and, therefore, that absolute reality be recognized: this is a non-standard hypothesis in the epistemology of science. Thus, true reality begins to be understood mathematically, i.e. in a Pythagorean manner, through its identification with its mathematical model. A few linked problems are highlighted: the role of the axiom of choice for correctly interpreting the theorem; whether the theorem can be considered an axiom; whether the theorem can be considered equivalent to the negation of the axiom.
Non-commuting quantities and hidden parameters – Wave-corpuscular dualism and hidden parameters – Local or nonlocal hidden parameters – Phase space in quantum mechanics – Weyl, Wigner, and Moyal – Von Neumann’s theorem about the absence of hidden parameters in quantum mechanics and Hermann – Bell’s objection – Quantum-mechanical and mathematical incommeasurability – Kochen – Specker’s idea about their equivalence – The notion of partial algebra – Embeddability of a qubit into a bit – Quantum computer is not Turing machine (...) – Is continuality universal? – Diffeomorphism and velocity – Einstein’s general principle of relativity – „Mach’s principle“ – The Skolemian relativity of the discrete and the continuous – The counterexample in § 6 of their paper – About the classical tautology which is untrue being replaced by the statements about commeasurable quantum-mechanical quantities – Logical hidden parameters – The undecidability of the hypothesis about hidden parameters – Wigner’s work and и Weyl’s previous one – Lie groups, representations, and psi-function – From a qualitative to a quantitative expression of relativity − psi-function, or the discrete by the random – Bartlett’s approach − psi-function as the characteristic function of random quantity – Discrete and/ or continual description – Quantity and its “digitalized projection“ – The idea of „velocity−probability“ – The notion of probability and the light speed postulate – Generalized probability and its physical interpretation – A quantum description of macro-world – The period of the as-sociated de Broglie wave and the length of now – Causality equivalently replaced by chance – The philosophy of quantum information and religion – Einstein’s thesis about “the consubstantiality of inertia ant weight“ – Again about the interpretation of complex velocity – The speed of time – Newton’s law of inertia and Lagrange’s formulation of mechanics – Force and effect – The theory of tachyons and general relativity – 
Riesz’s representation theorem – The notion of covariant world line – Encoding a world line by psi-function – Spacetime and qubit − psi-function by qubits – About the physical interpretation of both the complex axes of a qubit – The interpretation of the self-adjoint operators components – The world line of an arbitrary quantity – The invariance of the physical laws towards quantum object and apparatus – Hilbert space and that of Minkowski – The relationship between the coefficients of -function and the qubits – World line = psi-function + self-adjoint operator – Reality and description – Does a „curved“ Hilbert space exist? – The axiom of choice, or when is possible a flattening of Hilbert space? – But why not to flatten also pseudo-Riemannian space? – The commutator of conjugate quantities – Relative mass – The strokes of self-movement and its philosophical interpretation – The self-perfection of the universe – The generalization of quantity in quantum physics – An analogy of the Feynman formalism – Feynman and many-world interpretation – The psi-function of various objects – Countable and uncountable basis – Generalized continuum and arithmetization – Field and entanglement – Function as coding – The idea of „curved“ Descartes product – The environment of a function – Another view to the notion of velocity-probability – Reality and description – Hilbert space as a model both of object and description – The notion of holistic logic – Physical quantity as the information about it – Cross-temporal correlations – The forecasting of future – Description in separable and inseparable Hilbert space – „Forces“ or „miracles“ – Velocity or time – The notion of non-finite set – Dasein or Dazeit – The trajectory of the whole – Ontological and onto-theological difference – An analogy of the Feynman and many-world interpretation − psi-function as physical quantity – Things in the world and instances in time – The generation of the physi-cal by mathematical – The generalized 
notion of observer – Subjective or objective probability – Energy as the change of probability per unit of time – The generalized principle of least action from a new viewpoint – The exception of two dimensions and Fermat’s last theorem.
Any intermediate propositional logic can be extended to a calculus with epsilon- and tau-operators and critical formulas. For classical logic, this results in Hilbert’s $\varepsilon$-calculus. The first and second $\varepsilon$-theorems for classical logic establish conservativity of the $\varepsilon$-calculus over its classical base logic. It is well known that the second $\varepsilon$-theorem fails for the intuitionistic $\varepsilon$-calculus, as prenexation is impossible. The paper investigates the effect of adding critical $\varepsilon$- and $\tau$-formulas and using the translation of quantifiers into $\varepsilon$- and $\tau$-terms to intermediate logics. It is shown that conservativity over the propositional base logic also holds for such intermediate ${\varepsilon\tau}$-calculi. The “extended” first $\varepsilon$-theorem holds if the base logic is finite-valued Gödel–Dummett logic, and fails otherwise, but holds for certain provable formulas in infinite-valued Gödel logic. The second $\varepsilon$-theorem also holds for finite-valued first-order Gödel logics. The methods used to prove the extended first $\varepsilon$-theorem for infinite-valued Gödel logic suggest applications to theories of arithmetic.
The problem addressed in this paper is “the main epistemic problem concerning science”, viz. “the explication of how we compare and evaluate theories [...] in the light of the available evidence” (van Fraassen, BC, 1983, Theory comparison and relevant evidence. In J. Earman (Ed.), Testing scientific theories (pp. 27–42). Minneapolis: University of Minnesota Press). Sections 1–3 contain the general plausibility-informativeness theory of theory assessment. In a nutshell, the message is (1) that there are two values a theory should exhibit: truth and informativeness—measured respectively by a truth indicator and a strength indicator; (2) that these two values are conflicting in the sense that the former is a decreasing and the latter an increasing function of the logical strength of the theory to be assessed; and (3) that in assessing a given theory by the available data one should weigh these two conflicting aspects against each other in such a way that any surplus in informativeness prevails, provided the shortfall in plausibility is small enough. Particular accounts of this general theory arise by inserting particular strength indicators and truth indicators. In Section 4 the theory is spelt out for the Bayesian paradigm of subjective probabilities. It is then compared to incremental Bayesian confirmation theory. Section 4 closes by asking whether it is likely to be lovely. Section 5 discusses a few problems of confirmation theory in the light of the present approach. In particular, it is briefly indicated how the present account gives rise to a new analysis of Hempel’s conditions of adequacy for any relation of confirmation (Hempel, CG, 1945, Studies in the logic of confirmation. Mind, 54, 1–26, 97–121), differing from the one Carnap gave in § 87 of his Logical foundations of probability (1962, Chicago: University of Chicago Press).
Section 6 addresses the question of justification that any theory of theory assessment has to face: why should one stick to theories given high assessment values rather than to any other theories? The answer given by the Bayesian version of the account presented in Section 4 is that one should accept theories given high assessment values because, in the medium run, theory assessment almost surely takes one to the most informative among all true theories when presented separating data. The concluding Section 7 continues the comparison between the present account and incremental Bayesian confirmation theory.
This paper begins with a puzzle regarding Lewis’ theory of radical interpretation. On the one hand, Lewis convincingly argued that the facts about an agent’s sensory evidence and choices will always underdetermine the facts about her beliefs and desires. On the other hand, we have several representation theorems—such as those of Ramsey (1931) and Savage (1954)—that are widely taken to show that if an agent’s choices satisfy certain constraints, then those choices can suffice to determine her beliefs and desires. In this paper, I will argue that Lewis’ conclusion is correct: choices radically underdetermine beliefs and desires, and representation theorems provide us with no good reasons to think otherwise. Any tension with those theorems is merely apparent, and relates ultimately to the difference between how ‘choices’ are understood within Lewis’ theory and the problematic way that they’re represented in the context of the representation theorems. For the purposes of radical interpretation, representation theorems like Ramsey’s and Savage’s just aren’t very relevant after all.
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously “Arrow’s Theorem”. Once the analogy between amalgamating evidence and amalgamating preferences is made tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
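The Arrow-style obstruction alluded to here can be made concrete with the simplest preference-amalgamation failure, Condorcet’s cycle. The sketch below is our illustration only (not the paper’s proof; the alternatives and rankings are invented): pairwise majority voting over three rankings yields an intransitive group “preference”, and on the paper’s analogy the three rankings could equally be read as three bodies of evidence.

```python
# Condorcet's paradox: pairwise majority over three rankings cycles.
rankings = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def majority_prefers(x, y, rankings):
    """True if a strict majority ranks x above y (lower index = better)."""
    votes = sum(r.index(x) < r.index(y) for r in rankings)
    return votes > len(rankings) / 2

pairs = {(x, y): majority_prefers(x, y, rankings)
         for x, y in [("A", "B"), ("B", "C"), ("C", "A")]}
print(pairs)  # A beats B, B beats C, and C beats A -- a cycle
```

Any amalgamation rule forced to respect such pairwise majorities cannot output a transitive ordering, which is the kind of breakdown Arrow-style theorems generalize.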
Pettit (2012) presents a model of popular control over government, according to which it consists in the government being subject to those policy-making norms that everyone accepts. In this paper, I provide a formal statement of this interpretation of popular control, which illuminates its relationship to other interpretations of the idea with which it is easily conflated, and which gives rise to a theorem similar to the famous Gibbard–Satterthwaite theorem. The theorem states that if government policy is subject to popular control, as Pettit interprets it, and policy responds positively to changes in citizens’ normative attitudes, then there is a single individual whose normative attitudes unilaterally determine policy. I use the model and theorem as an illustrative example to discuss the role of mathematics in normative political theory.
The standard representation theorem for expected utility theory tells us that if a subject’s preferences conform to certain axioms, then she can be represented as maximising her expected utility given a particular set of credences and utilities—and, moreover, that having those credences and utilities is the only way that she could be maximising her expected utility. However, the kinds of agents these theorems seem apt to tell us anything about are highly idealised, being always probabilistically coherent with infinitely precise degrees of belief and full knowledge of all a priori truths. Ordinary subjects do not look very rational when compared to the kinds of agents usually talked about in decision theory. In this paper, I will develop an expected utility representation theorem aimed at the representation of those who are neither probabilistically coherent, logically omniscient, nor expected utility maximisers across the board—that is, agents who are frequently irrational. The agents in question may be deductively fallible, have incoherent credences, limited representational capacities, and fail to maximise expected utility for all but a limited class of gambles.
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
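For readers who want the locality tension in numbers: the sketch below is our illustration (not part of the survey) and evaluates the CHSH combination using the quantum singlet correlation E(a, b) = −cos(a − b) at the standard angle choices, exceeding the bound of 2 that any local hidden-variable model must satisfy.

```python
from math import cos, pi, sqrt

def E(a, b):
    """Singlet-state correlation for analyzer angles a, b (radians)."""
    return -cos(a - b)

# Standard CHSH angle choices for maximal quantum violation
a, a2 = 0.0, pi / 2
b, b2 = pi / 4, 3 * pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, above the local bound of 2
```

The strategies the survey discusses (non-classical probabilities, superdeterminism) aim to block the inference from this violation to non-locality, not to dispute the number itself.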
Peer review is often taken to be the main form of quality control on academic research. Usually journals carry this out. However, parts of maths and physics appear to have a parallel, crowd-sourced model of peer review, where papers are posted on the arXiv to be publicly discussed. In this paper we argue that crowd-sourced peer review is likely to do better than journal-solicited peer review at sorting papers by quality. Our argument rests on two key claims. First, crowd-sourced peer review will lead on average to more reviewers per paper than journal-solicited peer review. Second, due to the wisdom of the crowds, more reviewers will tend to make better judgments than fewer. We make the second claim precise by looking at the Condorcet Jury Theorem as well as two related jury theorems developed specifically to apply to peer review.
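The wisdom-of-crowds claim can be illustrated with the classical Condorcet Jury Theorem itself. The sketch below is our illustration (the competence value 0.6 is an arbitrary assumption, and the paper’s two peer-review-specific jury theorems are not reproduced): it computes the probability that a majority of n independent reviewers, each correct with probability p > 1/2, reaches the right verdict, and that probability grows with n.

```python
from math import comb

def majority_correct(n, p):
    """P(majority of n independent reviewers, each right w.p. p, is right)."""
    k_needed = n // 2 + 1  # smallest strict majority (n odd)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_needed, n + 1))

# Accuracy of the majority verdict as the reviewer pool grows
probs = [majority_correct(n, 0.6) for n in (1, 3, 5, 11, 51)]
print(probs)  # strictly increasing, from 0.6 toward 1
```

This is why, if crowd-sourced review really does attract more reviewers per paper, the aggregate judgment would tend to sort papers more reliably.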
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
To counter a general belief that all the paradoxes stem from a kind of circularity (or involve some self-reference, or use a diagonal argument) Stephen Yablo designed a paradox in 1993 that seemingly avoided self-reference. We turn Yablo's paradox, the most challenging paradox of recent years, into a genuine mathematical theorem in Linear Temporal Logic (LTL). Indeed, Yablo's paradox comes in several varieties; and he showed in 2004 that there are other versions that are equally paradoxical. Formalizing these versions of Yablo's paradox, we prove some theorems in LTL. This is the first time that Yablo's paradox(es) become new(ly discovered) theorems in mathematics and logic.
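The LTL rendering can be sketched as follows; this is our reconstruction of the idea, with temporal points playing the role of the indices n, and the paper’s exact formalization may differ. Reading $\Box$ as “at all later-or-equal points” and $\bigcirc$ as “at the next point”, a Yablo sentence is a $Y$ satisfying the fixed-point condition below, whose global existence LTL refutes:

```latex
% Each Yablo sentence S_n says: all later sentences are untrue.
% With time points as indices, this becomes a single fixed-point condition:
%   Y  <->  (at every strictly later point, not Y)
\Box\bigl( Y \leftrightarrow \bigcirc\Box\lnot Y \bigr)
% LTL proves the negation of this global biconditional:
\vdash_{\mathrm{LTL}} \lnot\, \Box\bigl( Y \leftrightarrow \bigcirc\Box\lnot Y \bigr)
% Sketch: if Y held at some point, the next point would satisfy \Box\lnot Y,
% whose tail again forces \bigcirc\Box\lnot Y and hence Y -- contradiction;
% if Y held nowhere, \bigcirc\Box\lnot Y would hold everywhere, forcing Y.
```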
A proof of Fermat’s last theorem is demonstrated. It is very brief, simple, elementary, and absolutely arithmetical. The necessary premises for the proof are only: the three definitive properties of the relation of equality (identity, symmetry, and transitivity), modus tollens, the axiom of induction, the proof of Fermat’s last theorem in the case of n = 3, as well as the premises necessary for the formulation of the theorem itself. It involves a modification of Fermat’s approach of infinite descent. The infinite descent is linked to induction starting from n = 3 by modus tollens. An inductive series of modus tollens is constructed. The proof of the series by induction is equivalent to Fermat’s last theorem. Since Fermat had proved the theorem for n = 4, one can suggest that the proof for n ≥ 4 was accessible to him. An idea for an elementary arithmetical proof of Fermat’s last theorem (FLT) by induction is suggested. It would be accessible to Fermat, unlike Wiles’s proof (1995).
This paper deals with propositional calculi with strong negation (N-logics) in which the Craig interpolation theorem holds. N-logics are defined to be axiomatic strengthenings of the intuitionistic calculus enriched with a unary connective called strong negation. There exists a continuum of N-logics, but the Craig interpolation theorem holds in only 14 of them.
We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and Jeffrey's rules can be characterized in terms of the same axioms: "responsiveness", which requires that revised beliefs incorporate what has been learnt, and "conservativeness", which requires that beliefs on which the learnt input is "silent" do not change. To illustrate the use of non-Bayesian belief revision in economic theory, we sketch a simple decision-theoretic application.
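On a finite state space the two rules are easy to exhibit side by side. The sketch below is our illustration (the states and numbers are invented, and the paper’s axiomatic characterization is not reproduced): Jeffrey’s rule reallocates probability so that an event E receives a new probability q while probabilities conditional on E and on its complement are conserved, and it collapses to Bayes’s rule when q = 1.

```python
def bayes_update(prior, E):
    """Bayes's rule: conditionalize on learning E with certainty."""
    z = sum(p for s, p in prior.items() if s in E)
    return {s: (p / z if s in E else 0.0) for s, p in prior.items()}

def jeffrey_update(prior, E, q):
    """Jeffrey's rule: shift P(E) to q, conserving conditional probabilities."""
    pE = sum(p for s, p in prior.items() if s in E)
    return {s: p * (q / pE if s in E else (1 - q) / (1 - pE))
            for s, p in prior.items()}

prior = {"rain": 0.5, "drizzle": 0.25, "dry": 0.25}
E = {"rain", "drizzle"}                 # "some precipitation"
soft = jeffrey_update(prior, E, 0.9)    # an uncertain glimpse outside
hard = jeffrey_update(prior, E, 1.0)    # certainty: reduces to Bayes's rule
```

Here "conservativeness" is visible in the code: states outside E keep their relative weights, being rescaled by a single common factor.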
The proposed paper presents an argument in favor of a Rawlsian approach to ethics for Internet technology companies (den Hoven & Rooksby, 2008; Hoffman, 2017). Ethics statements from such companies are analyzed and shown to be utilitarian and teleological in nature, and therefore in opposition to Rawls’ theories of justice and fairness. The statements are also shown to have traits in common with Confucian virtue ethics (Ames, 2011; Nylan, 2008).
The previous two parts of the paper demonstrate that the interpretation of Fermat’s last theorem (FLT) in Hilbert arithmetic, meant both in a narrow sense and in a wide sense, can suggest a proof by induction in Part I and by means of the Kochen–Specker theorem in Part II. The same interpretation can also serve for a proof of FLT based on Gleason’s theorem and partly similar to that in Part II. The concept of (probabilistic) measure of a subspace of Hilbert space, and especially its uniqueness, can be unambiguously linked to that of partial algebra or incommensurability, or interpreted as a relation of the two dual branches of Hilbert arithmetic in a wide sense. The investigation of the last relation allows for FLT and Gleason’s theorem to be equated in a sense, as two dual counterparts, and the former to be inferred from the latter, as well as vice versa under an additional condition relevant to the Gödel incompleteness of arithmetic to set theory. The qubit Hilbert space itself in turn can be interpreted by the unity of FLT and Gleason’s theorem. The proof of such a fundamental result in number theory as FLT by means of Hilbert arithmetic in a wide sense can be generalized to an idea about “quantum number theory”, which makes it possible to investigate mathematically the origin of Peano arithmetic from Hilbert arithmetic by the mediation of the “nonstandard bijection” and its two dual branches inherently linking it to information theory. Then, infinitesimal analysis and its revolutionary application to physics can also be re-realized in that wider context, for example, as an exploration of the way for the physical quantity of time (respectively, for the time derivative in any temporal process considered in physics) to appear at all. Finally, the result admits a philosophical reflection of how any hierarchy arises or changes itself only thanks to its dual and idempotent counterpart.
The aggregation of individual judgments over interrelated propositions is a newly arising field of social choice theory. I introduce several independence conditions on judgment aggregation rules, each of which protects against a specific type of manipulation by agenda setters or voters. I derive impossibility theorems whereby these independence conditions are incompatible with certain minimal requirements. Unlike earlier impossibility results, the main result here holds for any (non-trivial) agenda. However, independence conditions arguably undermine the logical structure of judgment aggregation. I therefore suggest restricting independence to premises, which leads to a generalised premise-based procedure. This procedure is proven to be possible if the premises are logically independent.
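The tension between proposition-wise and premise-based aggregation behind these results is the familiar discursive dilemma, which can be replayed mechanically. The sketch below is our illustration (three hypothetical judges, premises p and q, conclusion p ∧ q), not the paper’s formal framework: proposition-wise majority voting accepts both premises yet rejects the conclusion.

```python
# Each judge holds logically consistent judgments on p, q, and p ∧ q.
judges = [
    {"p": True,  "q": True},    # hence accepts the conclusion
    {"p": True,  "q": False},   # hence rejects it
    {"p": False, "q": True},    # hence rejects it
]
for j in judges:
    j["p_and_q"] = j["p"] and j["q"]

def majority(prop):
    """True if a strict majority of judges accepts the proposition."""
    return sum(j[prop] for j in judges) > len(judges) / 2

premise_based = majority("p") and majority("q")  # both premises carry: True
conclusion_based = majority("p_and_q")           # the conclusion fails: False
print(premise_based, conclusion_based)
```

A premise-based procedure of the kind proposed would vote only on p and q and derive the conclusion, which is consistent here precisely because the premises are logically independent.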
In this paper a symmetry argument against quantity absolutism is amended. Rather than arguing against the fundamentality of intrinsic quantities on the basis of transformations of basic quantities, a class of symmetries defined by the Π-theorem is used. This theorem is a fundamental result of dimensional analysis and shows that all unit-invariant equations which adequately represent physical systems can be put into the form of a function of dimensionless quantities. Quantity transformations that leave those dimensionless quantities invariant are empirical and dynamical symmetries. The proposed symmetries of the original argument fail to be both dynamical and empirical symmetries and are open to counterexamples. The amendment of the original argument requires consideration of the relationships between quantity dimensions. The discussion raises a pertinent issue: what is the modal status of the constants of nature which figure in the laws? Two positions, constant necessitism and constant contingentism, are introduced and their relationships to absolutism and comparativism undergo preliminary investigation. It is argued that the absolutist can only reject the amended symmetry argument by accepting constant necessitism. I argue that the truth of an epistemically open empirical hypothesis would make the acceptance of constant necessitism costly: together they entail that the facts are nomically necessary.
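The unit-invariance at the heart of the Π-theorem can be seen in a toy case. The sketch below is our illustration, not the paper’s argument: for a simple pendulum the dimensionless group Π = T·√(g/L) takes the same value whether length is measured in metres or in feet, which is exactly the kind of invariance under quantity transformations that the amended symmetry argument trades on.

```python
from math import pi, sqrt

def period(L, g):
    """Small-angle pendulum period from length L and gravitational acceleration g."""
    return 2 * pi * sqrt(L / g)

def Pi_group(L, g):
    """The dimensionless group T * sqrt(g / L) given by the Pi-theorem."""
    return period(L, g) * sqrt(g / L)

FT_PER_M = 3.28084  # unit conversion: feet per metre
Pi_metric = Pi_group(2.0, 9.81)                          # metres, m/s^2
Pi_imperial = Pi_group(2.0 * FT_PER_M, 9.81 * FT_PER_M)  # feet, ft/s^2
print(Pi_metric, Pi_imperial)  # identical: the group is unit-invariant
```

Rescaling the unit of length changes L and g individually but leaves Π fixed, so the dimensionless form of the law carries no trace of the unit choice.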