It is often claimed that the greatest value of the Bayesian framework in cognitive science consists in its unifying power. Several Bayesian cognitive scientists assume that unification is obviously linked to explanatory power. But this link is not obvious, as unification in science is a heterogeneous notion, which may have little to do with explanation. While a crucial feature of most adequate explanations in cognitive science is that they reveal aspects of the causal mechanism that produces the phenomenon to be explained, the kind of unification afforded by the Bayesian framework to cognitive science does not necessarily reveal aspects of a mechanism. Bayesian unification, nonetheless, can place fruitful constraints on causal–mechanical explanation.
1 Introduction
2 What a Great Many Phenomena Bayesian Decision Theory Can Model
3 The Case of Information Integration
4 How Do Bayesian Models Unify?
5 Bayesian Unification: What Constraints Are There on Mechanistic Explanation?
5.1 Unification constrains mechanism discovery
5.2 Unification constrains the identification of relevant mechanistic factors
5.3 Unification constrains confirmation of competitive mechanistic models
6 Conclusion
Appendix
Bayesianism is our leading theory of uncertainty. Epistemology is defined as the theory of knowledge. So “Bayesian Epistemology” may sound like an oxymoron. Bayesianism, after all, studies the properties and dynamics of degrees of belief, understood to be probabilities. Traditional epistemology, on the other hand, places the singularly non-probabilistic notion of knowledge at centre stage, and to the extent that it traffics in belief, that notion does not come in degrees. So how can there be a Bayesian epistemology?
Scientific theories are used for a variety of purposes. For example, physical theories such as classical mechanics and electrodynamics have important applications in engineering and technology, and we trust that this results in useful machines, stable bridges, and the like. Similarly, theories such as quantum mechanics and relativity theory have many applications as well. Beyond that, these theories provide us with an understanding of the world and address fundamental questions about space, time, and matter. Here we trust that the answers scientific theories give are reliable and that we have good reason to believe that the features of the world are similar to what the theories say about them. But why do we trust scientific theories, and what counts as evidence in favor of them?
We argue that social deliberation may increase an agent's confidence and credence under certain circumstances. An agent considers a proposition H and assigns a probability to it. However, she is not fully confident that she herself is reliable in this assignment. She then endorses H during deliberation with another person, expecting him to raise serious objections. To her surprise, however, the other person does not raise any objections to H. How should her attitudes toward H change? It seems plausible that she should increase the credence she assigns to H and, at the same time, increase the reliability she assigns to herself concerning H. A Bayesian model helps us to investigate under what conditions, if any, this is rational.
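A minimal sketch of the kind of Bayesian update at issue, with purely illustrative numbers and a hypothetical likelihood structure (the paper's actual model is richer, since it also updates the agent's self-ascribed reliability): treat the other person's silence as evidence E and apply Bayes' theorem to the credence in H.

```python
# Sketch: updating a credence in H on a peer's silence (all numbers assumed).

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule P(H|E) for a binary hypothesis H and evidence E."""
    num = p_e_given_h * prior_h
    return num / (num + p_e_given_not_h * (1.0 - prior_h))

prior_h = 0.6             # agent's initial credence in H (assumed)
p_silence_if_true = 0.9   # a critical peer rarely objects to a truth (assumed)
p_silence_if_false = 0.3  # a critical peer usually objects to a falsehood (assumed)

print(posterior(prior_h, p_silence_if_true, p_silence_if_false))
# -> 0.818..., so silence raises the credence in H under these assumptions
```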
We construct a probabilistic coherence measure for information sets which determines a partial coherence ordering. This measure is applied in constructing a criterion for expanding our beliefs in the face of new information. A number of idealizations are made, which can be relaxed by appeal to Bayesian networks.
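The paper's own measure is not reproduced in the abstract; as a stand-in, the sketch below computes Shogenji's well-known ratio measure of coherence, C(A, B) = P(A & B) / (P(A) P(B)), for a toy joint distribution, with values above 1 indicating positive coherence.

```python
# Sketch: a probabilistic coherence measure (Shogenji's ratio measure used
# as a stand-in; the joint distribution below is assumed for illustration).

joint = {(True, True): 0.4, (True, False): 0.1,
         (False, True): 0.1, (False, False): 0.4}

p_a = sum(p for (a, _), p in joint.items() if a)   # P(A) = 0.5
p_b = sum(p for (_, b), p in joint.items() if b)   # P(B) = 0.5
p_ab = joint[(True, True)]                         # P(A & B) = 0.4

print(p_ab / (p_a * p_b))   # -> 1.6 > 1: the set {A, B} coheres
```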
The term 'model' derives from the Latin 'modulus' (the measure); 'modello' has existed in Italian since the 16th century, and R. Descartes used 'modèlle' in the 17th century. While the term has been common in architecture and art since the Renaissance, it entered the natural sciences only in the 19th century.1 There, scientific models single out those characteristics (properties, relations, etc.) of an object of investigation that are deemed essential for a given problem, and thereby make the object accessible to understanding and further study. It is customary to distinguish between scale models, analogue models, and theoretical models.
Over the years, mathematics and statistics have become increasingly important in the social sciences.1 A look at history quickly confirms this claim. At the beginning of the 20th century, most theories in the social sciences were formulated in qualitative terms, while quantitative methods did not play a substantial role in their formulation and establishment. Moreover, many practitioners considered mathematical methods to be inappropriate and simply unsuited to foster our understanding of the social domain. Notably, the famous Methodenstreit also concerned the role of mathematics in the social sciences. Here, mathematics was considered to be the method of the natural sciences, from which the social sciences had to be separated during the period of maturation of these disciplines. All this changed by the end of the century. By then, mathematical, and especially statistical, methods were standardly used, and their value in the social sciences became relatively uncontested. The use of mathematical and statistical methods is now ubiquitous: almost all social sciences rely on statistical methods to analyze data and form hypotheses, and almost all of them use (to a greater or lesser extent) a range of mathematical methods to help us understand the social world. A further indication of the increasing importance of mathematical and statistical methods in the social sciences is the formation of new subdisciplines and the establishment of specialized journals and societies. Indeed, subdisciplines such as Mathematical Psychology and Mathematical Sociology emerged, and corresponding journals such as The Journal of Mathematical Psychology (since 1964), The Journal of Mathematical Sociology (since 1976), and Mathematical Social Sciences (since 1980), as well as the online journals Journal of Artificial Societies and Social Simulation (since 1998) and Mathematical Anthropology and Cultural Theory (since 2000), were established. What is more, societies such as the Society for Mathematical Psychology (since 1976) and the Mathematical Sociology Section of the American Sociological Association (since 1996) were founded, and similar developments can be observed in other countries. The mathematization of economics set in somewhat earlier (Vazquez 1995; Weintraub 2002); however, the use of mathematical methods in economics started booming only in the second half of the last century (Debreu 1991). Contemporary economics is dominated by the mathematical approach, although a certain style of doing economics has come increasingly under attack in the last decade or so. Recent developments in behavioral economics and experimental economics can also be understood as a reaction against the dominance (and limitations) of an overly mathematical approach to economics. There are similar debates in other social sciences. It is, however, important to stress that problems of one method (such as axiomatization or the use of set theory) can hardly be taken as a sign of the bankruptcy of mathematical methods in the social sciences tout court. This chapter surveys mathematical and statistical methods used in the social sciences and discusses some of the philosophical questions they raise. It is divided into two parts: Sections 1 and 2 are devoted to mathematical methods, and Sections 3 to 7 to statistical methods. As several other chapters in this handbook provide detailed accounts of various mathematical methods, our remarks about the latter will be rather short and general. Statistical methods, on the other hand, will be discussed in depth.
Simulation (from the Latin simulare; English simulation, French simulation, Italian simulazione) denotes the imitation of one process by another process. Both processes run on a particular system. The simulated and the simulating system (the simulator, in cybernetics) may be realized on the same or on different substrates.
Let me first state that I like Antti Revonsuo's discussion of the various methodological and interpretational problems in neuroscience. It shows how carefully and with how much methodological reflection scientists have to proceed in this fascinating field of research. I have nothing to add here. Furthermore, I am very sympathetic towards Revonsuo's general proposal to call for a Philosophy of Neuroscience that stresses foundational issues, but also focuses on methodological and explanatory strategies.2 In a footnote of his paper, Revonsuo complains, as many others do today, about what is sometimes called "physics imperialism": the view that physics dominates the philosophy of science. I am not sure whether this is still the case nowadays, but it is certainly historically correct that almost all work in the field of methodology centered around cases from physics. Although this has been changing, there are still plenty of special sciences that philosophers have not worried much about. Admittedly, I am myself a trained physicist and not a neuroscientist, and will therefore probably be biased. As it is, I will discuss some examples from physics in order to illustrate my points.
A consistent finding in research on conditional reasoning is that individuals are more likely to endorse the valid modus ponens (MP) inference than the equally valid modus tollens (MT) inference. This pattern holds for both abstract and probabilistic tasks. The existing explanation for this phenomenon within a Bayesian framework (e.g., Oaksford & Chater, 2008) accounts for this asymmetry by assuming separate probability distributions for MP and MT. We propose a novel explanation within a computational-level Bayesian account of reasoning according to which "argumentation is learning". We show that the asymmetry must appear for certain prior probability distributions, under the assumption that the conditional inference provides the agent with new information that is integrated into the existing knowledge by minimizing the Kullback-Leibler divergence between the posterior and prior probability distributions. We also show under which conditions we would expect the opposite pattern, an MT-MP asymmetry.
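As a minimal illustration of where such an asymmetry can come from (toy numbers, not the authors' full learning model): over the four truth-value combinations of p and q, the endorsement rates for MP and MT correspond to P(q | p) and P(not-p | not-q), and conditionalizing on a categorical premise is exactly the update that minimizes the KL divergence to the prior among all distributions giving that premise probability 1.

```python
# Sketch: an MP-MT asymmetry induced by an assumed prior over the four
# p/q worlds. MP endorsement ~ P(q|p); MT endorsement ~ P(-p|-q).

prior = {('p', 'q'): 0.5, ('p', '-q'): 0.1,
         ('-p', 'q'): 0.2, ('-p', '-q'): 0.2}

mp = prior[('p', 'q')] / (prior[('p', 'q')] + prior[('p', '-q')])
mt = prior[('-p', '-q')] / (prior[('p', '-q')] + prior[('-p', '-q')])

print(mp)  # 0.833... endorsement of modus ponens
print(mt)  # 0.666... endorsement of modus tollens -> MP > MT for this prior
```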
The topic of this volume is the question of whether the philosophy of science is a normative discipline. At first the question is surprising, since for many philosophers of science the answer is a clear "yes"; they consider it a commonplace that the philosophy of science is a normative enterprise. On closer inspection, however, it turns out that the question admits of different interpretations, which must be discussed separately. This is done in the first section. In the second section we look for possible explanations of why the philosophy of science has so far had so little success with the project of formulating and justifying a generally acceptable methodology of scientific inference. One possible explanation for this lack of success is particularism, according to which methods have no general validity but only a local, domain-dependent one. In the third section we use the case of Bayesianism to show that methodology is not as badly off as it first appears. For many philosophers of science, Bayesianism is the most promising candidate for a general theory of inductive reasoning. We discuss its merits, but also set out the concessions it has to make to particularism.
This review is a critical discussion of three main claims in Debs and Redhead’s thought-provoking book Objectivity, Invariance, and Convention. These claims are: (i) Social acts impinge upon formal aspects of scientific representation; (ii) symmetries introduce the need for conventional choice; (iii) perspectival symmetry is a necessary and sufficient condition for objectivity, while symmetry simpliciter fails to be necessary.
Intertheoretic relations are an important topic in the philosophy of science. However, since their classical discussion by Ernest Nagel, such relations have mostly been restricted to relations between pairs of theories in the natural sciences. In this paper, we present a model of a new type of intertheoretic relation, called 'Montague Reduction', which is assumed in Montague's framework for the analysis and interpretation of natural language syntax. To motivate the adoption of our new model, we show that this model extends the scope of application of the Nagelian (or related) models, and that it shares the epistemological advantages of the Nagelian model. The latter is achieved in a Bayesian framework.
The chromodielectric soliton model is Lorentz and chirally invariant. It has been demonstrated to exhibit dynamical chiral symmetry breaking and spatial confinement in the locally uniform approximation. We here study the full nonlocal quark self-energy in a color-dielectric medium modeled by a two-parameter Fermi function. Here color confinement is manifest. The self-energy thus obtained is used to calculate quark wave functions in the medium which, in turn, are used to calculate the nucleon and pion masses in the one-gluon-exchange approximation. The nucleon mass is fixed to its empirical value using scaling arguments; the pion mass (for massless current quarks) turns out to be small but nonzero, depending on the model parameters.
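For orientation only: a "two-parameter Fermi function" is standardly a profile of the form below, with radius parameter R and surface-thickness parameter a; whether the paper uses exactly this normalization is an assumption here.

```latex
% Generic two-parameter Fermi profile (assumed normalization):
\[
  f(r) \;=\; \frac{1}{1 + e^{(r - R)/a}}
\]
```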
The question of how successive theories are related moved into the focus of investigations in the philosophy of science at the latest with the publication of T. S. Kuhn's influential The Structure of Scientific Revolutions in 1962. There are essentially two large camps. On one side stand philosophers such as P. Feyerabend and T. S. Kuhn himself, who stress the aspect of discontinuity...
In his entry on "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy, Alexander Wilce (2012) writes that "it is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the 'quantum logic' of projection operators on a Hilbert space." For a long time, Patrick Suppes has opposed this view (see, for example, the papers collected in Suppes and Zanotti (1996)). Instead of changing the logic and moving from a Boolean algebra to a non-Boolean algebra, one can also 'save the phenomena' by weakening the axioms of probability theory and working instead with upper and lower probabilities. However, it is fair to say that, despite Suppes' efforts, upper and lower probabilities are not particularly popular in physics or in the foundations of physics, at least so far. Instead, quantum logic is booming again, especially since quantum information and computation became hot topics. Interestingly, however, imprecise probabilities are becoming more and more popular in formal epistemology, as recent work by authors such as James Joyce (2010) and Roger White (2010) demonstrates.
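As a minimal illustration of the upper/lower-probability idea (toy numbers, not drawn from Suppes): imprecise credences can be represented by a set of classical probability measures, with the lower and upper probability of an event given by the lower and upper envelope of that set.

```python
# Sketch: lower/upper probabilities as envelopes of a credal set
# (the three measures below are assumed for illustration).

credal_set = [
    {'spin_up': 0.5, 'spin_down': 0.5},
    {'spin_up': 0.7, 'spin_down': 0.3},
    {'spin_up': 0.4, 'spin_down': 0.6},
]

lower = min(p['spin_up'] for p in credal_set)   # P_lower(spin_up) = 0.4
upper = max(p['spin_up'] for p in credal_set)   # P_upper(spin_up) = 0.7
print(lower, upper)
```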
Fundamental aspects of modern life owe their existence to the achievements of scientific reason. In other words, science is an integral element of the modern world and simultaneously the epitome of the rational nature of a technical culture that makes up the essence of the modern world. Without science, the modern world would lose its very nature and modern society its future. Right from the start, physics has formed the core of European scientific development. It is the original paradigm of science, the foundation of technology, and a constitutive part of a rational culture. It will remain a model methodological discipline in the future, and its strengths will be used fruitfully in interdisciplinary and transdisciplinary collaboration.
On 14 July 1995, the respected science journal Science and the famous American daily the New York Times simultaneously reported, on the front page, on the first experimental production of a Bose-Einstein condensate from a gas of weakly interacting alkali atoms at the Joint Institute for Laboratory Astrophysics (JILA) in Boulder, Colorado (USA). What was so significant about this achievement that it was decided to announce it in this way?
Until the 19th century, 'vacuum' (empty, void) referred solely to space devoid of bodies. Under the influence of physical (field) theories, the term now denotes the residual physical entity that fills a given region of space, or in principle would fill it, after everything that can be removed by physical means has been removed from it. Theories of the vacuum are therefore closely tied to theories of the structure of space, of motion, of physical objects, and of their interactions. In quantum theory, 'vacuum' denotes the state of lowest energy (the ground state) of a system.
Using examples from the development of elementary particle physics, it is shown what role models play in the genesis of a physical theory.
In recent years the European Union (EU) has repeatedly tried to reform its institutions. When the draft European Constitution and later the Treaty of Lisbon were negotiated, one of the most hotly debated points of contention was the question of which decision rule the EU Council of Ministers should use for its votes. This is a genuinely normative question, so political philosophers and ethicists should be able to contribute something to it. In what follows we take up this challenge and evaluate alternative decision rules for the EU Council of Ministers. Here the methods of probabilistic modelling and the simulation of social processes prove indispensable.1 This makes clear how simulations can also be used as a method within applied political philosophy.
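A minimal sketch of the kind of simulation meant here, with toy vote weights and a toy quota that are assumptions rather than the actual Council rules: estimate by Monte Carlo how often a weighted-majority rule passes a proposal when each member votes yes independently with probability 1/2.

```python
# Sketch: Monte Carlo evaluation of a weighted-majority decision rule
# (weights and quota are toy assumptions, not the EU's actual rules).

import random

weights = [29, 29, 27, 13, 12, 10, 7, 4]   # assumed member vote weights
quota = 0.62 * sum(weights)                # assumed passage threshold

def passes() -> bool:
    yes = sum(w for w in weights if random.random() < 0.5)
    return yes >= quota

random.seed(0)
trials = 100_000
print(sum(passes() for _ in range(trials)) / trials)  # estimated decision probability
```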
The question of what a scientific explanation is has been a central topic of the philosophy of science for more than half a century. Today's discussion began with a pioneering paper by Carl Hempel in 1942 on the concept of explanation in historiography. In this paper Hempel, making precise earlier considerations by John Stuart Mill, Karl Popper, and others, gave a formal definition of the explanation of a singular fact.1 With his underlying view that the sciences are very well able to provide explanations, Hempel set himself apart from the anti-metaphysically minded explanation skeptics, such as Pierre Duhem, who dominated at the time. It is important to stress, however, that Hempel's definition of the concept of scientific explanation does not go essentially beyond the descriptive, and can therefore be accepted also, and especially, by empiricists. As is well known, an explanation in Hempel's deductive-nomological (D-N) model is a deductively valid argument whose premises (the so-called explanans) include at least one universal law and a set of singular sentences, and whose conclusion is the fact to be explained (the so-called explanandum).
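Schematically, the D-N model just described has the standard form below (a textbook presentation, not a quotation from Hempel):

```latex
% Hempel's D-N schema: the explanans (laws plus singular statements)
% deductively entails the explanandum.
\[
\underbrace{L_1, \ldots, L_m}_{\text{universal laws}}, \;
\underbrace{C_1, \ldots, C_n}_{\text{singular statements}}
\;\vdash\;
\underbrace{E}_{\text{explanandum}}
\]
```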