Leibniz’s question “why is there something rather than nothing?”, also known as the Primordial Existential Question, has often been the focus of intense philosophical controversy. While some authors take it to pose a profound metaphysical puzzle, others denounce the alleged meaninglessness or inconceivability of the idea of nothingness. In a series of articles, Adolf Grünbaum develops an empirically informed critique with the aim of demonstrating that the Primordial Existential Question poses a “non-issue” which does not require explanation. Grünbaum’s critique prompted heated debates in the recent literature. In this paper, I examine each step of Grünbaum’s reasoning and argue that it fails to show that the Primordial Existential Question is ill-founded. Moreover, I identify and rebut several strategies that one may employ to amend Grünbaum’s critique. In doing so, I address various issues related to the Primordial Existential Question, including the alleged need for its proponents to rely on contentious metaphysical presuppositions and the purported availability of empirical evidence that answers or dissolves such a question.
Composed around 1065-1067, Pier Damiani's letter "On Divine Omnipotence" opens with a question posed by Desiderio, abbot of Montecassino: "Although God can do all things, he cannot restore virginity to a woman who has lost it. He certainly has the power to free her from the penalty, but he cannot give her back the crown of virginity she has lost." The problem, which Pier Damiani took up from St. Jerome's Letter XXII, is only apparently idle: the monk of Ravenna turns it into a genuine philosophical question, a "thought experiment" that raises crucial questions about the nature of time and about the relationship between necessity and contingency, divine laws and logical principles, divine nature and human nature. The volume, edited by Roberto Limonta, presents a translation of the De divina omnipotentia with notes and facing Latin text. The preface by Mariateresa Fumagalli Beonio Brocchieri and the introductory essay by Roberto Limonta reconstruct the long history of the question posed by Pier Damiani, from the theological debates of the medieval centuries to its fortunes in contemporary philosophy, theology, and literature.
This paper presents a semantical analysis of the Weak Kleene logics Kw3 and PWK from the tradition of Bochvar and Halldén. These are three-valued logics in which a formula takes the third value if at least one of its components does. The paper establishes two main results: a characterisation result for the relation of logical consequence in PWK – that is, we individuate necessary and sufficient conditions for a set.
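The infectious behaviour of the third value described above can be sketched in a few lines of code. This is our own illustration, not material from the paper: we encode truth as 1, falsity as 0, and the third value as None, and define weak Kleene connectives on which any compound containing the third value receives it.

```python
# Weak Kleene ("infectious") three-valued connectives.
# Encoding (ours, for illustration): 1 = true, 0 = false, None = third value.

def wk_not(a):
    # Negation swaps 0 and 1; the third value is fixed.
    return None if a is None else 1 - a

def wk_and(a, b):
    # Any component with the third value infects the whole conjunction.
    if a is None or b is None:
        return None
    return min(a, b)

def wk_or(a, b):
    # Likewise for disjunction: no classical component can "rescue" it.
    if a is None or b is None:
        return None
    return max(a, b)

print(wk_or(1, None))   # None
print(wk_and(0, None))  # None
```

On the strong Kleene tables, by contrast, a disjunction with a true disjunct is true; here `wk_or(1, None)` is still the third value, which is the distinctive weak Kleene feature the abstract describes.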
This paper discusses three relevant logics that obey Component Homogeneity, a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity; that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the paper's other accomplishments, we generalize the semantics of Bochvar, Halldén, Deutsch, and Daniels; we provide a general recipe for defining containment logics; and we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde, as well as the connections between crossS*fde and Epstein's logic Eq of equality. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results and overview some open issues.
Grounding contingentism is the doctrine according to which grounds are not guaranteed to necessitate what they ground. In this paper I argue that the most plausible version of contingentism is incompatible with the idea that the grounding relation is transitive, unless either 'priority monism' or 'contrastivism' is assumed.
In this paper I will present three arguments (based on the notions of constitution, metaphysical reality, and truth, respectively) with the aim of shedding some new light on the structure of Fine’s (2005, 2006) ‘McTaggartian’ arguments against the reality of tense. Along the way, I will also (i) draw a novel map of the main realist positions about tense, (ii) unearth a previously unnoticed but potentially interesting form of external relativism (which I will label ‘hyper-presentism’) and (iii) sketch a novel interpretation of Fine’s fragmentalism (which I contrast with Lipman’s 2015, 2016b, forthcoming).
According to Composition is Identity, a whole is literally identical to the plurality of its parts. According to Mereological Nihilism, nothing has proper parts. In this note, it is argued that Composition is Identity can be shown to entail Mereological Nihilism in a much simpler and more direct way than the one recently proposed by Claudio Calosi.
This paper is concerned with certain ontological issues in the foundations of geographic representation. It sets out what these basic issues are, describes the tools needed to deal with them, and draws some implications for a general theory of spatial representation. Our approach has ramifications in the domains of mereology, topology, and the theory of location, and the question of the interaction of these three domains within a unified spatial representation theory is addressed. In the final part we also consider the idea of non-standard geographies, which may be associated with geography under a classical conception in the same sense in which non-standard logics are associated with classical logic.
What are the relationships between an entity and the space at which it is located? And between a region of space and the events that take place there? What is the metaphysical structure of localization? What is its modal status? This paper addresses some of these questions in an attempt to work out at least the main coordinates of the logical structure of localization. Our task is mostly taxonomic. But we also highlight some of the underlying structural features and single out the interactions between the notion of localization and nearby notions, such as the notions of part and whole, or of necessity and possibility. A theory of localization, we argue, is needed in order to account for the basic relations between objects and space, and runs afoul of a pure part-whole theory. We also provide an axiomatization of the relation of localization and examine cases of localization involving entities other than material objects.
The possibility of changing the past by means of time-travel appears to depend on the possibility of distinguishing the past as it is ‘before’ and ‘after’ the time-travel. So far, all the metaphysical models that have been proposed to account for the possibility of past-changing time-travels operate this distinction by conceiving of time as multi-dimensional, and thus by significantly inflating our metaphysics of time. The aim of this article is to argue that there is an intuitive sense in which past-changing time-travels are metaphysically possible also in one-dimensional time.
The aim of this study is to address the “Grounding Grounding Problem,” that is, the question as to what, if anything, grounds facts about grounding. I aim to show that, if a seemingly plausible principle of modal recombination between fundamental facts and the principle customarily called “Entailment” are assumed, it is possible to prove not only that grounding facts featuring fundamental, contingent grounds are derivative but also that either they are partially grounded in the grounds they feature or they are “abysses”.
Fine (2005, 2006) has presented a ‘trilemma’ concerning the tense-realist idea that reality is constituted by tensed facts. According to Fine, there are only three ways out of the trilemma, consisting in what he takes to be the three main families of tense-realism: ‘presentism’, ‘(external) relativism’, and ‘fragmentalism’. Importantly, although Fine characterises tense-realism as the thesis that reality is constituted (at least in part) by tensed facts, he explicitly claims that tense realists are not committed to their fundamental existence. Recently, Correia and Rosenkranz (2011, 2012) have claimed that Fine’s tripartite map of tense realism is incomplete as it misses a fourth position they call ‘dynamic absolutism’. In this paper, I will argue that dynamic absolutists are committed to the irreducible existence of tensed facts and that, for this reason, they face a similar trilemma concerning the notion of fact-content. I will thus conclude that a generalised version of Fine’s trilemma, concerning both fact-constitution and fact-content, is indeed inescapable.
What is the relation between parts taken together and the whole that they compose? The recent literature appears to be dominated by two different answers to this question, which are normally thought of as being incompatible. According to the first, parts taken together are identical to the whole that they compose. According to the second, the whole is grounded in its parts. The aim of this paper is to make some theoretical room for the view according to which parts ground the whole they compose while being, at the same time, identical to it.
This chapter analyzes the concept of an event and of event representation as an umbrella notion. It provides an overview of the different ways events have been dealt with in philosophy, linguistics, and cognitive science. This variety of positions is construed in part as the result of different descriptive and explanatory projects. It is argued that the various types of notions (common-sense, theoretically revised, scientific, and internalist psychological) should be kept apart.
How can responsibility be achieved through philosophical dialogue? This is a core issue for today's democracy. In this regard, education plays an important role.
In this article I examine the main conceptions of public reason in contemporary political philosophy in order to set the frame for appreciating the novelty of the pragmatist understanding of public reason as based upon the notion of consequences and upon a theory of rationality as inquiry. The approach is inspired by Dewey but is free from any concern with the history of philosophy. The aim is to propose a different understanding of the nature of public reason, one that overcomes the limitations of the existing approaches. Public reason is presented as the proper basis for discussing contested issues in the broad frame of deep democracy.
Ordinary reasoning about space—we argue—is first and foremost reasoning about things or events located in space. Accordingly, any theory concerned with the construction of a general model of our spatial competence must be grounded on a general account of the sort of entities that may enter into the scope of the theory. Moreover, on the methodological side the emphasis on spatial entities (as opposed to purely geometrical items such as points or regions) calls for a reexamination of the conceptual categories required for this task. Building on material presented in an earlier paper, in this work we offer some examples of what this amounts to, of the difficulties involved, and of the main directions along which spatial theories should be developed so as to combine formal sophistication with some affinity with common sense.
Alessandro Torza argues that Ted Sider’s Lewisian argument against vague existence is insufficient to rule out the possibility of what he calls ‘super-vague existence’, that is, the idea that existence is higher-order vague, for all orders. In this chapter it is argued that the possibility of super-vague existence is ineffective against the conclusion of Sider’s argument, since super-vague existence cannot consistently be claimed to be a kind of linguistic vagueness. Torza’s idea of super-vague existence seems better suited to model vague existence under the assumption that vague existence is instead a form of ontic indeterminacy, contra what Ted Sider and David Lewis assume.
In this paper I start from a definition of the “culture of the artificial”, stated by reference to the background of philosophical, methodological, and pragmatic assumptions that characterizes the development of the information-processing analysis of mental processes and of some trends in contemporary cognitive science: in a word, the development of AI as a candidate science of mind. The aim of this paper is to show how (with what plausibility and limitations) the discovery of this background might be dated back to a period preceding the cybernetic era, at least to the decade 1930–1940. A somewhat detailed analysis of Hull's “robot approach” is therefore given, as well as of some of its independent and later developments. Reprinted in R.L. Chrisley (ed.), Artificial Intelligence: Critical Concepts in Cognitive Science, vol. 1, Routledge, London and New York, 2000, pp. 301-326.
In this paper, we extend the expressive power of the logics K3, LP and FDE with a normality operator, which is able to express whether a formula is assigned a classical truth value or not. We then establish classical recapture theorems for the resulting logics. Finally, we compare the approach via the normality operator with the classical collapse approach devised by Jc Beall.
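A minimal sketch of the normality idea, in our own encoding rather than the paper's notation: K3 and LP share the strong Kleene truth tables (here 1 is true, 0 is false, 0.5 is the third value; K3 designates only 1, LP designates 1 and 0.5), and a normality operator maps a formula's value to classical truth exactly when that value is classical.

```python
# Strong Kleene connectives shared by K3 and LP.
# Encoding (ours, for illustration): 1 = true, 0 = false, 0.5 = third value.

def k_not(a):
    return 1 - a            # fixes the third value: 1 - 0.5 == 0.5

def k_and(a, b):
    return min(a, b)

def k_or(a, b):
    return max(a, b)

def normality(a):
    # Normality operator: classically true iff the argument takes a
    # classical value, classically false otherwise. Its output is
    # always classical, so it can "talk about" non-classicality.
    return 1 if a in (0, 1) else 0

print(normality(0.5))  # 0
print(normality(1))    # 1
```

Since the operator's output is always classical, adding normality claims for the atoms of an inference restricts attention to classical valuations, which is roughly the intuition behind the classical recapture results the abstract mentions.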
The supervaluationist approach to branching time (‘SBT-theory’) appears to be threatened by the puzzle of retrospective determinacy: if yesterday I uttered the sentence ‘It will be sunny tomorrow’ and it is sunny the next day in only some of the worlds overlapping at the context of utterance, my utterance is to be assessed as neither true nor false, even if today is indeed a sunny day. John MacFarlane (“Truth in the Garden of Forking Paths” 81) has recently criticized a promising solution to this puzzle for falling short of an adequate account of ‘actually’. In this paper, I aim to rebut MacFarlane's criticism. To this effect, I argue that: (i) ‘actually’ can be construed either as an indexical or as a nonindexical operator; (ii) if ‘actually’ is nonindexical, MacFarlane's criticism is invalid; (iii) there appear to be independent reasons for SBT-theorists to claim that ‘actually’ is a nonindexical expression.
The book is divided into three parts. The first, containing three papers, focuses on the characterization of the central tenets of presentism (by Neil McKinnon) and eternalism (by Samuel Baron and Kristie Miller), and on the ‘sceptical stance’ (by Ulrich Meyer), a view to the effect that there is no substantial difference between presentism and eternalism. The second and main section of the book contains three pairs of papers that bring the main problems with presentism to the fore and outline its defence strategy. Each pair of papers in this section can be read as a discussion between presentists and eternalists, wherein each directly responds to the arguments and objections offered by the other. This is a discussion that is sometimes absent in the literature, or which is at best carried out in a fragmented way. The first two papers of the section deal with the problem of the compatibility of Special Relativity Theory (SRT) and presentism. SRT is often considered to be a theory that contradicts the main tenet of presentism, thereby rendering presentism at odds with one of our most solid scientific theories. Christian Wüthrich’s paper presents arguments for the incompatibility of the two theories (SRT and presentism) within a new framework that includes a discussion of further complications arising from Quantum Mechanics. Jonathan Lowe’s paper, by contrast, develops new general arguments against the incompatibility thesis and replies to Wüthrich’s paper. The second pair of papers focuses on the problem that presentists face in providing grounds for past-tensed truths. In the first (by Matthew Davidson), new arguments are provided to defend the idea that the presentist cannot adequately explain how what is now true about the past is grounded, since for the presentist the past is completely devoid of ontological ground.
The second paper (by Brian Kierland) takes up the challenge of developing a presentist explanation of past truths, beginning by outlining some existing views in the literature before advancing an original proposal.
The project of a 'naive physics' has been the subject of attention in recent years above all in the artificial intelligence field, in connection with work on common-sense reasoning, perceptual representation and robotics. The idea of a theory of the common-sense world is however much older than this, having its roots not least in the work of phenomenologists and Gestalt psychologists such as Köhler, Husserl, Schapp and Gibson. This paper seeks to show how contemporary naive physicists can profit from a knowledge of these historical roots of their discipline, which are shown to imply above all a critique of the set-theory-based models of reality typically presupposed by contemporary work in common-sense ontology [1].
Drawing on Memorias. Entre dos mundos, this text offers a testimony and commentary on the pedagogical perspective developed by Mario Bunge in his teaching of philosophy of science at the University of Buenos Aires in the 1960s. It is a Socratic perspective on the philosopher's craft, one that does not, however, amount to an enterprise of unveiling the truth, but rather concerns the conditions of its construction.
Since the second half of the twentieth century, researchers in cybernetics and AI, neural nets and connectionism, and Artificial Life and new robotics have endeavoured to build machines that could simulate functions of living organisms, such as adaptation and development, problem solving and learning. This book discusses these research programs, particularly as regards the epistemological issues of behaviour modelling. One of the main novelties of the book is that certain projects involving the building of simulative machine models before the advent of cybernetics are investigated here for the first time, on the basis of little-known, and sometimes completely forgotten or unpublished, texts and figures. These pre-cybernetics projects can be considered steps toward the “discovery” of a modelling methodology that was fully developed by the more recent research programs, and that shares some of their central goals and key methodological proposals. More information at the Springer link: http://www.springer.com/new+%26+forthcoming+titles+%28default%29/book/978-1-4020-0606-7 This book is the English translation of La scoperta dell'artificiale, Dunod/Masson, Milan, 1998.
Classically, truth and falsehood are opposite, and so are logical truth and logical falsehood. In this paper we imagine a situation in which the opposition is so pervasive in the language we use as to threaten the very possibility of telling truth from falsehood. The example exploits a suggestion of Ramsey’s to the effect that negation can be expressed simply by writing the negated sentence upside down. The difference between ‘p’ and ‘~~p’ disappears, the principle of double negation becomes trivial, and the truth/falsehood opposition is up for grabs. Our moral is that this indeterminacy undermines the idea of inferential role semantics.
Since the early eighties, computationalism in the study of the mind has been “under attack” by several critics of the so-called “classic” or “symbolic” approaches in AI and cognitive science. Computationalism was generically identified with such approaches. For example, it was identified with both Allen Newell and Herbert Simon’s Physical Symbol System Hypothesis and Jerry Fodor’s theory of the Language of Thought, usually without taking into account the fact that such approaches are very different as to their methods and aims. Zenon Pylyshyn, in his influential book Computation and Cognition, claimed that both Newell and Fodor deeply influenced his ideas on cognition as computation. This probably added to the confusion, as many people still consider Pylyshyn’s book paradigmatic of the computational approach in the study of the mind. Since then, cognitive scientists, AI researchers and also philosophers of mind have been asked to take sides on different “paradigms” that have from time to time been proposed as opponents of (classic or symbolic) computationalism. Examples of such oppositions are: computationalism vs. connectionism, computationalism vs. dynamical systems, computationalism vs. situated and embodied cognition, and computationalism vs. behavioural and evolutionary robotics. Our preliminary claim in section 1 is that computationalism should not be identified with what we would call the “paradigm (based on the metaphor) of the computer” (in the following, PoC). PoC is the (rather vague) statement that the mind functions “as a digital computer”. Actually, PoC is a restrictive version of computationalism, and nobody ever seriously upheld it, except in some rough versions of the computational approach and in some popular discussions about it. Usually, PoC is used as a straw man in many arguments against computationalism. In section 1 we look in some detail at PoC's claims and argue that computationalism cannot be identified with PoC.
In section 2 we point out that certain anticomputationalist arguments are based on this misleading identification. In section 3 we suggest that the view of the levels of explanation proposed by David Marr could clarify certain points of the debate on computationalism. In section 4 we touch on a controversial issue, namely the possibility of developing a notion of analog computation, similar to the notion of digital computation. A short conclusion follows in section 5.
In a series of ten articles from leading American and European scholars, Pragmatist Epistemologies explores the central themes of epistemology in the pragmatist tradition through a synthesis of new and old pragmatist thought, engaging contemporary issues while also exploring them from a historical perspective. It opens a new avenue of research in contemporary pragmatism, continuous with the main figures of the pragmatist tradition and incorporating contemporary trends in philosophy. Students and scholars of American philosophy will find this book indispensable.
This paper discusses the possibility of an absolute vacuum, a space without any substance. The motivation of this study is the contrast between most philosophers up to Descartes, who held that a vacuum was impossible, and the seventeenth-century change of outlook, when the possibility and effective existence of the vacuum came to be accepted after the experiments of Torricelli and Pascal. This article attempts to show that, contrary to the received opinion, the acceptance of an ether is preferable to the acceptance of a vacuum for several reasons. First, it is impossible to provide an empirical proof of the non-existence of the ether; second, an absolute vacuum is unthinkable; third, the ether concept is useful for the understanding of physical phenomena; and fourth, the hypothesis of an ether in apparently void spaces is useful for the future development of science. The paper also endeavours to show that no recent advance of science has changed those conclusions and that no future development can change them.
Considering topology as an extension of mereology, this paper analyses topological variants of mereological essentialism (the thesis that an object could not have different parts than the ones it has). In particular, we examine de dicto and de re versions of two theses: (i) that an object cannot change its external connections (e.g., adjacent objects cannot be separated), and (ii) that an object cannot change its topological genus (e.g., a doughnut cannot turn into a sphere). Stronger forms of structural essentialism, such as morphological essentialism (an object cannot change shape) and locative essentialism (an object cannot change position) are also examined.
The article deals with present-day challenges related to the use of technology to reduce the exposure of the human being to the risks and vulnerability of his or her existential condition. According to certain transhumanist and posthumanist thinkers, as well as some supporters of human enhancement, essential features of the human being, such as vulnerability and mortality, ought to be thoroughly overcome. The aim of this article is twofold: on the one hand, we wish to carry out an enquiry into the ontological and ethical thinking of Hans Jonas, who was among the first to address these very issues with great critical insight; on the other hand, we endeavour to highlight the relevance of Jonas' reflections to current challenges related to bioscience and biotechnological progress. In this regard, we believe that the transcendent and metaphysical relevance of the «image of man» introduced by Jonas is of paramount importance for understanding his criticism of those attempts to ameliorate the human being by endangering his or her essence.
Is any unified theory of brain function possible? Following a line of thought dating back to early cybernetics (see, e.g., Cordeschi, 2002), Clark (in press) has proposed action-oriented Hierarchical Predictive Coding (HPC) as the account to be pursued in the effort of gaining the “Grand Unified Theory of the Mind”, or “painting the big picture,” as Edelman (2012) put it. Such a line of thought is indeed appealing, but to be effectively pursued it should be confronted with experimental findings and explanatory capabilities (Edelman, 2012). The point we are making in this note is that a brain with predictive capabilities is certainly necessary to endow the agent situated in the environment with forethought or foresight, a crucial issue in outlining the unified account advocated by Clark. But the capacity for forethought is deeply entangled with the capacity for emotions, and when emotions are brought into the game, cognitive functions become part of a large-scale functional brain network. However, for such complex networks a consistent view of hierarchical organization in large-scale functional networks has yet to emerge (Bressler and Menon, 2010), whilst heterarchical organization is likely to play a strategic role (Berntson et al., 2012). This raises the necessity of a multilevel approach that embraces causal relations across levels of explanation in either direction (bottom-up or top-down), endorsing mutual calibration of constructs across levels (Berntson et al., 2012). This, in turn, calls for a revised perspective on Marr's levels-of-analysis framework (Marr, 1982). In the following we highlight some drawbacks of Clark's proposal in addressing the above issues.
Why is space 3-dimensional? The first answer to this question, entirely based on physics, was given by Ehrenfest in 1917, who showed that the stability requirement for an n-dimensional two-body planetary system very strongly constrains space dimensionality, favouring 3-d. This kind of approach, generically called the "stability postulate" throughout this paper, was shown by Tangherlini, in 1963, to be still valid in the framework of general relativity as well as for the quantum mechanical hydrogen atom, giving the same constraint for space dimensionality. In the present work, before criticizing this methodology, a brief discussion is introduced, aimed at stressing and clarifying some general physical aspects of the problem of how to determine the number of space dimensions. Then, the epistemological consequences of Ehrenfest's methodology are critically reviewed. An alternative procedure for arriving at the proper number of dimensions, in which the stability postulate (and the implicit singularities in three-dimensional physics) is not an essential part of the argument, is proposed. In this way, the main epistemological problems contained in Ehrenfest's original idea are avoided. The alternative methodology proposed in this paper is realized by obtaining and discussing the n-dimensional quantum theory as expressed in Planck's law, the de Broglie relation and the Heisenberg uncertainty relation. As a consequence, it is possible to propose an experiment, based on thermal neutron diffraction by crystals, to directly measure space dimensionality. Finally, the distinguished role of Maxwell's electromagnetic theory in the determination of space dimensionality is stressed.
This Research Topic aimed at deepening our understanding of the levels and explanations that are of interest for cognitive scientists, neuroscientists, psychologists, behavioral scientists, and philosophers of science. Indeed, contemporary developments in neuroscience and psychology suggest that scientists are likely to deal with a multiplicity of levels, where each of the different levels entails laws of behavior appropriate to that level (Berntson et al., 2012). Also, gathering and modeling data at the different levels of analysis is not sufficient: the integration of information across levels of analysis is a crucial issue. Given this state of affairs, a number of interesting questions arise. How can the autonomy of explanatory levels be properly understood in behavioral explanation? Is reductionism a satisfactory strategy? How can high-level and low-level models be constrained so as to be actually explanatory of both behavioral and neurological or molecular evidence? What is the relationship between those models?
The present article tries to analyze the role played in Hans Jonas' ethical reflection by the religious, namely Jewish, tradition. Jonas goes in search of an ultimate foundation for his ethics and his theory of the good in order to face the challenges currently posed by technology's nihilistic attitude towards life and ethics. Jonas' ethical investigation enters the domain of metaphysics, which offers an incomparable contribution to the philosophical endeavour without undermining its overall independence. In this way, Jewish categories such as remorse, shame, sacrifice, repentance, and self-restraint strengthen the philosopher's ethical reflection, since he considers them essential moral values for the technological epoch. Yet the reference to the Jewish tradition supplies Jonas' ethical endeavour with a powerful but only hypothetical insight into transcendence.
Metascientific criteria used for explaining or constraining physical space dimensionality and their historical relationship to prevailing causal systems are discussed. The important contributions by Aristotle, Kant and Ehrenfest to the dimensionality of space problem are considered and shown to be grounded on different causal explanations: causa materialis for Aristotle, causa efficiens for young Kant and an ingenious combination of causa efficiens and causa formalis for Ehrenfest. The prominent and growing rôle played by causa formalis in modern physical approaches to this problem is emphasized.
The Tractatus de praedestinatione et de praescientia Dei respectu futurorum contingentium, composed by William of Ockham between 1321 and 1324, is a crucial juncture in the medieval discussions of theological fatalism and of the questions it involves, such as knowledge of future contingents and the compatibility of divine foreknowledge with free will. Gathering and rethinking sources of diverse provenance, Robert Grosseteste and Peter Lombard first among them, the Venerabilis inceptor shifts the problem onto the epistemological and linguistic plane, approaching it through a propositional analysis of statements about future contingents. In this way he entrusts the tools of logical argumentation and semantic inquiry with the task of untangling the theological implications of the question, in a theory meant to safeguard at once God's foreknowledge and free human will. The core of Ockham's solution, which would become a point of reference, for or against, in the theological debates of the fourteenth century, lies in the interweaving of propositional analysis and logica fidei, in the service of a pragmatic resolution of the compatibilist dilemma: as the exemplary case of prophecy shows, the statements of divine knowledge constitute the postulates of a logic of belief, which then proceeds from those premises, through a chain of argument, to formulate the precepts that will guide the Christian's steps in the world.
The volume offers the reader not only the first Italian translation of the Tractatus, but also a rich apparatus of texts (distinctiones 38, 39 and 40 of the Ordinatio, chapters 7 and 27 of the Summa Logicae, quaestio IV.4 of the Quodlibeta, Quaestiones in Libros Physicorum 41 and 44, the prologue of the Expositio in libros Physicorum, and an excerpt from the Expositio in Librum Perihermeneias Aristotelis) that makes it possible to reconstruct a coherent Ockhamist theory of contingency and to shed light on a new interpretation of the thought of the English theologian and philosopher.
There are two main ways in which the notion of mereological fusion is usually defined in the current literature in mereology, which have been labelled 'Leśniewski fusion' and 'Goodman fusion'. It is well-known that, with Minimal Mereology as the background theory, every Leśniewski fusion also qualifies as a Goodman fusion. However, the converse does not hold unless stronger mereological principles are assumed. In this paper I will discuss how the gap between the two notions can be filled, focussing in particular on two specific sets of principles that appear to be of particular philosophical interest. The first way to make the two notions equivalent can be used to shed some interesting light on the kind of intuition both notions seem to articulate. The second shows the importance of a little-known mereological principle which I will call 'Mild Supplementation'. As I will show, the mereology obtained by adding Mild Supplementation to Minimal Mereology occupies an interesting position in the landscape of theories that are stronger than Minimal Mereology but weaker than what Achille Varzi and Roberto Casati have labelled 'Extensional Mereology'.
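The abstract does not spell the two fusion definitions out, but they are standard in the literature: a Leśniewski fusion of a set S has every member of S as a part and has no part disjoint from every member of S, while a Goodman fusion overlaps exactly those things that overlap some member of S. As a hedged illustration only (the toy three-element model and all names below are my own assumptions, not the paper's apparatus), both conditions can be checked mechanically on a finite model:

```python
# Toy model: atoms "a", "b" and their sum "ab"; parthood is reflexive.
domain = ["a", "b", "ab"]
parthood = {("a", "a"), ("b", "b"), ("ab", "ab"),
            ("a", "ab"), ("b", "ab")}

def part(x, y):
    return (x, y) in parthood

def overlap(x, y):
    # O(x, y): some z is part of both x and y
    return any(part(z, x) and part(z, y) for z in domain)

def lesniewski_fusion(x, S):
    # every member of S is part of x, and
    # every part of x overlaps some member of S
    return (all(part(s, x) for s in S) and
            all(any(overlap(y, s) for s in S)
                for y in domain if part(y, x)))

def goodman_fusion(x, S):
    # x overlaps exactly the things that overlap some member of S
    return all(overlap(y, x) == any(overlap(y, s) for s in S)
               for y in domain)

# In this model "ab" is a fusion of {"a", "b"} in both senses.
assert lesniewski_fusion("ab", ["a", "b"])
assert goodman_fusion("ab", ["a", "b"])
```

In richer models satisfying only Minimal Mereology the two checks can come apart in one direction, which is the gap the paper investigates.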
One of the main epistemological problems in the study of thought experiments concerns the criterion for choosing between cases that yield contradictory results. What justification do we have for choosing between C and ~C? James Robert Brown argues for a probability criterion, on which we assign probabilities to the two phenomena and accept the one more likely to be true. John Norton argues that thought experiments are arguments, and can therefore be reconstructed as arguments. Through this process we can find the flaws in the experiment's argumentation and decide, by logical justification, which conclusion is correct. Brown's approach proves rather unfruitful, since it specifies neither how the probability calculus is supposed to work nor how probabilities can be assigned to the truth of the results of thought experiments. To accept Norton's answer, on the other hand, we must accept that thought experiments really do consist of arguments and that this is all there is to say about them. This framing allows no other epistemic source to justify the results of thought experiments.
Buddha did not make a religion; he made philosophy and science. He was the precursor of scientific realism, psychoanalysis, analytic philosophy, existentialism, feminism, epistemology, the theory and critique of knowledge, social psychology, positive psychology, ecological preservationism, and of concepts concerning matter and energy that only very recently quantum physics has been able to confirm. Knowing properly what Buddhism is, is essential to the education and culture of anyone who does not want to be simply one more alienated member of a herd walking blindly through a technological revolution. It is possible to understand Buddhism at its root through modern language and knowledge, and to establish its relations with contemporary thought and its references. This makes it possible to deepen and broaden our perception of the compatibility of these millennia-old principles with our modern forms of life and knowledge. The study required for this is quite laborious. Buddhism is a theme underlying a gigantic literary and cultural mountain. The closer we want to get to its original concept, the deeper and more voluminous the excavation we must carry out.
Logics based on weak Kleene algebra (WKA) and related structures have been recently proposed as a tool for reasoning about flaws in computer programs. The key element of this proposal is the presence, in WKA and related structures, of a non-classical truth-value that is "contaminating" in the sense that whenever the value is assigned to a formula ϕ, any complex formula in which ϕ appears is assigned that value as well. Under such interpretations, the contaminating states represent occurrences of a flaw. However, since different programs and machines can interact with (or be nested into) one another, we need to account for different kinds of errors, and this calls for an evaluation of systems with multiple contaminating values. In this paper, we make steps toward these evaluation systems by considering two logics, HYB1 and HYB2, whose semantic interpretations account for two contaminating values beside the classical values 0 and 1. In particular, we provide two main formal contributions. First, we give a characterization of their relations of (multiple-conclusion) logical consequence, that is, necessary and sufficient conditions for a set Δ of formulas to logically follow from a set Γ of formulas in HYB1 or HYB2. Second, we provide sound and complete sequent calculi for the two logics.
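The specifics of HYB1 and HYB2 are developed in the paper itself; as a minimal sketch of what "contamination" means, the single-contaminating-value weak Kleene case can be written out directly. The encoding and function names below are illustrative assumptions, not the paper's notation: a third value e infects every compound formula in which it occurs, so that, unlike in strong Kleene logic, even a true disjunct cannot rescue a contaminated disjunction.

```python
E = "e"  # the contaminating value; 0 and 1 are the classical values

def wk_neg(a):
    # negation: contaminating in, contaminating out
    return E if a == E else 1 - a

def wk_conj(a, b):
    # conjunction: any occurrence of E contaminates the result
    return E if E in (a, b) else min(a, b)

def wk_disj(a, b):
    # disjunction: even a true disjunct is swamped by E
    return E if E in (a, b) else max(a, b)

# Contamination in action: 1 ∨ e evaluates to e in weak Kleene,
# where strong Kleene would return 1.
assert wk_disj(1, E) == E
```

The systems the paper studies would, on this reading, extend such tables with a second contaminating value and an account of how the two interact, which is precisely what the abstract leaves to the full text.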