Research within the H2020 PROGRESSIVE project has identified good practices in user co-production strategies and methodologies. Early findings from research in the PROGRESSIVE project were shared with relevant stakeholders outside the consortium for consultation and review. The outcomes of that initial investigation highlighted the need to focus on the objectives, processes, and methods used in co-production with users and older people. This guide adapts these insights and makes them relevant specifically for standardisation in ICT for active and healthy ageing. This guide was approved by representatives of the PROGRESSIVE project on 22 February 2018. The consortium sought comments from interested stakeholders in an enquiry open from 1 March to 30 April 2018. The PROGRESSIVE guide was approved on 5 June 2018.
There are two main ways in which the notion of mereological fusion is usually defined in the current literature on mereology, which have been labelled ‘Leśniewski fusion’ and ‘Goodman fusion’. It is well known that, with Minimal Mereology as the background theory, every Leśniewski fusion also qualifies as a Goodman fusion. However, the converse does not hold unless stronger mereological principles are assumed. In this paper I will discuss how the gap between the two notions can be filled, focussing in particular on two specific sets of principles that appear to be of particular philosophical interest. The first way to make the two notions equivalent can be used to shed some interesting light on the kind of intuition both notions seem to articulate. The second shows the importance of a little-known mereological principle which I will call ‘Mild Supplementation’. As I will show, the mereology obtained by adding Mild Supplementation to Minimal Mereology occupies an interesting position in the landscape of theories that are stronger than Minimal Mereology but weaker than what Achille Varzi and Roberto Casati have labelled ‘Extensional Mereology’.
According to ‘Strong Composition as Identity’ (SCAI), if an entity is composed of a plurality of entities, it is identical to them. As has been argued in the literature, SCAI appears to give rise to some serious problems, which seem to suggest that SCAI-theorists should take their plural quantifier to be governed by some ‘weak’ plural comprehension principle and, thus, ‘exclude’ some kinds of pluralities from their plural ontology. The aim of this paper is to argue that, contrary to what may appear at first sight, the assumption of a weak plural comprehension principle is perfectly compatible with plural logic and the common uses of plural quantification. As I aim to show, SCAI-theorists can simply claim that their theory must be understood as formulated by means of the most ‘joint-carving’ plural quantifier, thus leaving open the possibility of other, less joint-carving, ‘unrestricted’ plural quantifiers. In the final part of the paper I will also suggest that SCAI-theorists should not only allow for singular quantification over pluralities of entities, but also for plural quantification over ‘super-pluralities’ of entities.
Grounding contingentism is the doctrine according to which grounds are not guaranteed to necessitate what they ground. In this paper I will argue that the most plausible version of contingentism is incompatible with the idea that the grounding relation is transitive, unless either ‘priority monism’ or ‘contrastivism’ is assumed.
In this paper I address two important objections to the theory called ‘Composition as Identity’ (CAI): the ‘wall-bricks-and-atoms problem’ (WaBrA), and the claim that CAI entails mereological nihilism. I aim to argue that the best version of CAI capable of addressing both problems is the theory I will call ‘Atomic Composition as Identity’ (ACAI), which consists in taking the plural quantifier to range only over proper pluralities of mereological atoms and every non-atomic entity to be identical to the plurality of atoms it fuses. I will proceed in three main steps. First, I will defend Sider’s (Composition as Identity, Oxford University Press, Oxford, pp 211–221, 2014) idea of weakening the comprehension principle for pluralities and I will show (2016a: 219–235) that it can ward off both the WaBrA problem and the threat of mereological nihilism. Second, I will argue that CAI-theorists should uphold an ‘atomic comprehension principle’ which, jointly with CAI, entails that there are only proper pluralities of mereological atoms. Finally, I will present a novel reading of the ‘one of’ relation that not only avoids the problems presented by Yi (Composition as Identity, Oxford University Press, Oxford, pp 169–191, 2014) and Calosi (2016b: 429–443; Am Philos Q 55: 281–292, 2018) but can also help ACAI-theorists to make sense of the idea that a composite entity is both one and many.
In this paper I will present three arguments (based on the notions of constitution, metaphysical reality, and truth, respectively) with the aim of shedding some new light on the structure of Fine’s (2005, 2006) ‘McTaggartian’ arguments against the reality of tense. Along the way, I will also (i) draw a novel map of the main realist positions about tense, (ii) unearth a previously unnoticed but potentially interesting form of external relativism (which I will label ‘hyper-presentism’) and (iii) sketch a novel interpretation of Fine’s fragmentalism (which I contrast with Lipman’s 2015, 2016b, forthcoming).
This paper is concerned with certain ontological issues in the foundations of geographic representation. It sets out what these basic issues are, describes the tools needed to deal with them, and draws some implications for a general theory of spatial representation. Our approach has ramifications in the domains of mereology, topology, and the theory of location, and the question of the interaction of these three domains within a unified spatial representation theory is addressed. In the final part we also consider the idea of non-standard geographies, which may be associated with geography under a classical conception in the same sense in which non-standard logics are associated with classical logic.
According to Composition is Identity, a whole is literally identical to the plurality of its parts. According to Mereological Nihilism, nothing has proper parts. In this note, it is argued that Composition is Identity can be shown to entail Mereological Nihilism in a much simpler and more direct way than the one recently proposed by Claudio Calosi.
What is the relation between parts taken together and the whole that they compose? The recent literature appears to be dominated by two different answers to this question, which are normally thought of as being incompatible. According to the first, parts taken together are identical to the whole that they compose. According to the second, the whole is grounded in its parts. The aim of this paper is to make some theoretical room for the view according to which parts ground the whole they compose while being, at the same time, identical to it.
The ability to provide an adequate supervenience base for tensed truths may seem to be one of the main theoretical advantages of both the growing-block and the moving-spotlight theory of time over presentism. However, in this paper I will argue that some propositions appear to be as problematic for growing-block theorists as past-directed propositions are for presentists, namely propositions stating that nothing will be the case in the future. Furthermore, I will show that the moving-spotlight theory can adequately address all the main supervenience challenges that can be levelled against A-theories of time. I will, thus, conclude that, at least as far as the supervenience principle is concerned, the moving-spotlight theory should be preferred over both presentism and the growing-block theory.
Fine (2005, 2006) has presented a ‘trilemma’ concerning the tense-realist idea that reality is constituted by tensed facts. According to Fine, there are only three ways out of the trilemma, consisting in what he takes to be the three main families of tense-realism: ‘presentism’, ‘(external) relativism’, and ‘fragmentalism’. Importantly, although Fine characterises tense-realism as the thesis that reality is constituted (at least in part) by tensed facts, he explicitly claims that tense realists are not committed to their fundamental existence. Recently, Correia and Rosenkranz (2011, 2012) have claimed that Fine’s tripartite map of tense realism is incomplete as it misses a fourth position they call ‘dynamic absolutism’. In this paper, I will argue that dynamic absolutists are committed to the irreducible existence of tensed facts and that, for this reason, they face a similar trilemma concerning the notion of fact-content. I will thus conclude that a generalised version of Fine’s trilemma, concerning both fact-constitution and fact-content, is indeed inescapable.
The possibility of changing the past by means of time-travel appears to depend on the possibility of distinguishing the past as it is ‘before’ and ‘after’ the time-travel. So far, all the metaphysical models that have been proposed to account for the possibility of past-changing time-travels draw this distinction by conceiving of time as multi-dimensional, and thus by significantly inflating our metaphysics of time. The aim of this article is to argue that there is an intuitive sense in which past-changing time-travels are metaphysically possible even in one-dimensional time.
Most of the theories of location on the market appear to be ideologically parsimonious at least in the sense that they take as primitive just one locative notion and define all the other locative notions in terms of it. Recently, however, the possibility of some exotic metaphysical scenarios involving gunky mixtures and extended simple regions of space has been argued to pose a significant threat to parsimonious theories of location. The aim of this paper is to show that a theory taking as primitive a notion of plural pervasive location and allowing for irreducibly plural locative facts can account for all the putatively problematic scenarios for parsimonious theories of location. Furthermore, I will also argue that the notion of plural pervasive location is compatible with the possibility of multilocation.
I present a ‘stage-theoretical’ interpretation of the supervaluationist semantics for the growing-block theory of time according to which the ‘nodes’ on the branching tree of historical possibilities are taken to be possible stages of the growth of the growing-block. As I will argue, the resulting interpretation (i) is very intuitive, (ii) can easily ward off an objection to supervaluationist treatments of the growing-block theory presented by Fabrice Correia and Sven Rosenkranz, and (iii) is also not saddled by the problems affecting the supervaluationist version of the growing-block theory defended by R. A. Briggs and Graeme A. Forbes.
The aim of this study is to address the “Grounding Grounding Problem,” that is, the question as to what, if anything, grounds facts about grounding. I aim to show that, if a seemingly plausible principle of modal recombination between fundamental facts and the principle customarily called “Entailment” are assumed, it is possible to prove not only that grounding facts featuring fundamental, contingent grounds are derivative but also that either they are partially grounded in the grounds they feature or they are “abysses”.
In its simplest form, a Spritz is an aperitif made with (sparkling) water and (white) wine. A ‘gunky Spritz’, as I will call it, is a Spritz in which the water and the wine are mixed through and through, so that every proper part of the Spritz has a proper part containing both water and wine. In the literature on the notion of location the possibility of mixtures like a gunky Spritz has been thought of as either threatening seemingly intuitive locative principles, or as requiring the positing of multiple primitive locative relations. In this paper I present a new theory of location which assumes as primitive only the notion of pervasive location and show that it can account for the possibility of a gunky Spritz in an intuitive and adequate way.
This paper presents a semantical analysis of the Weak Kleene Logics Kw3 and PWK from the tradition of Bochvar and Halldén. These are three-valued logics in which a formula takes the third value if at least one of its components does. The paper establishes two main results: a characterisation result for the relation of logical consequence in PWK – that is, we individuate necessary and sufficient conditions for a set of premises to entail a given conclusion in PWK.
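The contamination behaviour described above can be stated operationally. Below is a minimal sketch in Python (the encoding and function names are mine, not the paper's): every connective returns the third value as soon as one of its inputs carries it, and behaves classically otherwise.

```python
# Weak Kleene connectives: the third value U "contaminates"
# every compound formula in which it occurs.
U = "u"  # the non-classical, contaminating value

def neg(a):
    return U if a == U else (not a)

def conj(a, b):
    return U if U in (a, b) else (a and b)

def disj(a, b):
    return U if U in (a, b) else (a or b)

# Contamination at work: a single U-component infects the whole formula,
# even where strong Kleene logic would settle the value classically.
assert disj(True, U) == U   # strong Kleene would give True
assert conj(False, U) == U  # strong Kleene would give False
assert neg(U) == U
assert disj(True, False) is True  # classical on classical inputs
```

Note how the second assertion separates weak from strong Kleene semantics: a false conjunct does not "save" the formula once the third value appears anywhere in it.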
This paper discusses three relevant logics that obey Component Homogeneity - a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity - that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the other accomplishments of the paper, we generalize the semantics from Bochvar, Halldén, Deutsch, and Daniels, we provide a general recipe to define containment logics, and we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde and the connections between crossS*fde and the logic Eq of equality by Epstein. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results, and overview some open issues.
I argue that the growing-block theory of time and truthmaker maximalism jointly entail that some truthmakers undergo mereological change as time passes. Central to my argument is a grounding-based account of what I call the “purely incremental” nature of the growing-block theory of time. As I will show, the argument presented in this paper suggests that growing-block theorists endorsing truthmaker maximalism have reasons to take composition to be restricted and the “block” of reality to literally grow as time goes by.
Alessandro Torza argues that Ted Sider’s Lewisian argument against vague existence is insufficient to rule out the possibility of what he calls ‘super-vague existence’, that is, the idea that existence is higher-order vague, for all orders. In this chapter it is argued that the possibility of super-vague existence is ineffective against the conclusion of Sider’s argument, since super-vague existence cannot be consistently claimed to be a kind of linguistic vagueness. Torza’s idea of super-vague existence seems to be better suited to model vague existence under the assumption that vague existence is instead a form of ontic indeterminacy, contra what Ted Sider and David Lewis assume.
A wide family of many-valued logics—for instance, those based on the weak Kleene algebra—includes a non-classical truth-value that is ‘contaminating’ in the sense that whenever the value is assigned to a formula φ, any complex formula in which φ appears is assigned that value as well. In such systems, the contaminating value enjoys a wide range of interpretations, suggesting scenarios in which more than one of these interpretations is called for. This calls for an evaluation of systems with multiple contaminating values. In this paper, we consider the countably infinite family of multiple-conclusion consequence relations in which classical logic is enriched with one or more contaminating values whose behavior is determined by a linear ordering between them. We consider some motivations and applications for such systems and provide general characterizations for all consequence relations in this family. Finally, we provide sequent calculi for a pair of four-valued logics including two linearly ordered contaminating values before defining two-sided sequent calculi corresponding to each of the infinite family of many-valued logics studied in this paper.
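The linear ordering of contaminating values lends itself to a small illustration. In the Python sketch below, the dominance convention, namely that the higher-ranked contaminating value prevails in a compound, is an assumption of this illustration rather than a claim about the paper's definitions.

```python
# Classical values True/False plus contaminating values 1, 2, ...,
# linearly ordered. Assumed convention: a compound formula containing
# contaminating components takes the highest-ranked one among them.

def dominant(values):
    """Return the highest contaminating value among `values`, or None."""
    cs = [v for v in values if isinstance(v, int) and not isinstance(v, bool)]
    return max(cs) if cs else None

def conj(a, b):
    c = dominant((a, b))
    return c if c is not None else (a and b)

def disj(a, b):
    c = dominant((a, b))
    return c if c is not None else (a or b)

def neg(a):
    c = dominant((a,))
    return c if c is not None else (not a)

# With a single contaminating value this collapses to weak Kleene;
# with two or more, the higher value dominates the lower.
assert conj(True, 1) == 1
assert disj(1, 2) == 2
assert neg(2) == 2
assert conj(True, False) is False
```

The `isinstance` guard excludes `True`/`False` (which Python treats as ints) so that only the dedicated integer values count as contaminating.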
The supervaluationist approach to branching time (‘SBT-theory’) appears to be threatened by the puzzle of retrospective determinacy: if yesterday I uttered the sentence ‘It will be sunny tomorrow’ and it is sunny the next day only in some of the worlds overlapping at the context of utterance, my utterance is to be assessed as neither true nor false even if today is indeed a sunny day. John MacFarlane (“Truth in the Garden of Forking Paths” 81) has recently criticized a promising solution to this puzzle for falling short of an adequate account of ‘actually’. In this paper, I aim to rebut MacFarlane's criticism. To this effect, I argue that: (i) ‘actually’ can be construed either as an indexical or as a nonindexical operator; (ii) if ‘actually’ is nonindexical, MacFarlane's criticism is invalid; (iii) there appear to be independent reasons for SBT-theorists to claim that ‘actually’ is a nonindexical expression.
In this paper I start from a definition of “culture of the artificial”, which might be stated by referring to the background of philosophical, methodological, and pragmatic assumptions that characterizes the development of the information-processing analysis of mental processes and of some trends in contemporary cognitive science: in a word, the development of AI as a candidate science of mind. The aim of this paper is to show how (and with what plausibility and limitations) the discovery of the mentioned background might be dated back to a period preceding the cybernetic era, at least to the decade 1930–1940. Therefore a somewhat detailed analysis of Hull's “robot approach” is given, as well as of some of its independent and later developments. Reprinted in R.L. Chrisley (ed.), Artificial Intelligence: Critical Concepts in Cognitive Science, vol. 1, Routledge, London and New York, 2000, pp. 301–326.
The Tractatus de praedestinatione et de praescientia Dei respectu futurorum contingentium, composed by William of Ockham between 1321 and 1324, constitutes a crucial juncture in the medieval discussions of theological fatalism and of the questions it involves, such as knowledge of future contingents and the compatibility of divine foreknowledge with free will. Gathering and rethinking sources of diverse provenance, Robert Grosseteste and Peter Lombard first among them, the Venerabilis inceptor shifts the problem onto the epistemological and linguistic plane, addressing it from the standpoint of a propositional analysis of statements about future contingents. In this way he entrusts to the tools of logical argumentation and semantic inquiry the task of untangling the theological implications of the question, within a theory that guarantees both God's foreknowledge and free human will. The principle of the Ockhamist solution, which would become a point of reference (pro or contra) in the theological debates of the fourteenth century, consists in the interweaving of propositional analysis and logica fidei, in the name of a pragmatic solution to the compatibilist dilemma: as the exemplary case of prophecy shows, the statements of divine knowledge constitute the postulates of a logic of belief which then proceeds from those premises, through a chain of argument, to formulate the precepts that will guide the steps of the Christian in the world.
The volume makes available to the reader for the first time not only the first Italian translation of the Tractatus, but also a rich apparatus of texts (distinctions 38, 39 and 40 of the Ordinatio, chapters 7 and 27 of the Summa Logicae, quaestio IV.4 of the Quodlibeta, Quaestiones in Libros Physicorum 41 and 44, the prologue of the Expositio in libros Physicorum, and an excerpt from the Expositio in Librum Perihermeneias Aristotelis) that makes it possible to reconstruct a coherent Ockhamist theory of contingency and to shed light on a new interpretation of the thought of the English theologian and philosopher.
What are the relationships between an entity and the space at which it is located? And between a region of space and the events that take place there? What is the metaphysical structure of localization? What is its modal status? This paper addresses some of these questions in an attempt to work out at least the main coordinates of the logical structure of localization. Our task is mostly taxonomic. But we also highlight some of the underlying structural features and we single out the interactions between the notion of localization and nearby notions, such as the notions of part and whole, or of necessity and possibility. A theory of localization—we argue—is needed in order to account for the basic relations between objects and space, and runs afoul of a pure part-whole theory. We also provide an axiomatization of the relation of localization and examine cases of localization involving entities different from material objects.
This chapter analyzes the concept of an event and of event representation as an umbrella notion. It provides an overview of different ways events have been dealt with in philosophy, linguistics, and cognitive science. This variety of positions has been construed in part as the result of different descriptive and explanatory projects. It is argued that various types of notions — common-sense, theoretically revised, scientific, and internalist psychological — should be kept apart.
In this article, Zubiri's concept of history is expounded. We may view this concept, along with the rest of his philosophy, from a naturalistic perspective which contemplates history as a social dynamism grounded in nature, and not as an essentially spiritual phenomenon or a rational one endowed with teleological progress separate from nature. By means of fundamental concepts in Zubirian philosophy, such as “possibility”, “tradition”, “reality in a condition”, “capacitation”, and others, our philosopher distinguishes himself from other idealistic visions which analyze history in virtue of a teleological ideal external to the dynamism proper to the real. This is the case with Kant, who structures history starting from an ideal of ethical-political progress, which is for him already inscribed within human nature. And it is also the case with Hegel, for whom history is a rational process, which is supposed to reach the Absolute Spirit by means of the prominence of the State as the locus of Reason's revelation.
Traditional philosophical approaches to morals are grounded predominantly in metaphysical and theological concepts and theories. Among traditional ethical conceptions, the most prominent is the Divine Command Theory (DCT). According to the DCT, God gives moral foundations to humankind through its creation and through Revelation. Morality and divinity have been inseparable since the most remote civilizations. These concepts are embedded in a theological framework and are largely accepted by most followers of the three Abrahamic traditions (Judaism, Christianity, and Islam): the greater part of the human population. Holding faith and Revelation as its grounds, the Divine Command Theory is not strictly subject to demonstration. Opponents of the Divine Command conception of morals, pointing to the impossibility of demonstrating its metaphysical and religious assumptions, have tried for many centuries (albeit unsuccessfully) to devalue its importance. They have argued that it shows no material evidence or logical coherence and, for this reason, cannot be taken into account for scientific or philosophical purposes: it is just a belief and should be understood as such. Besides these extreme oppositions, many other conceptions contravene Divine Command theories, in one way or another, in part or in full. Many philosophers and social scientists, from classical Greek philosophy up to the present day, sustain for instance that morality is only a construct, and thus culturally relative and culturally determined. However, this raises many further discussions and imposes the challenge of determining what culture means, which elements of culture are morally determinant, and, finally, what the boundaries of such relativity are. Moral determinists claim that everything related to human behavior, including morality, is determined, since free will does not exist.
More recently, modern thinkers have argued that there is a strict science of morality. However, the scientific method alone, despite explaining several facts and pieces of evidence, cannot illuminate the entire content and full meaning of ethics. Understanding morals requires a broader perception, and an agreement among philosophers that they have never achieved. All of these questions take many different configurations depending on each philosophical strand, and they open complex analyses and endless debates, since many of them are reciprocally conflictive. The universe and atmosphere surrounding this thesis are the dominions of all these conceptual conflicts, observed from an objective and evolutionary standpoint. Irrespective of this circumstance and its intrinsic importance, however, these questions lie far from the methodological approach of an analytical discussion of objective morals, which is, indeed, the aim and scope of this work. We briefly revisit these prominent traditional theories because this thesis contains a comparative study, and its assumptions differ profoundly from all traditional theories. It therefore becomes necessary to offer the reader direct and specific elements of comparison, allowing proper criticism without interruptive research. Even so, while revisiting the traditional theories for this comparative and critical purpose, we will keep them to the side of our main concerns, as “aliena materia.” Irrespective of the validity of any or all of the elements of this discussion, and of their meaning as the philosophical universe of this thesis, the purpose of this work is to demonstrate and justify the existence and meaning of prehistoric moral archetypes arising directly from the very first social needs and efforts for survival.
These archetypes define the essential foundation of ethics, its aggregation to the collective unconscious, and the corresponding logical organization and transmission through evolutionary stages of the human genome and different space-time relations, irrespective of any contemporary experience of the individuals. The system defined by these archetypes composes an evolutionary human social model. Is this a metaethical position? Yes, it is. Moreover, as in any metaethical reasoning, we should look carefully for the best and most coherent routes, as Analytical Philosophy offers them. Thus, this work should reasonably demonstrate that morals are not a cultural product of civilized men or modern societies and that, despite being subject to several culturally relative aggregations and subtractions, their essential foundations are archetypal and have never structurally changed. This reasoning suggests that morality is an original attribute of “homo sapiens”; it is not a property, nor an accident: it integrates the human essence and belongs to the realm of ontological human identity. The human phenomenon is a continuing process, playing its role between random determination and free will, and we need to ask how morality began and how it has come down to us in the present.
Since the early eighties, computationalism in the study of the mind has been “under attack” by several critics of the so-called “classic” or “symbolic” approaches in AI and cognitive science. Computationalism was generically identified with such approaches. For example, it was identified with both Allen Newell and Herbert Simon's Physical Symbol System Hypothesis and Jerry Fodor's theory of the Language of Thought, usually without taking into account the fact that such approaches are very different as to their methods and aims. Zenon Pylyshyn, in his influential book Computation and Cognition, claimed that both Newell and Fodor deeply influenced his ideas on cognition as computation. This probably added to the confusion, as many people still consider Pylyshyn's book as paradigmatic of the computational approach in the study of the mind. Since then, cognitive scientists, AI researchers, and also philosophers of mind have been asked to take sides on different “paradigms” that have from time to time been proposed as opponents of (classic or symbolic) computationalism. Examples of such oppositions are: computationalism vs. connectionism, computationalism vs. dynamical systems, computationalism vs. situated and embodied cognition, and computationalism vs. behavioural and evolutionary robotics. Our preliminary claim in section 1 is that computationalism should not be identified with what we would call the “paradigm (based on the metaphor) of the computer” (in the following, PoC). PoC is the (rather vague) statement that the mind functions “as a digital computer”. Actually, PoC is a restrictive version of computationalism, and nobody has ever seriously upheld it, except in some rough versions of the computational approach and in some popular discussions about it. Usually, PoC is used as a straw man in many arguments against computationalism. In section 1 we look in some detail at PoC's claims and argue that computationalism cannot be identified with PoC.
In section 2 we point out that certain anticomputationalist arguments are based on this misleading identification. In section 3 we suggest that the view of the levels of explanation proposed by David Marr could clarify certain points of the debate on computationalism. In section 4 we touch on a controversial issue, namely the possibility of developing a notion of analog computation, similar to the notion of digital computation. A short conclusion follows in section 5.
Classically, truth and falsehood are opposite, and so are logical truth and logical falsehood. In this paper we imagine a situation in which the opposition is so pervasive in the language we use as to threaten the very possibility of telling truth from falsehood. The example exploits a suggestion of Ramsey’s to the effect that negation can be expressed simply by writing the negated sentence upside down. The difference between ‘p’ and ‘~~p’ disappears, the principle of double negation becomes trivial, and the truth/falsehood opposition is up for grabs. Our moral is that this indeterminacy undermines the idea of inferential role semantics.
Since the second half of the twentieth century, researchers in cybernetics and AI, neural nets and connectionism, Artificial Life and new robotics have endeavoured to build machines that could simulate functions of living organisms, such as adaptation and development, problem solving and learning. In this book these research programs are discussed, particularly as regards the epistemological issues of behaviour modelling. One of the main novelties of this book is that certain projects involving the building of simulative machine models before the advent of cybernetics are investigated for the first time, on the basis of little-known, and sometimes completely forgotten or unpublished, texts and figures. These pre-cybernetics projects can be considered steps toward the “discovery” of a modelling methodology that has been fully developed by those more recent research programs, and that shares some of their central goals and key methodological proposals. More info at the Springer link: http://www.springer.com/new+%26+forthcoming+titles+%28default%29/book/978-1-4020-0606-7 This book is the English translation of La scoperta dell'artificiale, Dunod/Masson, Milan, 1998.
The aim of this article is to propose a novel supervaluationist theory of ‘actually’ in the open future. First, I will argue that any adequate theory of actuality in a branching setting must comply with three main desiderata. Second, I will prove that none of the actuality operators that have been proposed in the literature is up to the task. Finally, I will propose a novel theory of actuality in the open future combining one of the existing definitions of the actuality operator with a new definition of the historical possibility operator, and argue for its adequacy. The central feature of the theory I will advance is the introduction of an actuality parameter capable of being shifted by the historical possibility operator. I will argue that this account appears to be consistent not only with the idea that the future is genuinely open, but also with the general idea that ‘actually’ is, in a relevant sense, a ‘rigid’ operator.
Ordinary reasoning about space—we argue—is first and foremost reasoning about things or events located in space. Accordingly, any theory concerned with the construction of a general model of our spatial competence must be grounded on a general account of the sort of entities that may enter into the scope of the theory. Moreover, on the methodological side the emphasis on spatial entities (as opposed to purely geometrical items such as points or regions) calls for a reexamination of the conceptual categories required for this task. Building on material presented in an earlier paper, in this work we offer some examples of what this amounts to, of the difficulties involved, and of the main directions along which spatial theories should be developed so as to combine formal sophistication with some affinity with common sense.
One of the main epistemological problems to be found in the study of thought experiments concerns the criterion for choosing between cases that produce contradictory results. What justification do we have for choosing between C and ~C? James Roberto Brown argues for a probability criterion, whereby probabilities are assigned to the two phenomena and we accept the one more likely to be true. John Norton argues that thought experiments are arguments, and can therefore be reconstructed as arguments. Through this process we can find the flaws in the experiment’s argumentation and decide, through logical justification, which conclusion is correct. Brown’s approach proves rather unfruitful, since it specifies neither how the probability calculus is supposed to work nor how probabilities can be assigned to the truth of the results of thought experiments. On the other hand, to accept Norton’s answer we must accept that thought experiments really do make use of arguments and that this is all we can say about them. This framework allows no other epistemic source to justify the results of thought experiments.
In this article I examine the main conceptions of public reason in contemporary political philosophy in order to set the frame for appreciating the novelty of the pragmatist understanding of public reason as based upon the notion of consequences and upon a theory of rationality as inquiry. The approach is inspired by Dewey but is free from any concern with the history of philosophy. The aim is to propose a different understanding of the nature of public reason, one that overcomes the limitations of the existing approaches. Public reason is presented as the proper basis for discussing contested issues in the broad frame of deep democracy.
This is a position article summarizing our approach to the philosophy of space and spatial representation. Our concern is mostly methodological: above all, we argue that a number of philosophical puzzles that arise in this field—puzzles concerning the nature of spatial entities, their material and mereological constitution, their relationship with the space that they occupy—stem from a confusion between semantic issues and true metaphysical concerns.
We introduce and discuss a knowledge-driven distillation approach to explaining black-box models by means of two kinds of interpretable models. The first is perceptron (or threshold) connectives, which enrich knowledge representation languages such as Description Logics with linear operators that serve as a bridge between statistical learning and logical reasoning. The second is Trepan Reloaded, an approach that builds post-hoc explanations of black-box classifiers in the form of decision trees enhanced by domain knowledge. Our aim is, firstly, to target a model-agnostic distillation approach exemplified with these two frameworks, secondly, to study how these two frameworks interact on a theoretical level, and, thirdly, to investigate use-cases in ML and AI in a comparative manner. Specifically, we envision that user-studies will help determine human understandability of explanations generated using these two frameworks.
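The surrogate-tree idea underlying approaches of this kind can be sketched in a few lines: train an interpretable tree not on the true labels but on the black box's predictions, then measure how faithfully it mimics the model. This is a generic, minimal sketch using scikit-learn stand-ins (a random forest as the "black box", synthetic data, a depth-limited tree as surrogate); it is not the Trepan Reloaded implementation, which additionally incorporates domain knowledge and its own tree-growing strategy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Illustrative "black box": a random forest trained on synthetic data.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Distillation: fit a shallow, interpretable tree on the black box's
# *predictions* rather than the true labels, so the tree mimics the model.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# Fidelity: how often the surrogate agrees with the black box on the data.
fidelity = np.mean(surrogate.predict(X) == black_box.predict(X))
```

The fidelity score (agreement with the black box, not accuracy on the true labels) is the standard figure of merit for such surrogates.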
An imaginary dialogue between Andrea Bonomi and Gonzalo Pirobutirro (the main character of Gadda’s novel La cognizione del dolore) aiming to challenge Bonomi’s tenet that a work of fiction defines a domain of objects which is closed with respect to the actual world.
A dialogue between a figure and its background, illustrating that the perceptual conditions that determine which is which are not as clear as standard Gestalt theory dictates.
Axiom weakening is a technique that allows for a fine-grained repair of inconsistent ontologies. Its main advantage is that it repairs ontologies by making axioms less restrictive rather than by deleting them, employing the use of refinement operators. In this paper, we build on previously introduced axiom weakening for ALC, and make it much more irresistible by extending its definitions to deal with SROIQ, the expressive and decidable description logic underlying OWL 2 DL. We extend the definitions of refinement operator to deal with SROIQ constructs, in particular with role hierarchies, cardinality constraints and nominals, and illustrate its application. Finally, we discuss the problem of termination of an iterated weakening procedure.
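The core "generalise instead of delete" idea can be illustrated with a toy sketch: given a subsumption axiom A ⊑ B, an upward refinement operator replaces B with a more general concept, making the axiom easier to satisfy. Everything here is a hypothetical miniature — a hand-written parent map stands in for a genuine refinement operator over SROIQ, and the concept names are invented for illustration.

```python
# Toy "ontology": subsumption axioms as (subclass, superclass) pairs,
# with a hand-written generalisation hierarchy standing in for a
# refinement operator over a real description logic.
parents = {
    "Penguin": "Bird", "Bird": "Animal", "Animal": "Thing",
    "FlyingThing": "Thing",
}

def generalise(concept):
    """Upward refinement: replace a concept with its direct parent
    (Thing, the top concept, generalises to itself)."""
    return parents.get(concept, "Thing")

def weaken(axiom):
    """Weaken an axiom 'A ⊑ B' by generalising its right-hand side,
    making it less restrictive instead of deleting it outright."""
    a, b = axiom
    return (a, generalise(b))

# A problematic axiom: weakening its superclass up to Thing makes it
# trivially satisfiable while keeping the axiom in the ontology.
weakened = weaken(("Penguin", "FlyingThing"))
```

In the actual framework, refinement operators are defined over the full concept language (role hierarchies, cardinality constraints, nominals), not a flat parent map; the sketch only shows the direction of the repair.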
The book is divided into three parts. The first, containing three papers, focuses on the characterization of the central tenets of presentism (by Neil McKinnon) and eternalism (by Samuel Baron and Kristie Miller), and on the ‘sceptical stance’ (by Ulrich Meyer), a view to the effect that there is no substantial difference between presentism and eternalism. The second and main section of the book contains three pairs of papers that bring the main problems with presentism to the fore and outline its defence strategies. Each pair of papers in this section can be read as a discussion between presentists and eternalists, wherein each directly responds to the arguments and objections offered by the other. This is a discussion that is sometimes absent in the literature, or which is at best carried out in a fragmented way. The first two papers of the section deal with the problem of the compatibility of Special Relativity Theory (SRT) and presentism. SRT is often considered to be a theory that contradicts the main tenet of presentism, thereby rendering presentism at odds with one of our most solid scientific theories. Christian Wüthrich’s paper presents arguments for the incompatibility of the two theories (SRT and presentism) within a new framework that includes a discussion of further complications arising from the theory of Quantum Mechanics. Jonathan Lowe’s paper, by contrast, develops new general arguments against the incompatibility thesis and replies to Wüthrich’s paper. The second pair of papers focuses on the problem that presentists face in providing grounds for past tensed truths. In the first (by Matthew Davidson), new arguments are provided to defend the idea that the presentist cannot adequately explain how what is now true about the past is grounded, since for the presentist the past is completely devoid of ontological ground.
The second paper (by Brian Kierland) takes up the challenge of developing a presentist explanation of past truths, beginning by outlining some existing views in the literature before advancing an original proposal.
Leibniz’s question “why is there something rather than nothing?”, also known as the Primordial Existential Question, has often been the focus of intense philosophical controversy. While some authors take it to pose a profound metaphysical puzzle, others denounce the alleged lack of meaning or the inconceivability of the idea of nothingness. In a series of articles, Adolf Grünbaum develops an empirically informed critique with the aim of demonstrating that the Primordial Existential Question poses a “non-issue” which does not require explanation. Grünbaum’s critique prompted heated debates in the recent literature. In this paper, I examine each step of Grünbaum’s reasoning and argue that it fails to show that the Primordial Existential Question is ill-founded. Moreover, I identify and rebut several strategies that one may employ to amend Grünbaum’s critique. In doing so, I address various issues related to the Primordial Existential Question, including the alleged need for its proponents to rely on contentious metaphysical presuppositions and the purported availability of empirical evidence which answers or dissolves such a question.
In this paper I investigate Aristotle’s account of predication in Topics I 9. I argue for the following interpretation. In this chapter Aristotle (i) presents two systems of predication cutting across each other, the system of the four so-called ‘predicables’ and that of the ten ‘categories’, in order to distinguish them and explore their mutual relationship. I propose a semantic interpretation of the relationship between them. According to this reading, every proposition formed through a predicable constitutes at the same time a predication according to one of the ten categories, and, consequently, signifies one of them, expressing one of the predicative relationships conveyed by them. Further, Aristotle (ii) explains the predicative connection between these two systems and the ten items signified by the ‘things said without any combination’ enumerated in Chapter 4 of the Categories, whose list is almost identical with that of the categories in Top. I 9, with the sole exception of their first members.
How can responsibility be achieved through philosophical dialogue? This is indeed a core issue of today’s democracy. In this regard, education plays an important role.
The term “Complex Systems Biology” was introduced a few years ago [Kaneko, 2006] and, although not yet in widespread use, it seems particularly well suited to indicate an approach to biology which is well rooted in complex systems science. Although broad generalizations are always dangerous, it is safe to state that mainstream biology has been largely dominated by a gene-centric view in recent decades, due to the success of molecular biology. Thus the one gene - one trait approach, which has often proved to be effective, has been extended to cover even complex traits. This simplifying view has been appropriately criticized, and the movement called systems biology has taken off. Systems biology [Noble, 2006] emphasizes the presence of several feedback loops in biological systems, which severely limit the range of validity of explanations based upon linear causal chains (e.g. gene → behaviour). Mathematical modelling is one of the favourite tools of systems biologists for analyzing the possible effects of competing negative and positive feedback loops, which can be observed at several levels (from molecules to organelles, cells, tissues, organs, organisms, ecosystems). Systems biology is by now a well-established field, as can be inferred from the rapid growth in the number of conferences and journals devoted to it, as well as from the existence of several grants and funded projects. Systems biology is mainly focused upon the description of specific biological items, like for example specific organisms, or specific organs in a class of animals, or specific genetic-metabolic circuits. It therefore leaves open the issue of the search for general principles of biological organization, which apply to all living beings or at least to broad classes. We know indeed that there are some principles of this kind, biological evolution being the most famous one. The theory of cellular organization also qualifies as a general principle.
But the main focus of biological research has been that of studying specific cases, with some reluctance to accept (and perhaps a limited interest in) broad generalizations. This may however change, and this is indeed the challenge of complex systems biology: looking for general principles in biological systems, in the spirit of complex systems science, which searches for similar features and behaviours in various kinds of systems. The hope of finding such general principles appears well founded, and I will show in Section 2 that there are indeed data which provide support for this claim. Besides data, there are also general ideas and models concerning the way in which biological systems work. The strategy, in this case, is that of introducing simplified models of biological organisms or processes, and looking for their generic properties: this term, borrowed from statistical physics, is used for those properties which are shared by a wide class of systems. In order to model these properties, the most effective approach has so far been that of using ensembles of systems, where each member can differ from the others, and looking for those properties which are widespread. This approach was introduced many years ago [Kauffman, 1971] in modelling gene regulatory networks. At that time one had very little information about the way in which the expression of a given gene affects that of other genes, apart from the fact that this influence is real and can be studied in a few selected cases (like e.g. the lactose metabolism in E. coli). Today, after many years of triumphs of molecular biology, much more has been discovered; however, the possibility of describing a complete map of gene-gene interactions in a moderately complex organism is still out of reach. Therefore the goal of fully describing a network of interacting genes in a real organism could not be (and still cannot be) achieved.
But a different approach has proven very fruitful, that of asking what are the typical properties of such a set of interacting genes. Making some plausible hypotheses and introducing some simplifying assumptions, Kauffman was able to address some important problems. In particular, he drew attention to the fact that a dynamical system of interacting genes displays self-organizing properties which explain some key aspects of life, most notably the existence of a limited number of cellular types in every multicellular organism (these numbers are of the order of a few hundred, while the number of theoretically possible types, absent interactions, would be much larger than the number of protons in the universe). In section 3 I will describe the ensemble-based approach in the context of gene regulatory networks, and I will show that it can describe some important experimental data. Finally, in section 4 I will discuss some methodological aspects.
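The ensemble-based approach described above can be sketched concretely with a minimal Kauffman-style random Boolean network: each gene receives k random inputs and a random Boolean update rule, the whole state space is explored, and the distinct attractors (cycles of the dynamics) are collected — these attractors are the model's analogue of cellular types. This is a toy sketch under our own naming and parameter choices, not Kauffman's original implementation.

```python
import itertools
import random

def random_boolean_network(n, k, rng):
    """Build a random Boolean network: each of n genes gets k random
    input genes and a random Boolean update function (a lookup table)."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every gene from its inputs' current values."""
    new = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for g in ins:
            idx = (idx << 1) | state[g]
        new.append(table[idx])
    return tuple(new)

def attractors(n, k, seed=0):
    """Enumerate all 2**n states and collect the distinct cycles
    (attractors) that trajectories eventually fall into."""
    rng = random.Random(seed)
    inputs, tables = random_boolean_network(n, k, rng)
    found = set()
    for state in itertools.product((0, 1), repeat=n):
        seen = {}
        while state not in seen:
            seen[state] = len(seen)
            state = step(state, inputs, tables)
        # the cycle consists of the states from the first repeat onward
        cycle = frozenset(s for s, t in seen.items() if t >= seen[state])
        found.add(cycle)
    return found

atts = attractors(8, 2, seed=1)
```

For networks of this size the number of attractors is typically small compared with the 2**8 = 256 possible states, which is the qualitative point of Kauffman's argument about the limited number of cell types.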
In a series of ten articles from leading American and European scholars, Pragmatist Epistemologies explores the central themes of epistemology in the pragmatist tradition through a synthesis of new and old pragmatist thought, engaging contemporary issues while exploring them from a historical perspective. It opens a new avenue of research in contemporary pragmatism, continuous with the main figures of the pragmatist tradition and incorporating contemporary trends in philosophy. Students and scholars of American philosophy will find this book indispensable.