The objective of Working Group 4 of the COST Action NET4Age-Friendly is to examine existing policies, advocacy and funding opportunities and to build up relations with policy makers and funding organisations. It also aims to synthesize and improve existing knowledge and models in order to develop effective business and evaluation models, to guarantee quality and education, and to ensure proper dissemination and the future of the Action. The Working Group further aims to enable capacity building to improve interdisciplinary participation, to promote knowledge exchange and to foster a cross-European interdisciplinary research capacity, to improve cooperation and co-creation with cross-sector stakeholders, and to introduce students to, and educate them on, SHAFE implementation and sustainability. To enable the achievement of the objectives of Working Group 4, the Leader of the Working Group, the Chair and Vice-Chair, in close cooperation with the Science Communication Coordinator, developed a template to map the current state of SHAFE policies, funding opportunities and networking in the COST member countries of the Action. On invitation, the Working Group lead received contributions from 37 countries, provided by a total of 85 Action members. The contributions give an overview of the diversity of SHAFE policies and opportunities in Europe and beyond. They were not edited or revised and reflect the main areas of expertise and knowledge of the contributors; gaps in areas or content are therefore possible and will be further explored in the subsequent work and reports of this WG. Nevertheless, this preliminary mapping is essential for proceeding with the WG activities. In the following chapters, an introduction to the need for SHAFE policies is presented, followed by a summary of the main approaches to be pursued in the next period of work. The deliverable closes with the capacity-building, networking and funding opportunities that will be relevant to pursue within the frame of Working Group 4 and the COST Action as a whole. All country contributions are presented in the annex of this deliverable.
In this paper we introduce three methods to approach philosophical problems informationally: Minimalism, the Method of Abstraction and Constructionism. Minimalism considers the specifications of the starting problems and systems that are tractable for a philosophical analysis. The Method of Abstraction describes the process of making explicit the level of abstraction at which a system is observed and investigated. Constructionism provides a series of principles that the investigation of the problem must fulfil once it has been fully characterised by the previous two methods. For each method, we also provide an application: the problem of visual perception, functionalism, and the Turing Test, respectively.
The present paper discusses different approaches to metaphysics and defends a specific, non-deflationary approach that nevertheless qualifies as scientifically-grounded and, consequently, as acceptable from the naturalistic viewpoint. By critically assessing some recent work on science and metaphysics, we argue that such a sophisticated form of naturalism, which preserves the autonomy of metaphysics as an a priori enterprise yet pays due attention to the indications coming from our best science, is not only workable but recommended.
The paper investigates the ethics of information transparency (henceforth transparency). It argues that transparency is not an ethical principle in itself but a pro-ethical condition for enabling or impairing other ethical practices or principles. A new definition of transparency is offered in order to take into account the dynamics of information production and the differences between data and information. It is then argued that the proposed definition provides a better understanding of what sort of information should be disclosed and what sort of information should be used in order to implement and make effective the ethical practices and principles to which an organisation is committed. The concepts of “heterogeneous organisation” and “autonomous computational artefact” are further defined in order to clarify the ethical implications of the technology used in implementing information transparency. It is argued that explicit ethical designs, which describe how ethical principles are embedded into the practice of software design, would represent valuable information that could be disclosed by organisations in order to support their ethical standing.
The article firstly outlines the essential contents of Nicolai Hartmann's "Possibility and Actuality" (1938), on the occasion of its recent Italian translation ("Possibilità e realtà", Milano-Udine 2018). Secondly, it discusses the main contents of "Nicolai Hartmanns Neue Ontologie und die Philosophische Anthropologie" (Berlin-Boston 2019), the outcome of a conference devoted to Hartmann's philosophy, regarded in the context of twentieth-century philosophical anthropology (Max Scheler, Helmuth Plessner).
It is often claimed that the greatest value of the Bayesian framework in cognitive science consists in its unifying power. Several Bayesian cognitive scientists assume that unification is obviously linked to explanatory power. But this link is not obvious, as unification in science is a heterogeneous notion, which may have little to do with explanation. While a crucial feature of most adequate explanations in cognitive science is that they reveal aspects of the causal mechanism that produces the phenomenon to be explained, the kind of unification afforded by the Bayesian framework to cognitive science does not necessarily reveal aspects of a mechanism. Bayesian unification, nonetheless, can place fruitful constraints on causal-mechanical explanation. Outline: 1 Introduction; 2 What a Great Many Phenomena Bayesian Decision Theory Can Model; 3 The Case of Information Integration; 4 How Do Bayesian Models Unify?; 5 Bayesian Unification: What Constraints Are There on Mechanistic Explanation? (5.1 Unification constrains mechanism discovery; 5.2 Unification constrains the identification of relevant mechanistic factors; 5.3 Unification constrains confirmation of competitive mechanistic models); 6 Conclusion; Appendix.
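To make concrete the kind of Bayesian model at stake in the information-integration case, here is a minimal sketch of optimal cue combination: two independent, noisy Gaussian estimates of the same quantity are merged by inverse-variance weighting. The scenario, the numbers, and the function name are illustrative assumptions, not taken from the paper.

```python
def integrate_cues(mu_a, var_a, mu_b, var_b):
    """Bayes-optimal combination of two independent Gaussian cues:
    each cue is weighted in proportion to its inverse variance (reliability)."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    mu_combined = w_a * mu_a + w_b * mu_b
    var_combined = 1 / (1 / var_a + 1 / var_b)
    return mu_combined, var_combined

# Illustrative values: a visual and a haptic estimate of an object's size (in cm).
mu, var = integrate_cues(mu_a=10.0, var_a=1.0, mu_b=12.0, var_b=4.0)
print(f"combined estimate: {mu:.2f} cm, variance: {var:.2f}")  # 10.40 cm, 0.80
```

The combined variance is always lower than that of either cue alone, which is the signature prediction such models share across the many phenomena they are used to model; whether this unification is genuinely explanatory is precisely what the paper questions.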
The free-energy principle states that all systems that minimize their free energy resist a tendency to physical disintegration. Originally proposed to account for perception, learning, and action, the free-energy principle has been applied to the evolution, development, morphology, anatomy and function of the brain, and has been called a postulate, an unfalsifiable principle, a natural law, and an imperative. While it might afford a theoretical foundation for understanding the relationship between environment, life, and mind, its epistemic status is unclear. Also unclear is how the free-energy principle relates to prominent theoretical approaches to life science phenomena, such as organicism and mechanism. This paper clarifies both issues, and identifies limits and prospects for the free-energy principle as a first principle in the life sciences.
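As a rough illustration of the quantity the principle invokes, the following sketch computes variational free energy for a toy discrete generative model, using the standard decomposition into expected energy minus entropy; when the approximate posterior equals the exact posterior, free energy equals negative log evidence. The model, its probabilities, and the function name are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

def variational_free_energy(q, prior, likelihood, obs):
    """Variational free energy F = E_q[log q(s)] - E_q[log p(obs, s)]
    for a single discrete hidden state s and one observed outcome."""
    joint = likelihood[obs] * prior          # p(obs, s) for each value of s
    energy = -np.sum(q * np.log(joint))      # expected negative log joint ("energy")
    entropy = -np.sum(q * np.log(q))         # entropy of the approximate posterior
    return energy - entropy                  # F is an upper bound on -log p(obs)

# Toy generative model: two hidden states, two possible observations.
prior = np.array([0.5, 0.5])                 # p(s)
likelihood = np.array([[0.9, 0.2],           # p(o=0 | s=0), p(o=0 | s=1)
                       [0.1, 0.8]])          # p(o=1 | s=0), p(o=1 | s=1)
obs = 0

# With the exact posterior, F attains its minimum, -log p(obs).
posterior = likelihood[obs] * prior
posterior /= posterior.sum()
print(variational_free_energy(posterior, prior, likelihood, obs))  # ~0.598
print(-np.log((likelihood[obs] * prior).sum()))                    # ~0.598
```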
In the paper it is argued that bridging the digital divide may cause a new ethical and social dilemma. Using Hardin's Tragedy of the Commons, we show that an improper opening and enlargement of the digital environment (Infosphere) is likely to produce a Tragedy of the Digital Commons (TDC). In the course of the analysis, we explain why Adar and Huberman's previous use of Hardin's Tragedy to interpret certain recent phenomena in the Infosphere (especially peer-to-peer communication) may not be entirely satisfactory. We then seek to provide an improved version of the TDC that avoids the possible shortcomings of their model. Next, we analyse some problems encountered by the application of classical ethics in the resolution of the TDC. In the conclusion, we outline the kind of work that will be required to develop an ethical approach that may bridge the digital divide but avoid the TDC.
Courtesy of its free energy formulation, the hierarchical predictive processing theory of the brain (PTB) is often claimed to be a grand unifying theory. To test this claim, we examine a central case: activity of mesocorticolimbic dopaminergic (DA) systems. After reviewing the three most prominent hypotheses of DA activity—the anhedonia, incentive salience, and reward prediction error hypotheses—we conclude that the evidence currently vindicates explanatory pluralism. This vindication implies that the grand unifying claims of advocates of PTB are unwarranted. More generally, we suggest that the form of scientific progress in the cognitive sciences is unlikely to be a single overarching grand unifying theory.
In recent work, the interrelated questions of whether there is a fundamental level to reality, whether ontological dependence must have an ultimate ground, and whether the monist thesis should be endorsed that the whole universe is ontologically prior to its parts have been explored with renewed interest. Jonathan Schaffer has provided arguments in favour of 'priority monism' in a series of articles (2003, 2004, 2007a, 2007b, forthcoming). In this paper, these arguments are analysed, and it is claimed that they are not compelling: in particular, the possibility that there is no ultimate level of basic entities that compose everything else is on a par with the possibility of infinite 'upward' complexity. The idea that we must, at any rate, postulate an ontologically fundamental level for methodological reasons (Cameron 2008) is also discussed and found unconvincing: all things considered, there may be good reasons for endorsing 'metaphysical infinitism'. In any event, a higher degree of caution in formulating metaphysical claims than found in the extant literature appears advisable.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by typicality inclusions of the form p :: T(C) ⊑ D, whose intuitive meaning is that “we believe with degree p about the fact that typical Cs are Ds”. As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete as for the underlying standard Description Logic ALC.
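To illustrate the distributed-semantics step described above, here is a minimal sketch of how scenarios and their probabilities could be enumerated: each typicality inclusion p :: T(C) ⊑ D is kept or discarded independently, and a scenario's probability is the product of p for the kept inclusions and 1 − p for the discarded ones. The example inclusions and all names are illustrative assumptions, not the authors' implementation.

```python
from itertools import chain, combinations

# Illustrative typicality inclusions "p :: T(C) ⊑ D" with their degrees of belief p.
inclusions = [
    ("T(Bird) ⊑ Flies", 0.9),
    ("T(Penguin) ⊑ ¬Flies", 0.8),
    ("T(Bird) ⊑ HasFeathers", 0.95),
]

def powerset(indices):
    """All subsets of a sequence of indices."""
    return chain.from_iterable(combinations(indices, r) for r in range(len(indices) + 1))

def scenarios(inclusions):
    """Each scenario keeps a subset of the inclusions; its probability is the
    product of p for the kept inclusions and (1 - p) for the discarded ones,
    assuming independence as in the distributed semantics."""
    for kept in powerset(range(len(inclusions))):
        prob = 1.0
        for i, (_, p) in enumerate(inclusions):
            prob *= p if i in kept else (1 - p)
        yield [inclusions[i][0] for i in kept], prob

for kept, prob in scenarios(inclusions):
    print(f"{prob:.4f}  {kept}")

# Sanity check: the probabilities of all 2^n scenarios sum to 1.
assert abs(sum(p for _, p in scenarios(inclusions)) - 1.0) < 1e-9
```

Reasoning about what typically holds of a combined concept then amounts to looking at which inclusions survive in the relevant scenarios, weighted by these probabilities.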
In the last decade, structural realism has been presented as the most promising strategy for developing a defensible realist view of science. Nevertheless, controversy still continues in relation to the exact meaning of the proposed structuralism. The stronger version of structural realism, the so-called ontic structural realism, has been argued for on the basis of some ideas related to quantum mechanics. In this paper, I will first outline these arguments, mainly developed by Steven French and James Ladyman, then challenge them, putting a particular emphasis on a metaphysical principle which, even though it is crucial for the whole argument, hasn't been, in my opinion, clearly stated and examined yet. My overall view will be that a weaker version of the form of realism we are considering is more plausible – namely, epistemic structural realism.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + TR, based on the notion of rational closure, by inclusions p :: T(C) ⊑ D (“we have probability p that typical Cs are Ds”), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete, as in ALC.
Ian Hacking’s Representing and Intervening is often credited as being one of the first works to focus on the role of experimentation in philosophy of science, catalyzing a movement which is sometimes called the “philosophy of experiment” or “new experimentalism”. In the 1980s, a number of other movements and scholars also began focusing on the role of experimentation and instruments in science. Philosophical study of experimentation has thus seemed to be an invention of the 1980s whose central figure is Hacking. This article aims to assess this historical claim, made by Hacking himself as well as others. It does so first by highlighting how a broader perspective on the history of philosophy reveals this invention narrative to be incorrect, since experimentation was a topic of interest for earlier philosophers. Secondly, the article evaluates a revision of this historical claim also made by some philosophers of experiment: the rediscovery narrative, which frames Hacking and others as having rediscovered the work of these earlier authors. This second narrative faces problems as well. We therefore develop a third narrative, which we call the contextualist narrative. Rather than considering experimentation in an essentialist manner as a fixed research object that is either present or not in the work of specific authors, experimentation should be addressed through a narrative that asks in what way it becomes a philosophical problem for certain authors and for what purpose. Such contextualization enables a repositioning of Hacking’s philosophy of experiment in relation to the specific debates in which he intervened, such as the realism-antirealism debate, the Science Wars and the debate on incommensurability.
Teller argued that violations of Bell’s inequalities are to be explained by interpreting quantum entangled systems according to ‘relational holism’, that is, by postulating that they exhibit irreducible (‘inherent’) relations. Teller also suggested a possible application of this idea to quantum statistics. However, the basic proposal was not explained in detail nor has the additional idea about statistics been articulated in further work. In this article, I reconsider relational holism, amending it and spelling it out as appears necessary for a proper assessment, and application, of the position.
On Jaeggi’s reading, the immanent and progressive features of ideology critique are rooted in the connection between its explanatory and its normative tasks. I argue that this claim can be cashed out in terms of the mechanisms involved in a functional explanation of ideology and that stability plays a crucial role in this connection. On this reading, beliefs can be said to be ideological if (a) they have the function of supporting existing social practices, (b) they are the output of systematically distorted processes of belief formation, (c) the conditions under which distorting mechanisms are triggered can be traced back to structural causal factors shaped by the social practice their outputs are designed to support. Functional problems thus turn out to be interlocked with normative problems because ideology fails to provide principles to regulate cooperation that would be accepted under conditions of non-domination, hence failing to anchor a stable cooperative scheme. By explaining ideology as parasitic on domination, ideology critique points to the conditions under which cooperation stabilizes as those of a practice whose principles are accepted without coercion. Thus, it seems to entail a conception of justice whose principles are articulated as part of a theory of social cooperation.
This paper looks at quantum theory and the Standard Model of elementary particles with a view to suggesting a detailed empirical implementation of trope ontology in harmony with our best physics.
This paper offers a critical assessment of the current state of the debate about the identity and individuality of material objects. Its main aim, in particular, is to show that, in a sense to be carefully specified, the opposition between the Leibnizian ‘reductionist’ tradition, based on discernibility, and the sort of ‘primitivism’ that denies that facts of identity and individuality must be analysable has become outdated. In particular, it is argued that—contrary to a widespread consensus—‘naturalised’ metaphysics supports both the acceptability of non-qualitatively grounded (both ‘contextual’ and intrinsic) identity and a pluralistic approach to individuality and individuation. A case study is offered that focuses on non-relativistic quantum mechanics, in the context of which primitivism about identity and individuality, rather than being regarded as unscientific, is on the contrary suggested to be preferable to the complicated forms of reductionism that have recently been proposed. More generally, by assuming a plausible form of anti-reductionism about scientific theories and domains, it is claimed that science can be regarded as compatible with, or even as suggesting, the existence of a series of equally plausible grades of individuality. The kind of individuality that prevails in a certain context and at a given level can be ascertained only on the basis of the specific scientific theory at hand.
Steinberg has recently proposed an argument against Schaffer’s priority monism. The argument assumes the principle of Necessity of Monism, which states that if priority monism is true, then it is necessarily true. In this paper, I argue that Steinberg’s objection can be eluded by giving up Necessity of Monism for an alternative principle, which I call Essentiality of Fundamentality, and that such a principle is to be preferred to Necessity of Monism on other grounds as well.
This paper puts forward the hypothesis that the distinctive features of quantum statistics are exclusively determined by the nature of the properties it describes. In particular, all statistically relevant properties of identical quantum particles in many-particle systems are conjectured to be irreducible, ‘inherent’ properties only belonging to the whole system. This allows one to explain quantum statistics without endorsing the ‘Received View’ that particles are non-individuals, or postulating that quantum systems obey peculiar probability distributions, or assuming that there are primitive restrictions on the range of states accessible to such systems. With this, the need for an unambiguously metaphysical explanation of certain physical facts is acknowledged and satisfied.
An argument to the effect that, under a few reasonable assumptions, the bare particular ontology is best understood in terms of supersubstantivalism: objects are identical to regions of space(-time) and properties directly inhere in space(-time) points or regions as their bearers.
In this article, a critical assessment is carried out of the two available forms of nominalism with respect to the ontological constitution of material objects: resemblance nominalism and trope theory. It is argued that these two nominalistic ontologies naturally converge towards each other when the problems they have to face are identified and plausible solutions to these problems are sought. This suggests a synthesis between the two perspectives along lines first proposed by Sellars, whereby, at least at the level of the simplest, truly fundamental constituents of reality, every particular is literally both an object and a particularized property (or, alternatively put, the distinction between objects and properties dissolves). Some potential problems and open issues for such an approach to nominalism in ontology are identified and discussed, with particular emphasis on the sort of fundamentalism that seems to crucially underlie the proposed ontology.
The paper analyses six ethical challenges posed by cloud computing, concerning ownership, safety, fairness, responsibility, accountability and privacy. The first part defines cloud computing on the basis of a resource-oriented approach, and outlines the main features that characterise such technology. Following these clarifications, the second part argues that cloud computing reshapes some classic problems often debated in information and computer ethics. To begin with, cloud computing makes possible a complete decoupling of ownership, possession and use of data and this helps to explain the problems occurring when different providers of cloud computing retain or relinquish the right to use or own users’ data. The problem of safety in cloud computing is coupled to that of reliability, insofar as users have to trust providers to preserve their data, applications and content in a reliable manner. It is argued that, in this context, data insurance could play an important role. Regarding fairness, the paper argues that cloud computing is already reshaping the nature of the Digital. Responsibility, accountability and privacy close the ethical analysis of cloud computing. In this case, the thesis is that the necessity to account for the actions of cloud computing users imposes delicate trade-offs between users’ privacy and the traceability of their operations.
This paper provides a defence of the account of partial resemblances between properties according to which such resemblances are due to partial identities of constituent properties. It is argued, first of all, that the account is not only required by realists about universals à la Armstrong, but also useful (of course, in an appropriately re-formulated form) for those who prefer a nominalistic ontology for material objects. For this reason, the paper only briefly considers the problem of how to conceive of the structural universals first posited by Armstrong in order to explain partial resemblances, and focuses instead on criticisms that have been levelled against the theory (by Pautz, Eddon, Denkel and Gibb) and that apply regardless of one’s preferred ontological framework. The partial identity account is defended from these objections and, in doing so, a hitherto quite neglected connection—between the debate about partial similarity as partial identity and that concerning ontological finitism versus infinitism—is looked at in some detail.
Structural realism first emerged as an epistemological thesis aimed to avoid the so-called pessimistic meta-induction on the history of science. Some authors, however, have suggested that the preservation of structure across theory change is best explained by endorsing the metaphysical thesis that structure is all there is. Although the possibility of this latter, ‘ontic’ form of structural realism has been extensively debated, not much has been said concerning its justification. In this article, I distinguish between two arguments in favor of ontic structural realism that can be reconstructed from the literature and find both of them wanting.
In this volume, a range of high-profile researchers in philosophy of mind, philosophy of cognitive science, and empirical cognitive science critically engage with Clark's work across the themes of: Extended, Embodied, Embedded, Enactive, and Affective Minds; Natural Born Cyborgs; and Perception, Action, and Prediction. Daniel Dennett provides a foreword on the significance of Clark's work, and Clark replies to each section of the book, thus advancing current literature with original contributions that will form the basis for new discussions, debates and directions in the discipline.
Husserl’s Logical Grammar is intended to explain how complex expressions can be constructed out of simple ones so that their meaning turns out to be determined by the meanings of their constituent parts and the way they are put together. Meanings are thus understood as structured contents and classified into formal categories to the effect that the logical properties of expressions reflect their grammatical properties. As long as linguistic meaning reduces to the intentional content of pre-linguistic representations, however, it is not trivial to account for how semantics relates to syntax in this context. In this paper, I analyze Husserl’s Logical Grammar as a system of recursive rules operating on representations and suggest that the syntactic form of representations contributes to their semantics because it carries information about semantic role. I further discuss Husserl’s syntactic account of the unity of propositions and argue that, on this account, logical form supervenes on syntactic form. In the last section I draw some implications for the phenomenology of thought and conjecture that the structural features it displays are likely to convey the syntactic structures of an underlying language-like representational system.
Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs of pursuing this project are. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era is facing is not the lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems, which could display the robust flexibility characteristic of biological intelligence.
A critical discussion of Shoemaker's argument for the possibility of time without change, intended as an argument against relationist conceptions of time. A relational view of time is proposed based on the primitive identity of events (or whatever entities are the basic subjects of change and lack thereof).
In this paper, I will examine an evolutionary hypothesis about musical expressiveness first proposed by Peter Kivy. I will first present the hypothesis and explain why I take it to be different from ordinary evolutionary explanations of musical expressiveness. I will then argue that Kivy’s hypothesis is of crucial importance for most available resemblance-based accounts of musical expressiveness. For this reason, it is particularly important to assess its plausibility. After having reviewed the existing literature on the topic, I will list five challenges the hypothesis is supposed to meet. Although my list of challenges does not aim at exhaustiveness, I believe that the hypothesis must meet all of the challenges I suggest if it is to work as a cornerstone for a theory of musical expressiveness.
We put forward a new, ‘coherentist’ account of quantum entanglement, according to which entangled systems are characterized by symmetric relations of ontological dependence among the component particles. We compare this coherentist viewpoint with the two most popular alternatives currently on offer—structuralism and holism—and argue that it is essentially different from, and preferable to, both. In the course of this article, we point out how coherentism might be extended beyond the case of entanglement and further articulated.
In this paper, we evaluate some proposals that can be advanced to clarify the ontological consequences of Relational Quantum Mechanics. We first focus on priority monism and ontic structural realism and argue that these views are not suitable for providing an ontological interpretation of the theory. Then, we discuss an alternative interpretation that we regard as more promising, based on so-called ‘metaphysical coherentism’, which we also connect to the idea of an event-based, or ‘flash’, ontology.
In a recent paper, Sun Demirli (2010) proposes an allegedly new way of conceiving of individuation in the context of the bundle theory of object constitution. He suggests that allowing for distance relations to individuate objects solves the problems with worlds containing indiscernible objects that would otherwise affect the theory. The aim of the present paper is (i) to show that Demirli’s proposal falls short of achieving this goal and (ii) to carry out a more general critical assessment of the issue by appraising the costs and benefits of Demirli’s view as well as of existing alternatives.
This paper deals with Walter Jaeschke’s Hegels Philosophie. It begins with Hegel’s early writings, focusing on the relationship between logic and metaphysics. It goes on to explore central moments of Hegel’s philosophy: the relationship with Kant, the nature of categories, the philosophy of history, and the concept of the State. Jaeschke’s interpretation of Hegel’s thought is that of a paradigmatic thinker, whose fundamental philosophical breakthrough lies in the concept of Geist.
_Nicolai Hartmann and Alexius Meinong on Apriority and Causality. Notes on the Correspondence_ The article offers a critical reading of the nine letters composing the correspondence exchanged by Alexius Meinong (1853-1920) and Nicolai Hartmann (1882-1950) in 1915 and 1918-1920. The author explores the main contents of the correspondence through a chronological-thematic analysis. The letters of 1915 are eminently dedicated to a discussion of the gnoseology-ontology relationship. Here, the author focuses (1.1) on the relationship between reality and knowledge and (1.2) on that between a priori and cognitive principles. The analysis (2) of the 1918-1920 correspondence concludes the article, engaging with the theme of the law of causality. Though different in philosophical background and argumentative style, the late Meinong and the young Hartmann find a field of dialogue that appears not without consequences for the philosophical evolution of the latter.
The problem of animal consciousness has profound implications for our concept of nature and of our place in the natural world. In philosophy of mind and cognitive neuroscience the problem of animal consciousness raises two main questions (Velmans, 2007): the distribution question (“are there conscious animals beside humans?”) and the phenomenological question (“what is it like to be a non-human animal?”). In order to answer these questions, many approaches take into account similarities and dissimilarities in animal and human behavior, e.g. the use of language or tools and mirror self-recognition (Allen and Bekoff, 2007); however, behavioral arguments do not seem to be conclusive (Baars, 2005). Cognitive neuroscience is providing comparative data on structural and functional similarities, respectively called “homologies” and “analogies”. Many experimental results suggest that the thalamocortical system is essential for consciousness (Edelman and Tononi, 2000; Tononi, 2008). The argument from homology states that the general structure of the thalamocortical system remained the same in the last 100-200 million years, for it is neuroanatomically similar in all the present and past mammals and it did not change much during phylogeny (Allen and Bekoff, 2007). The argument from analogy states that the key functional processes correlated with consciousness in humans are still present in all other mammals and many other animals (Baars, 2005). These processes are information integration through effective cortical connectivity (Massimini et al., 2005; Rosanova et al., 2012) and elaboration of information at a global level (Dehaene and Changeux, 2011). On this basis, the Cambridge Declaration on Consciousness states that all mammals, birds, and many other animals (such as octopuses) possess the neurological substrates of consciousness (Low et al., 2012). Conscious experience is private (Chalmers, 1995; Nagel, 1974); therefore, the answer to the phenomenological question may be impossible. Nevertheless, cognitive neuroscience may provide an answer to the distribution question, showing that conscious experience is not limited to humans since it is a major biological adaptation going back millions of years.
This paper defends a relational view of time based on recent work on quantum gravity. Julian Barbour's relational approach to physical theory, in particular, is developed as a basis for a relational, rather than anti-realist, metaphysics of time.
'Ontology and Metaontology: A Contemporary Guide' is a clear and accessible survey of ontology, focussing on the most recent trends in the discipline. The first half of the book characterizes metaontology: the discourse on the methodology of ontological inquiry, covering the main concepts, tools, and methods of the discipline, and exploring the notions of being and existence, ontological commitment, paraphrase strategies, fictionalist strategies, and other metaontological questions. The second half considers a series of case studies, introducing and familiarizing the reader with concrete examples of the latest research in the field. The basic sub-fields of ontology are covered here via an accessible and captivating exposition: events, properties, universals, abstract objects, possible worlds, material beings, mereology, fictional objects. The guide's modular structure allows for a flexible approach to the subject, making it suitable for both undergraduates and postgraduates looking to better understand and apply the exciting developments and debates taking place in ontology today.
In this article, I argue that a capacity for mindreading conceived along the lines of simulation theory provides the cognitive basis for forming we-centric representations of actions and goals. This explains the plural first personal stance displayed by we-intentions in terms of the underlying cognitive processes performed by individual minds, while preserving the idea that they cannot be analyzed in terms of individual intentional states. The implication for social ontology is that this makes sense of the plural subjectivity of joint actions without requiring group agents to have either a corporate body or a unity of consciousness.
Saunders' recent arguments in favour of the weak discernibility of (certain) quantum particles seem to be grounded in the 'generalist' view that science only provides general descriptions of the world. In this paper, I introduce the 'generalist' perspective and consider its possible justification and philosophical basis, and then look at the notion of weak discernibility. I expand on the criticisms formulated by Hawley (2006) and Dieks and Versteegh (2008) and explain what I take to be the basic problem: that the properties invoked by Saunders cannot be pointed to as 'individuators' of otherwise indiscernible (and thus numerically identical) entities, because their ontological status remains underdetermined by the evidence and the established interpretation of the theory. In addition to this, I suggest that Saunders does not deal adequately with bosons, and cannot do so exactly because he subscribes to PII and the generalist picture. The last part of the paper contains a critical examination of the claim (or at least implicit assumption) that the generalist picture should be regarded as obviously compelling by the modern-day empiricist.
There is widespread recognition at universities that a proper understanding of science is needed for all undergraduates. Good jobs are increasingly found in fields related to Science, Technology, Engineering, and Medicine, and science now enters almost all aspects of our daily lives. For these reasons, scientific literacy and an understanding of scientific methodology are a foundational part of any undergraduate education. Recipes for Science provides an accessible introduction to the main concepts and methods of scientific reasoning. With the help of an array of contemporary and historical examples, definitions, visual aids, and exercises for active learning, the textbook helps to increase students’ scientific literacy. The first part of the book covers the definitive features of science: naturalism, experimentation, modeling, and the merits and shortcomings of both activities. The second part covers the main forms of inference in science: deductive, inductive, abductive, probabilistic, statistical, and causal. The book concludes with a discussion of explanation, theorizing and theory-change, and the relationship between science and society. The textbook is designed to be adaptable to a wide variety of different kinds of courses. In any of these different uses, the book helps students better navigate our scientific, 21st-century world, and it lays the foundation for more advanced undergraduate coursework in a wide variety of liberal arts and science courses. Selling points: the book helps students develop scientific literacy, an essential aspect of any undergraduate education in the 21st century, including a broad understanding of scientific reasoning, methods, and concepts; it is written for all beginning college students, preparing science majors for more focused work in a particular science, introducing the humanities’ investigations of science, and helping non-science majors become more sophisticated consumers of scientific information; it provides an abundance of both contemporary and historical examples; it covers reasoning strategies and norms applicable in all fields of the physical, life, and social sciences, as well as strategies and norms distinctive of specific sciences; it includes visual aids to clarify and illustrate ideas; it provides text boxes with related topics and helpful definitions of key terms, together with a final glossary of all key terms; it includes exercises for active learning at the end of each chapter, which ensure full student engagement and mastery of the information included earlier in the chapter; it provides annotated ‘For Further Reading’ sections at the end of each chapter, guiding students to the best primary and secondary sources available; and it offers a companion website with, for students, direct links to many of the primary sources discussed in the text, student self-check assessments, a bank of exam questions, and ideas for extended out-of-class projects, and, for instructors, a password-protected Teacher’s Manual providing student exam questions with answers, extensive lecture notes, classroom-ready PowerPoint presentations, and sample syllabi, plus extensive curricular development materials to help any instructor who needs to create a Scientific Reasoning course ex nihilo.
Antonio Labriola in the crisis of Marxism. The article deals with the relationship between Marxism and science in Antonio Labriola’s philosophy in the years 1898-1899. In the first part, the Author looks at the content of the Postscript to Discorrendo di socialismo e di filosofia and critically analyzes Labriola’s objections to some of the central theses defended by Benedetto Croce on the theory of value and the economics of Karl Marx. In the second part, two important writings by Labriola linked to the incipient debate on revisionism are examined: Polemiche sul socialismo and Sulla crisi del marxismo. In conclusion, the Author identifies in a specific conception of theory defended by Labriola the common trait between these two apparently distinct areas of inquiry.
This paper examines a recent proposal for reviving so-called resemblance nominalism. It is argued that, although consistent, it naturally leads to trope theory upon examination for reasons having to do with the appeal of neutrality as regards certain non-trivial ontological theses.
The topic of this article is the ontology of practical reasons. We draw a critical comparison between two views. According to the first, practical reasons are states of affairs; according to the second, they are propositions. We first isolate and spell out in detail certain objections to the second view that can be found only in embryonic form in the literature – in particular, in the work of Jonathan Dancy. Next, we sketch possible ways in which one might respond to each one of these objections. A careful evaluation of these complaints and responses, we argue, shows that the first view is not as obviously compelling as Dancy takes it to be. Indeed, it turns out that the view that practical reasons are propositions is by no means unworkable and in fact, at least under certain assumptions, explicit considerations can be made in favour of a propositional construal of reasons.
Endocrinologists apply the idea of feedback loops to explain how hormones regulate certain bodily functions such as glucose metabolism. In particular, feedback loops focus on the maintenance of the plasma concentrations of glucose within a narrow range. Here, we put forward a different, organicist perspective on the endocrine regulation of glycaemia, by relying on the pivotal concept of closure of constraints. From this perspective, biological systems are understood as organized ones, which means that they are constituted of a set of mutually dependent functional structures acting as constraints, whose maintenance depends on their reciprocal interactions. Closure refers specifically to the mutual dependence among functional constraints in an organism. We show that, when compared to feedback loops, organizational closure can generate much richer descriptions of the processes and constraints at play in the metabolism and regulation of glycaemia, by making explicit the different hierarchical orders involved. We expect that the proposed theoretical framework will open the way to the construction of original mathematical models, which would provide a better understanding of endocrine regulation from an organicist perspective.
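For contrast with the organizational account, here is a minimal sketch of the feedback-loop picture the paper starts from: a toy model in which glucose above a set point stimulates insulin secretion, and insulin in turn speeds up glucose clearance, pulling plasma glucose back toward a narrow range. The equations, parameter values, and names are illustrative assumptions, not a model drawn from the paper or from the endocrinological literature.

```python
def simulate_glucose_feedback(g0=8.0, i0=0.0, setpoint=5.0, production=1.0,
                              k_clearance=0.3, k_secretion=0.5, k_decay=0.2,
                              dt=0.01, steps=5000):
    """Toy negative-feedback loop: glucose above the set point stimulates
    insulin secretion, and insulin speeds up glucose clearance, so plasma
    glucose is pulled back toward a narrow range around the set point."""
    g, i = g0, i0
    for _ in range(steps):
        dg = production - k_clearance * i * g            # hepatic output minus insulin-dependent uptake
        di = k_secretion * (g - setpoint) - k_decay * i  # glucose-driven secretion minus degradation
        g += dg * dt
        i = max(i + di * dt, 0.0)                        # concentration cannot go negative
    return g, i

glucose, insulin = simulate_glucose_feedback()
print(f"near steady state: glucose ≈ {glucose:.2f}, insulin ≈ {insulin:.2f} (arbitrary units)")
```

On this picture the explanatory work is done by the loop alone; the organizational account described above would instead make explicit the hierarchy of constraints whose mutual maintenance such a loop presupposes.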
In this article, I aim to provide an account of the peculiar reasons that motivate our negative reaction whenever we see musical instruments being mistreated and destroyed. Stephen Davies has suggested that this happens because we seem to treat musical instruments as we treat human beings, at least in some relevant respects. I argue in favour of a different explanation, one that is based on the nature of music as an art form. The main idea behind my account is that musical instruments are not mere tools for the production of art; rather, they are involved in an essential way in artistic appreciation of music. This fact not only grounds our negative reaction to their mistreatment and destruction but also has a normative force that the account proposed by Davies lacks.
From 1930 onwards, György Lukács considers ‘uneven development’ the typical relational form between economic progress and the corresponding evolution of other fields of human activity. In the early thirties Lukács focuses on the problem of elaborating an independent Marxist aesthetics, but then necessarily finds himself having to deal with the general configuration of Marx’s alleged philosophy. The general theory illustrated in The Ontology of Social Being is where this philosophy, considered as a Weltanschauung, is given its final framework. His reflection on the ‘specificity’ of the aesthetic experience, as part of the broader framework of the main fields of art, science and everyday life, is the theoretical medium Lukács used in the fifties and sixties to fine-tune the need that had arisen decades earlier to attribute genuine philosophical universality to Marxism.
This paper examines Lukács’ interpretation of Lenin, with particular focus on his Lenin: A Study on the Unity of his Thought (1924), and the Postscript which was added to the book in January 1967. From 1924 onward, Lukács mainly focuses on the methodological basis of Lenin’s political thought, whose vital point lies in applying the category of totality in order to grasp the complexity of the socio-historical conjuncture. In addition, Lukács deals with some ethical aspects of Lenin’s personality, i.e. his human attitude. The ethical aspects are, however, the weakest point in Lukács’ interpretation. Diametrically opposed to any form of decisionism, Lenin embodies for Lukács both a politician and a political scientist.
The principle of Identity of Indiscernibles has been challenged with various thought experiments involving symmetric universes. In this paper, I describe a fractal universe and argue that, while it is not a symmetric universe in the classical sense, under the assumption of a relational theory of space it nonetheless contains a set of objects indiscernible by pure properties alone. I then argue that the argument against the principle arising from this new thought experiment withstands three main objections put forth against this kind of argument better than the arguments from classical symmetric universes do.
In this paper I discuss Stephen Davies’s defence of literalism about emotional descriptions of music. According to literalism, a piece of music literally possesses the expressive properties we attribute to it when we describe it as ‘sad’, ‘happy’, etc. Davies’s literalist strategy exploits the concept of polysemy: the meaning of emotion words in descriptions of expressive music is related to the meaning of those words when used in their primary psychological sense. The relation between the two meanings is identified by Davies in music’s presentation of emotion-characteristics-in-appearance. I will contend that there is a class of polysemous uses of emotion terms in descriptions of music that is not included in Davies’s characterization of the link between emotions in music and emotions as psychological states. I conclude by indicating the consequences of my claim for the phenomenology of expressive music.