I argue that (i) even though Adam Smith’s four-stages theory has been criticized, with good reason, as both vitiated by undue generalization from modern Europe to the first stage and made bottom-heavy by assumptions of modern episteme, an alternative view nevertheless emerges in his writings, in which the savage is not just crushed under the weight of want and isolation but is endowed with imagination and sympathy; (ii) his picture of the fourth stage is, far from a triumphal apology for capitalism, a tragic diagnosis of an inner tension between ambition and greed and their unintended beneficial effects; (iii) the tensions in this picture are not just a report of tensions out there, but also depend on Smith’s pre-comprehension of the phenomena he tries to account for; and (iv) yet the tragic character of this picture is to be credited to his integrity. I summarize the peculiarities of Smith’s outlook, post-empiricism, as well as its potentialities (sect. 2). I then reconstruct his view of the development of language and science, suggesting that his theory of the association of ideas and imagination provides a consistent account of both science and lore, yielding a comparatively less ethnocentric evaluation of the savage mind (sect. 3). I reconstruct his virtually twofold reconstruction of subsistence in the rude and early state, arguing that he tends to ascribe the inability to evolve to want and isolation and describes his own view of evolution as a necessary path, but also that in several passages imagination and sympathy play a role for the savage as well (sect. 4). I compare Smith’s view of the first stage with his diagnosis of commercial society, arguing that his reconstruction is burdened by eighteenth-century ideology as well as by modern episteme, and I conclude with an ambivalent appraisal of Smith’s comparison between the polished man and the savage.
Aims and Objectives. This article uses the concept of embodiment to demonstrate a conceptual approach to applied phenomenology.

Background. Traditionally, qualitative researchers and healthcare professionals have been taught phenomenological methods, such as the epoché, reduction, or bracketing. These methods are typically construed as a way of avoiding biases so that one may attend to the phenomena in an open and unprejudiced way. However, it has also been argued that qualitative researchers and healthcare professionals can benefit from phenomenology’s well-articulated theoretical framework, which consists of core concepts such as selfhood, empathy, temporality, spatiality, affectivity, and embodiment.

Design. This is a discursive article that demonstrates a conceptual approach to applied phenomenology.

Method. To outline and explain this approach to applied phenomenology, the Discussion section walks the reader through four stages of phenomenology, which progress incrementally from the most theoretical to the most practical.

Discussion. Part one introduces the philosophical concept of embodiment, which can be applied broadly to any human subject. Part two shows how philosophically trained phenomenologists use the concept of embodiment to describe general features of illness and disability. Part three illustrates how the phenomenological concept of embodiment can inform empirical qualitative studies and reflects on the challenges of integrating philosophy and qualitative research. Part four turns to phenomenology’s application in clinical practice and outlines a workshop model that guides clinicians through the process of using phenomenological concepts to better understand patient experience.

Conclusion and Relevance to Clinical Practice. A conceptual approach to applied phenomenology provides a valuable alternative to traditional methodological approaches. Phenomenological concepts provide a foundation for better understanding patient experience in both qualitative health research and clinical practice, and therefore provide resources for enhancing patient care.
In this paper, I will reread the history of molecular genetics from a psychoanalytical angle, analysing it as a case history. Building on the developmental theories of Freud and his followers, I will distinguish four stages, namely: (1) oedipal childhood, notably the epoch of model building (1943–1953); (2) the latency period, with a focus on the development of basic skills (1953–1989); (3) adolescence, exemplified by the Human Genome Project, with its fierce conflicts, great expectations and grandiose claims (1989–2003); and (4) adulthood (2003–present), during which revolutionary research areas such as molecular biology and genomics have achieved a certain level of normalcy and evolved into a normal science. I will indicate how a psychoanalytical assessment conducted in this manner may help us to interpret and address some of the key normative issues that have been raised with regard to molecular genetics over the years, such as ‘relevance’, ‘responsible innovation’ and ‘promise management’.
In this article, I respond to questions about, and criticisms of, my article “Toward an African Moral Theory” that have been put forth by Allen Wood, Mogobe Ramose, Douglas Farland and Jason van Niekerk. The major topics I address include: what bearing the objectivity of moral value should have on cross-cultural moral differences between Africans and Westerners; whether a harmonious relationship is a good candidate for having final moral value; whether consequentialism exhausts the proper way to respond to the value of a harmonious relationship; what makes a moral theory count as “African”; how the existing literature on African ethics relates to the aim of analytically developing and defending a single foundational moral principle; whether the intuitions I appeal to ground an African moral theory are pro tanto right-makers or general moral truths; whether the moral theory I defend can capture pro tanto rightness; and whether the best interpretation of African ethics is self-regarding (deeming the only basic moral reason for action to be that it would develop one's own valuable human nature) or other-regarding (holding that a certain kind of harmonious relationship between individuals could ground a basic moral reason for action).
The primary quantum mechanical equation of motion entails that measurements typically do not have determinate outcomes, but result in superpositions of all possible outcomes. Dynamical collapse theories (e.g. GRW) supplement this equation with a stochastic Gaussian collapse function, intended to collapse the superposition of outcomes into one outcome. But the Gaussian collapses are imperfect in a way that leaves the superpositions intact. This is the tails problem. There are several ways of making this problem more precise. But many authors dismiss the problem without considering the more severe formulations. Here I distinguish four distinct tails problems. The first (bare tails problem) and second (structured tails problem) exist in the literature. I argue that while the first is a pseudo-problem, the second has not been adequately addressed. The third (multiverse tails problem) reformulates the second to account for recently discovered dynamical consequences of collapse. Finally the fourth (tails problem dilemma) shows that solving the third by replacing the Gaussian with a non-Gaussian collapse function introduces new conflict with relativity theory.
"Procedural Justice" offers a theory of procedural fairness for civil dispute resolution. The core idea behind the theory is the procedural legitimacy thesis: participation rights are essential for the legitimacy of adjudicatory procedures. The theory yields two principles of procedural justice: the accuracy principle and the participation principle. The two principles require a system of procedure to aim at accuracy and to afford reasonable rights of participation qualified by a practicability constraint. The Article begins in Part I, Introduction, with two observations. First, the function of procedure is to particularize general substantive norms so that they can guide action. Second, the hard problem of procedural justice corresponds to the following question: How can we regard ourselves as obligated by legitimate authority to comply with a judgment that we believe (or even know) to be in error with respect to the substantive merits? The theory of procedural justice is developed in several stages, beginning with some preliminary questions and problems. The first question - what is procedure? - is the most difficult and requires an extensive answer: Part II, Substance and Procedure, defines the subject of the inquiry by offering a new theory of the distinction between substance and procedure that acknowledges the entanglement of the action-guiding roles of substantive and procedural rules while preserving the distinction between two ideal types of rules. The key to the development of this account of the nature of procedure is a thought experiment, in which we imagine a world with the maximum possible acoustic separation between substance and procedure. Part III, The Foundations of Procedural Justice, lays out the premises of general jurisprudence that ground the theory and answers a series of objections to the notion that the search for a theory of procedural justice is a worthwhile enterprise.
Parts II and III set the stage for the more difficult work of constructing a theory of procedural legitimacy. Part IV, Views of Procedural Justice, investigates the theories of procedural fairness found explicitly or implicitly in case law and commentary. After a preliminary inquiry that distinguishes procedural justice from other forms of justice, Part IV focuses on three models or theories. The first, the accuracy model, assumes that the aim of civil dispute resolution is correct application of the law to the facts. The second, the balancing model, assumes that the aim of civil procedure is to strike a fair balance between the costs and benefits of adjudication. The third, the participation model, assumes that the very idea of a correct outcome must be understood as a function of process that guarantees fair and equal participation. Part IV demonstrates that none of these models provides the basis for a fully adequate theory of procedural justice. In Part V, The Value of Participation, the lessons learned from analysis and critique of the three models are then applied to the question whether a right of participation can be justified for reasons that are not reducible to either its effect on the accuracy or its effect on the cost of adjudication. The most important result of Part V is the Participatory Legitimacy Thesis: it is (usually) a condition for the fairness of a procedure that those who are to be finally bound shall have a reasonable opportunity to participate in the proceedings. The central normative thrust of Procedural Justice is developed in Part VI, Principles of Procedural Justice. The first principle, the Participation Principle, stipulates a minimum (and minimal) right of participation, in the form of notice and an opportunity to be heard, that must be satisfied (if feasible) in order for a procedure to be considered fair.
The second principle, the Accuracy Principle, specifies the achievement of legally correct outcomes as the criterion for measuring procedural fairness, subject to four provisos, each of which sets out circumstances under which a departure from the goal of accuracy is justified by procedural fairness itself. In Part VII, The Problem of Aggregation, the Participation Principle and the Accuracy Principle are applied to the central problem of contemporary civil procedure - the aggregation of claims in mass litigation. Part VIII offers some concluding observations about the point and significance of Procedural Justice.
This article defends the Doomsday Argument, the Halfer Position in Sleeping Beauty, the Fine-Tuning Argument, and the applicability of Bayesian confirmation theory to the Everett interpretation of quantum mechanics. It will argue that all four problems have the same structure, and it gives a unified treatment that uses simple models of the cases and no controversial assumptions about confirmation or self-locating evidence. The article will argue that the troublesome feature of all these cases is not self-location but selection effects.
In this paper, we describe four broad ‘meta-methods’ employed in scientific and philosophical research of qualia. These are the theory-centred meta-method, the property-centred meta-method, the argument-centred meta-method, and the event-centred meta-method. Broadly speaking, the theory-centred meta-method is interested in the role of qualia as some theoretical entities picked out by our folk psychological theories; the property-centred meta-method is interested in some metaphysical properties of qualia that we immediately observe through introspection; the argument-centred meta-method is interested in the role of qualia in some arguments for non-physicalism; the event-centred meta-method is interested in the role of qualia as some natural events whose nature is hidden and must be uncovered empirically. We will argue that the event-centred meta-method is the most promising route to a comprehensive scientific conception of qualia because of the flexibility of ontological and methodological assumptions it can provide. We also reveal the hidden influences of the different meta-methods and in doing so show why consideration of meta-methods has value for the study of consciousness.
Over the past fifteen years there has been a considerable amount of debate concerning what theoretical population dynamic models tell us about the nature of natural selection and drift. On the causal interpretation, these models describe the causes of population change. On the statistical interpretation, the models of population dynamics specify statistical parameters that explain, predict, and quantify changes in population structure, without identifying the causes of those changes. Selection and drift are part of a statistical description of population change; they are not discrete, apportionable causes. Our objective here is to provide a definitive statement of the statistical position, so as to allay some confusions in the current literature. We outline four commitments that are central to statisticalism. They are: 1. Natural selection is a higher-order effect; 2. Trait fitness is primitive; 3. Modern Synthesis (MS) models are substrate neutral; 4. MS-selection and drift are model-relative.
Relativity theory is often said to support something called ‘the four-dimensional view of reality’. But there are at least three different views that sometimes go by this name. One is ‘spacetime unitism’, according to which there is a spacetime manifold, and if there are such things as points of space or instants of time, these are just spacetime regions of different sorts: thus space and time are not separate manifolds. A second is the B-theory of time, according to which the past, present, and future are all equally real and there is nothing metaphysically special about the present. A third is perdurantism, according to which persisting material objects are made up of different temporal parts located at different times. We sketch routes from relativity to unitism and to the B-theory. We then discuss some routes to perdurantism, via the B-theory and via unitism.
Four-dimensionalism and eternalism are theories of time, change, and persistence. Christian philosophers and theologians have adopted four-dimensional eternalism for various reasons. In this paper I shall argue that four-dimensional eternalism conflicts with Christian thought. Section I will lay out two varieties of four-dimensionalism—perdurantism and stage theory—along with the typically associated ontologies of time of eternalism and growing block. I shall contrast this with presentism and endurantism. Section II will look at some of the purported theological benefits of adopting four-dimensionalism and eternalism. Section III will examine arguments against four-dimensional eternalism from the problem of evil. Section IV will argue that four-dimensional eternalism causes problems for Christian eschatology.
Perdurantists think of continuants as mereological sums of stages from different times. This view of persistence would force us to drop the idea that there is genuine change in the world. By exploiting a presentist metaphysics, Brogaard proposed a theory, called presentist four-dimensionalism, that aims to reconcile perdurantism with the idea that things undergo real change. However, her proposal commits us to reject the idea that stages must exist in their entirety. Giving up the tenet that all the stages are equally real could be a price that perdurantists are unwilling to pay. I argue that Kit Fine’s fragmentalism provides us with the tools to combine a presentist metaphysics with a perdurantist theory of persistence without giving up the idea that reality is constituted by more than purely present stages.
This thesis is about the conceptualization of persistence of physical, middle-sized objects within the theoretical framework of the revisionary ‘B-theory’ of time. According to the B-theory, time does not flow, but is an extended and inherently directed fourth dimension along which the history of the universe is ‘laid out’ once and for all. It is a widespread view among philosophers that if we accept the B-theory, the commonsensical ‘endurance theory’ of persistence will have to be rejected. The endurance theory says that objects persist through time by being wholly present at distinct times as numerically the same entity. Instead of endurantism, it has been argued, we have to adopt either ‘perdurantism’ or the ‘stage theory’. Perdurantism is the theory that objects are four-dimensional ‘space-time worms’ persisting through time by having distinct temporal parts at distinct times. The stage theory says that objects are instantaneous temporal parts (stages) of space-time worms, persisting by having distinct temporal counterparts at distinct times. In the thesis, it is argued that no good arguments have been provided for the conclusion that we are obliged to drop the endurance theory by acceptance of the B-theory. This conclusion stands even if the endurance theory incorporates the claim that objects endure through intrinsic change. It is also shown that perdurantism and the stage theory come with unwelcome consequences.

Paper I demonstrates that the main arguments for the view that objects cannot endure in B-time intrinsically unchanged fail. Papers II and III do the same with respect to the traditional arguments against endurance through intrinsic change in B-time. Paper III also contains a detailed account of the semantics of the tenseless copula, which occurs frequently in the debate. The contention of Paper IV is that four-dimensional space-time worms, as traditionally understood, are not suited to take dispositional predicates. In Paper V, it is shown that the stage theory needs to introduce an overabundance of persistence-concepts, many of which will have to be simultaneously applicable to a single object (qua falling under a single sortal), in order for the theory to be consistent. The final article, Paper VI, investigates the sense in which persistence can, as is sometimes suggested, be a ‘conventional matter’. It also asks whether alleged cases of ‘conventional persistence’ create trouble for the endurance theory. It is argued that conventions can only enter at a trivial semantic level, and that the endurance theory is no more threatened by such conventions than are its rivals.
This paper presents the strongest version of a non-perdurantist four-dimensionalism: a theory according to which persisting objects are four-dimensionally extended in space-time, but not in virtue of having maximal temporal parts. The aims of considering such a view are twofold. First, to evaluate whether such an account could provide a plausible middle ground between the two main competitor accounts of persistence: three-dimensionalism and perdurantist four-dimensionalism. Second, to see what light such a theory sheds on the debate between these two competitor theories. I conclude that despite prima facie reasons to suppose that non-perdurantist four-dimensionalism might be a credible alternative to either other account of persistence, ultimately the view is unsuccessful. The reasons for its failure illuminate the sometimes stagnant debate between three-dimensionalists and perdurantists, providing new reasons to prefer a perdurantist metaphysics.
The aim of this paper is to propose a systematic classification of emotions which can also characterize their nature. The first challenge we address is the submission of clear criteria for a theory of emotions that determine which mental phenomena are emotions and which are not. We suggest that emotions as a subclass of mental states are determined by their functional roles. The second and main challenge is the presentation of a classification and theory of emotions that can account for all existing varieties. We argue that we must classify emotions according to four developmental stages: 1. pre-emotions as unfocussed expressive emotion states, 2. basic emotions, 3. primary cognitive emotions, and 4. secondary cognitive emotions. We suggest four types of basic emotions (fear, anger, joy and sadness) which are systematically differentiated into a diversity of more complex emotions during emotional development. The classification distinguishes between basic and non-basic emotions and our multi-factorial account considers cognitive, experiential, physiological and behavioral parameters as relevant for constituting an emotion. However, each emotion type is constituted by a typical pattern according to which some features may be more significant than others. Emotions differ strongly where these patterns of features are concerned, while their essential functional roles are the same. We argue that emotions form a unified ontological category that is coherent and can be well defined by their characteristic functional roles. Our account of emotions is supported by data from developmental psychology, neurobiology, evolutionary biology and sociology.
Scientists are constantly making observations, carrying out experiments, and analyzing empirical data. Meanwhile, scientific theories are routinely being adopted, revised, discarded, and replaced. But when are such changes to the content of science improvements on what came before? This is the question of scientific progress. One answer is that progress occurs when scientific theories ‘get closer to the truth’, i.e. increase their degree of truthlikeness. A second answer is that progress consists in increasing theories’ effectiveness for solving scientific problems. A third answer is that progress occurs when the stock of scientific knowledge accumulates. A fourth and final answer is that scientific progress consists in increasing scientific understanding, i.e. the capacity to correctly explain and reliably predict relevant phenomena. This paper compares and contrasts these four accounts of scientific progress, considers some of the most prominent arguments for and against each account, and briefly explores connections to different forms of scientific realism.
In humans, knowing the world occurs through spatial-temporal experiences and interpretations. Conscious experience is the direct observation of conscious events. It makes up the content of consciousness. Conscious experience is organized in four dimensions. It is an orientation in space and time, an understanding of the position of the observer in space and time. A neural correlate for four-dimensional conscious experience has been found in the human brain which is modeled by Einstein’s Special Theory of Relativity. Spacetime intervals are fundamentally involved in the organization of coherent conscious experiences. They account for why conscious experience appears to us the way it does. They also account for assessment of causality and past-future relationships, the integration of higher cognitive functions, and the implementation of goal-directed behaviors. Spacetime intervals in effect compose and direct our conscious life. The relativistic concept closes the explanatory gap and solves the hard problem of consciousness (how something subjective like conscious experience can arise in something physical like the brain). There is a place in physics for consciousness. We describe all physical phenomena through conscious experience, whether they be described at the quantum level or classical level. Since spacetime intervals direct the formation of all conscious experiences and all physical phenomena are described through conscious experience, the equation formulating spacetime intervals contains the information from which all observable phenomena may be deduced. It might therefore be considered an expression of a theory of everything.
Recently, Jim Stone has argued that counterpart theory is incompatible with the existence of temporal parts. I demonstrate that there is no such incompatibility.
After rejecting substance dualism, some naturalists embrace patternism. It states that persons are bodies and that bodies are material machines running abstract person programs. Following Aristotle, these person programs are souls. Patternists adopt four-dimensionalist theories of persistence: Bodies are 3D stages of 4D lives. Patternism permits at least six types of life after death. It permits quantum immortality, teleportation, salvation through advanced technology, promotion out of a simulated reality, computational monadology, and the revision theory of resurrection.
The four case studies on chance in evolution provide a rich source for further philosophical analysis. Among the issues raised are the following: Are there different conceptions of chance at work, or is there a common underlying conception? How can a given concept of chance be distinguished from other chance concepts and from nonchance concepts? How can the occurrence of a given chance process be distinguished empirically from nonchance processes or other chance processes? What role does chance play in evolutionary theory? I argue that in order to answer these questions, a careful distinction between process and outcome must be made; however, the purpose of this essay is not to answer these questions definitively, but rather to elaborate on them and to provide a starting point for further discussion.
The paper re-expresses arguments against the normative validity of expected utility theory in Robin Pope (1983, 1991a, 1991b, 1985, 1995, 2000, 2001, 2005, 2006, 2007). These concern the neglect of the evolving stages of knowledge ahead (stages of what the future will bring). Such evolution is fundamental to an experience of risk, yet not consistently incorporated even in axiomatised temporal versions of expected utility. Its neglect entails a disregard of emotional and financial effects on well-being before a particular risk is resolved. These arguments are complemented with an analysis of the essential uniqueness property in the context of temporal and atemporal expected utility theory and a proof of the absence of a limit property natural in an axiomatised approach to temporal expected utility theory. Problems of the time structure of risk are investigated in a simple temporal framework restricted to a subclass of temporal lotteries in the sense of David Kreps and Evan Porteus (1978). This subclass is narrow but wide enough to discuss basic issues. It will be shown that there are serious objections against the modification of expected utility theory axiomatised by Kreps and Porteus (1978, 1979). By contrast the umbrella theory proffered by Pope that she has now termed SKAT, the Stages of Knowledge Ahead Theory, offers an epistemically consistent framework within which to construct particular models to deal with particular decision situations. A model by Caplin and Leahy (2001) will also be discussed and contrasted with the modelling within SKAT (Pope, Leopold and Leitner 2007).
I first explain how Bergmann reads Meinong. As regards his method, Bergmann’s stated aim is to examine Meinong’s thought through all the stages of its development; but he is very selective in choosing exactly what to consider, not just within each of Meinong’s texts, but equally among his texts – indeed he completely ignores Meinong’s mature works. Moreover, he often alters Meinong’s thought by translating it into his foil ontology. As regards the content, Bergmann interprets Meinong as a reist and a nominalist. I try to show that such a view is not correct. I then discuss this interpretation by focusing on which Meinong Bergmann reads, that is, which writings he refers to and at the same time which of Meinong’s theories he criticizes. I sketch the four phases of the development of Meinong’s thought distinguished by Bergmann: his first theory of relations, the theory of the objects of higher order, of objectives, and finally object theory. I present Bergmann’s critique and compare his distinction of different degrees of independence, which establish differences of status among categories of existents, with Meinong’s distinction between kinds of being. Finally, taking into account also Meinong’s mature work, I offer an assessment of Bergmann’s proposal to rethink object theory. Considering Meinong’s theory of incomplete objects, I show that Bergmann would have found in Meinong an ally not only in the battle against representationalism, as he maintains, but also in that against nominalism.
Infectious logics are systems in which a truth-value is assigned to a compound formula whenever it is assigned to one of its components. This paper studies four-valued infectious logics as the basis of transparent theories of truth. This take is motivated as a way to treat different pathological sentences differently, namely, by allowing some of them to be truth-value gluts and some others to be truth-value gaps, and as a way to treat the semantic pathology suffered by at least some of these sentences as infectious. This leads us to consider four distinct four-valued logics: one where truth-value gaps are infectious, but gluts are not; one where truth-value gluts are infectious, but gaps are not; and two logics where both gluts and gaps are infectious, in some sense. Additionally, we focus on the proof theory of these systems, by offering a discussion of two related topics. On the one hand, we prove some limitations regarding the possibility of providing standard Gentzen sequent calculi for these systems, by dualizing and extending some recent results for infectious logics. On the other hand, we provide sound and complete four-sided sequent calculi, arguing that the most important technical and philosophical features taken into account to usually prefer standard calculi are, indeed, enjoyed by the four-sided systems.
● Sergio Cremaschi, The non-existing Island. I discuss the way in which the cleavage between the Continental and the Anglo-American philosophies originated, the (self-)images of both philosophical worlds, the converging rediscoveries from the Seventies, as well as recent ecumenic or anti-ecumenic strategies. I argue that pragmatism provides an important counter-instance both to the familiar self-images and to the fashionable ecumenic or anti-ecumenic strategies. My conclusions are: (i) the only place where Continental philosophy exists (as Euro-Communism a decade ago) is America; (ii) less obviously, analytic philosophy does not exist either, or no longer exists as a current or a paradigm; what does exist is, on the one hand, philosophy of language and, on the other, philosophy of mind, that is, two disciplines; (iii) the dissolution of analytic philosophy as a school has been extremely fruitful, precisely in so far as it has left room for disciplines and research programmes; (iv) what is left of the Anglo-American/Continental cleavage is primarily differences in styles, depending partly on intellectual traditions, partly on sociology, history, and institutional frameworks; these differences should not be blurred by rash ecumenism; besides, theoretical differences are as alive as ever, but within both camps; finally, there is indeed a lag (not a difference) in the appropriation of intellectual techniques by most schools of 'Continental' philosophy, and this should be overcome through appropriation of what the best 'analytic' philosophers have produced.

● Michael Strauss, Language and sense-perception: an aspect of analytic philosophy. To test an assertion about one fact by comparing it with perceived reality seems quite unproblematic. But the very possibility of such a procedure is incompatible with the intellectualistic basis of logical positivism and atomism (as it is, for example, to be found in Russell's Analysis of Mind).
According to the intellectualistic approach, pure sensation is meaningless. Sensation receives its meaning and order from the intellect through interpretation, which is performed with the help of linguistic tools, i.e. words and sentences. Before being interpreted, sensation is not a picture or a representation; it is neither true nor false, neither an illusion nor knowledge; it does not tell us anything; it is a lifeless and orderless matter. But how can a thought (or a proposition) be compared with such a lifeless matter? This difficulty confronts the intellectualist if, on the one hand, he admits the necessity of comparing thought with sense-perception, and, on the other hand, presupposes that we possess only intellectual and no immediate perceptual understanding of what we see and hear. In this paper I give a critical exposition of three attempts, made by Russell, Neurath and Wittgenstein, to solve this problem. The first attempt adheres to strict conventionalism, the second tends to naturalism, and the third leads to an amended, very moderate version of conventionalism. This amended conventionalism regards sense impressions as a peculiar language, which includes primary symbols, i.e. symbols not founded on convention and not in need of interpretation.

● Ernst Tugendhat, Phenomenology and language analysis. The paper, first published in German in 1970, by which Tugendhat gave a start to the German rediscovery of analytic philosophy. The author stages a confrontation between phenomenology and language analysis. He argues that language analysis does not differ from phenomenology as far as the topics dealt with are concerned; instead, the two currents differ greatly in method. The author argues that language-analytic philosophy does not simply lie outside the mainstream of transcendental philosophy; rather, it challenges this tradition at the very level of foundations.
The author criticizes the linguistic-analytic approach centred on the subject as well as any object-centred approach, while proposing inter-subjective understanding through language as the new universal framework. This, when construed in such general terms, is the same program as that of hermeneutics, though in a more basic version.

● Jürgen Habermas, Language game, intention and meaning. On a few suggestions by Sellars and Wittgenstein. The paper, first published in German in 1975, in which Habermas announces his own linguistic turn through a discovery of speech acts. In this essay the author wants to work out a categorical framework for a communicative theory of society; he takes Wittgenstein's concept of language game as a Leitfaden (guiding thread) and, besides, takes advantage of Wilfrid Sellars's quasi-transcendental account of the genesis of intentionality. His goal is to single out the problems connected with a theory of consciousness oriented in a logical-linguistic sense.

● Zvie Bar-On, Isomorphism of speech acts and intentional states. This essay presents the problem of the formal relationship between speech acts and intentional states as an essential part of the perennial philosophical question of the relation between language and thought. I attempt to show how this problem was dealt with by two prominent philosophers of different camps in our century, Edmund Husserl and John Searle. Both of them wrote extensively about the theory of intentionality. I point out an interesting, as it were unintended, continuity in their work on that theory: Searle started where Husserl had left off 80 years earlier. Their meeting point can be used as the first clue in our search. They both adopted, in effect, the same distinction between two basic aspects of the intentional experience: its content or matter, and its quality or mode. Husserl did not yet have the concept of a speech act as contradistinguished from an intentional state.
The working hypothesis he suggested, however, can be used as a second clue for the further elaboration of the theory. The relationship between the two levels, the mental and the linguistic, which for Husserl remained in the background only, became the cornerstone of Searle's inquiry. He employed the speech act as the model and analysed the intentional experience by means of the conceptual apparatus of his own theory of speech acts. This procedure enabled him to mark out a number of parallelisms and correlations between the two levels, and it explains the phenomenon of the partial isomorphism of speech acts and intentional states.

● Roberta de Monticelli, Ontology. A dialogue among the linguistic philosopher, the naturalist, and the phenomenological philosopher. This paper proposes a comparison between two main ways of conceiving the role and scope of that fundamental part of philosophy (or of "first" philosophy) which is traditionally called "ontology". One way, originated within the analytic tradition, consists of two main streams, namely philosophy of language and (contemporary) philosophy of mind, the former yielding "reduced ontology" and the latter "neo-Aristotelian ontology". The other way of conceiving ontology is exemplified by "phenomenological ontology" (more precisely, the Husserlian, not the Heideggerian version). Ontology as a theory of reference ("reduced" ontology, or ontology as depending on semantics) is presented and justified on the basis of some classical theses of traditional philosophy of language (from Frege to Quine). "Reduced ontology" is shown to be identifiable with one level of a traditional, Aristotelian ontology, namely the one which corresponds to one of the four "senses of being" listed in Aristotle's Metaphysics: "being" as "being true".
This identification is justified on the basis of Franz Brentano's "rules for translation" of the Aristotelian table of judgements into (positive and negative) existential judgments, such as are easily translatable into sentences of first-order predicate logic. The second part of the paper is concerned with "neo-Aristotelian ontology", i.e. with naturalism and physicalism as the main ontological options underlying most of the contemporary discussion in the philosophy of mind. The qualification of such options as "neo-Aristotelian" is justified, and the relationships between "neo-Aristotelian ontology" and "reduced ontology" are discussed. In the third part, the fundamental tenet of "phenomenological ontology" is identified with the thesis that a logical theory of existence and being captures a sense of "existing" and "being" which, even though not the basic one, is grounded in the basic one. An attempt is made to further clarify this "more basic" sense of "being", and an argument making use of this supposedly "more basic" sense is advanced in favour of a "phenomenological ontology".

● Kuno Lorenz, Analytic Roots in Dialogic Constructivism. Both in the Vienna Circle and in Russell's early philosophy the division of knowledge into two kinds (or two levels), perceptual and conceptual, plays a vital role. Constructivism in philosophy, in trying to provide a pragmatic foundation – a knowing-how – to perceptual as well as conceptual competences, discovered that this is dependent on semiotic tools. Therefore, the "principle of method" had to be amended by the "principle of dialogue". Analytic philosophy, being an heir of classical empiricism, conceptually grasping the "given", and constructive philosophy, being an heir of classical rationalism, perceptually providing the "constructed", merge into dialogical constructivism, a contemporary development of ideas derived especially from the works of Charles S.
Peirce (his pragmatic maxim as a means of giving meaning to signs) and of Ludwig Wittgenstein (his language games as tools of comparison for understanding ways of life).

● Albrecht Wellmer, "Autonomy of meaning" and "principle of charity" from the viewpoint of the pragmatics of language. In this essay I present an interpretation of the principle of the autonomy of meaning and of the principle of charity, the two main principles of Davidson's semantic view of truth, showing how both principles may fit into a perspective dictated by the pragmatics of language. I argue that (i) the principle of the autonomy of meaning may be thoroughly reformulated in terms of the pragmatics of language, and (ii) the principle of charity needs a supplement in terms of the pragmatics of language in order to become really enlightening as a principle of interpretation. Besides, I argue that, on the one hand, Habermas's fundamental thesis on the pragmatic theory of meaning ("we understand a speech act when we know what makes it admissible") is correlated with the seemingly intentionalist thesis according to which we understand a speech act when we know what a speaker means; and that, on the other hand, the meaning competence of a competent speaker is basically a competence concerning a potential of reasons (or possible justifications) inherently connected with the meaning of statements, or with their use in utterances.

● Rüdiger Bubner, The convergence of analytic and hermeneutic philosophy. This paper argues that analytic philosophy does not exist, at least as understood by its original programs. Differences within the analytic camp have always been bigger than they were believed to be, and these differences are now coming to the fore thanks to a process of dissolution of dogmatism. Philosophical analysis is led by its own inner logic towards questions that may fairly be qualified as hermeneutic. Recent developments in analytic philosophy, e.g.
Davidson, seem to indicate a growing convergence of themes between philosophical analysis and hermeneutics; thus, the familiar opposition of Anglo-Saxon and Continental philosophy might soon belong to history. The ongoing appropriation of analytical techniques by present-day German philosophers may provide the basis for a powerful argument for the unity of philosophizing, beyond strained images that privilege one technique of thinking and reject the remainder. Actual philosophical practice should take the dialogue between the two camps more seriously; in fact, the processes described so far are no danger to philosophical work. They may be a danger for parochial approaches to philosophizing; indeed, contrary to what happens in the natural sciences, Thomas Kuhn's "normal science" developing within the framework of one fixed paradigm is not typical of philosophical thinking, and in philosophy innovating revolutions are symptoms of vitality rather than of crisis.

● Karl-Otto Apel, The impact of analytic philosophy on my intellectual biography. In my paper I try to reconstruct the history of my critical engagement (Auseinandersetzung) with what I called "language-analytical" philosophy (including even Peircean semiotics) since the late Fifties. The heuristics of my study was predetermined by two main motives of my beginnings: the hermeneutic turn of phenomenology and the transformation of "transcendental philosophy" in the light of the "language a priori". Thus, I took issue with the early and the later Wittgenstein, logical positivism, and post-Wittgensteinian and post-empiricist philosophy of science (i.e. G.H. von Wright and the renewal of the "explanation vs understanding" controversy, as well as the debate between Th. Kuhn and Popper/Lakatos); besides, with speech act theory and the debate about "transcendental arguments" since Strawson. The "pragmatic turn", started already by C.L.
Morris and the later Carnap, led me to study also the relationship between the Wittgensteinian "use" theories of meaning and of truth. This resulted, on my side, in something like a program of "transcendental semiotics", i.e. "transcendental pragmatics" and "transcendental hermeneutics".

● Ben-Ami Scharfstein, A doubt on both their houses: the blindness to non-Western philosophies. The burden of my criticism is that contemporary European philosophers of all kinds have continued to think as if there were no true philosophy but that of the West. For the most part, the existentialists have been oblivious of their Eastern congeners; the hermeneuticians have yet to stretch their horizons beyond the most familiar ones; and the analysts remain unaware of the analyses and linguistic sensitivities of the ancient non-European philosophers. Briefly, ignorance still blinds almost all contemporary Western philosophers to the rich, variegated philosophical traditions outside of their familiar orbit. Both Continental and Anglo-American philosophers have lost the breadth of view that once characterized such thinkers as Herder and the Humboldts. The resulting blindness is not simply that of individual Western philosophers but of our whole, still parochial philosophical culture.
A platitude that took hold with Kuhn is that there can be several equally good ways of balancing theoretical virtues for theory choice. Okasha recently modelled theory choice using technical apparatus from the domain of social choice: famously, Arrow showed that no method of social choice can jointly satisfy four desiderata, and each of the desiderata in social choice has an analogue in theory choice. Okasha suggested that one can avoid the Arrow analogue for theory choice by employing a strategy used by Sen in social choice, namely, to enhance the information made available to the choice algorithms. I argue here that, despite Okasha's claims to the contrary, the information-enhancing strategy is not compelling in the domain of theory choice.
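The Arrovian pathology behind this debate can be made concrete with a toy example (my own illustration, not Okasha's): let three theoretical virtues each rank three hypothetical theories, and aggregate by pairwise majority. The result is a Condorcet cycle, so no theory emerges as best.

```python
# Toy illustration (not from Okasha's paper): three "virtues" each
# rank three invented theories T1-T3; aggregating by pairwise
# majority yields a Condorcet cycle, the social-choice pathology
# behind Arrow's theorem.

rankings = {                      # each virtue's ranking, best first
    'simplicity': ['T1', 'T2', 'T3'],
    'accuracy':   ['T2', 'T3', 'T1'],
    'scope':      ['T3', 'T1', 'T2'],
}

def majority_prefers(a, b):
    """True if a majority of virtues rank theory a above theory b."""
    wins = sum(r.index(a) < r.index(b) for r in rankings.values())
    return wins > len(rankings) / 2

# T1 beats T2, T2 beats T3, yet T3 beats T1: a cycle.
for a, b in [('T1', 'T2'), ('T2', 'T3'), ('T3', 'T1')]:
    print(a, 'beats', b, ':', majority_prefers(a, b))
```

Sen's information-enhancing strategy, which Okasha borrows and the paper criticizes, amounts to giving the aggregation rule more than these bare ordinal rankings.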
Conscious experience is the direct observation of conscious events. Human conscious experience is four-dimensional. Conscious events are linked (associated) by spacetime intervals to produce a coherent conscious experience. This explains why conscious experience appears to us the way it does. Conscious experience is an orientation in space and time, an understanding of the position of the observer in space and time. Causality, past-future relations, learning, memory, cognitive processing, and goal-directed actions all evolve from four-dimensional conscious experience. A neural correlate for four-dimensional conscious experience can be found in the human brain and is modelled by Einstein's special theory of relativity. The relativistic concept of the spacetime interval is central for understanding conscious experience and cognition.
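The spacetime interval the abstract appeals to is the standard special-relativistic invariant, s² = (cΔt)² − Δx². A minimal numerical check of its frame-invariance (my own illustration with c = 1 and arbitrary numbers, independent of the abstract's claims about consciousness):

```python
import math

# The spacetime interval s^2 = (c*dt)^2 - dx^2 is invariant under
# Lorentz boosts. Minimal numerical check with c = 1; the separation
# and boost speed are arbitrary illustrative values.

def interval_sq(dt, dx, c=1.0):
    return (c * dt) ** 2 - dx ** 2

def boost(dt, dx, v, c=1.0):
    """Lorentz-boost the separation (dt, dx) into a frame moving at speed v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (dt - v * dx / c ** 2), gamma * (dx - v * dt)

dt, dx = 5.0, 3.0                      # a timelike separation
dt2, dx2 = boost(dt, dx, v=0.6)
assert abs(interval_sq(dt, dx) - interval_sq(dt2, dx2)) < 1e-9
```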
Casati and Varzi have developed a theory of boundaries based on extensional mereotopology and the distinction between fiat and bona fide boundaries. First, I point out some problems in their theory related to the contact of bodies. Next, I propose a way of classifying boundaries into four kinds, based on substance ontology and an alternative distinction between potential and actual boundaries. Finally, I show that my classification makes it possible to solve the problems above.
This paper begins the last instalment of a six-part project correlating the key aspects of Kant’s architectonic conception of philosophy with a special version of the Chinese Book of Changes that I call the “Compound Yijing”, which arranges the 64 hexagrams (gua) into both fourfold and threefold sets. I begin by briefly summarizing the foregoing articles: although Kant and the Yijing employ different types of architectonic reasoning, the two systems can both be described in terms of three “levels” of elements. Starting at an unnumbered level devoid of any element (the tao or thing in itself), the system proceeds by elaborating a key fourfold distinction (or “quaternity”) on the first level, a twelvefold distinction on the second level, and twelve quaternities (grouped in four quadrants, each with a set of three quaternities) on the third level. Each set of three quaternities (i.e., each quadrant) on the third level corresponds to one of the four “faculties” of the university, as elaborated in Kant’s book, The Conflict of the Faculties. Previous papers have examined the correlations between three key quaternities that Kant defends in relation to each of three faculties (philosophy, theology, and law) and the 12 gua that correspond to that faculty in the Compound Yijing. The final step is to explore the fourth quaternity on the third level, the 12 gua corresponding to the medical faculty. The “idea of reason” in Kant’s metaphysics that guides this wing of the comparative analysis is freedom, and the ultimate purpose of this faculty of the university is to train doctors to care for people’s physical well-being, as free agents embedded in nature. But this paper will focus only on the four gua that correspond to four basic concepts in Kant’s theory of medicine. The two quaternities in the “yin-yang” (medical) quadrant of the Compound Yijing that will be skipped here are as follows.
First, Kant’s account of the idea of freedom itself, which gives rise to the area of traditional metaphysics known as rational cosmology, comes in the first Critique’s Dialectic, in the section on the Antinomy of Reason (CPR A405-567/B432-595). There he examines four irresolvable issues: whether the world has a beginning in time; whether composite substances consist of simple parts; whether a causality of freedom operates in the natural world; and whether an absolutely necessary being exists. Later I will argue that these correspond to the quaternity consisting of gua 15, 22, 36, and 52.
In this paper I argue that the idiosyncrasy of linguistic competence fosters semantic conceptions in which meanings are taken for granted, such as the one that Quine calls ‘uncritical semantics’ or ‘the myth of the museum’. This is due to the degree of automaticity in the use of language which is needed for fluent conversation. Indeed, fluent conversation requires that we speakers instinctively associate each word or sentence with its meaning (or linguistic use), and instinctively resort to the conceptual repertoire of our language, without calling into question that the meaning of a particular word, or the conceptual repertoire of our language, could have been different from what they are. This habit of taking meanings for granted, inherent in our linguistic ability, sometimes interferes with our semantic research, hampering it. In order to illustrate this problem, I pinpoint four places in Quine’s work where, despite his acknowledged analytical rigour, and despite his congenital aversion to the habit of taking meanings for granted, he himself appears to slip into this habit inadvertently.
This paper provides a critical overview of the realist current in contemporary political philosophy. We define political realism on the basis of its attempt to give varying degrees of autonomy to politics as a sphere of human activity, in large part through its exploration of the sources of normativity appropriate for the political and so distinguish sharply between political realism and non-ideal theory. We then identify and discuss four key arguments advanced by political realists: from ideology, from the relationship of ethics to politics, from the priority of legitimacy over justice and from the nature of political judgement. Next, we ask to what extent realism is a methodological approach as opposed to a substantive political position and so discuss the relationship between realism and a few such positions. We close by pointing out the links between contemporary realism and the realist strand that runs through much of the history of Western political thought.
Following the development of the selectionist theory of the immune system, there was an attempt to characterize many biological mechanisms as "selectionist" as opposed to "instructionist." But this broad definition would group Darwinian evolution, the immune system, embryonic development, and Chomsky's language-acquisition mechanism as all being "selectionist." Yet Chomsky's mechanism (and embryonic development) are significantly different from the selectionist mechanisms of biological evolution or the immune system. Surprisingly, there is a very abstract way, using two dual mathematical logics, to make the distinction between genuinely selectionist mechanisms and what are better called "generative" mechanisms. This note outlines that distinction.
The four-colour theorem seems to be generalizable as follows: a four-letter alphabet is sufficient to encode unambiguously any set of well-orderings, including a geographical map, the “map” of any logic (and thus that of all logics), or the DNA plan of any living being. The corresponding maximally general conjecture would then state: anything in the universe or in the mind can be encoded unambiguously by four letters. This can be formulated as a “four-letter theorem”, and one can search for a properly mathematical proof of the statement. It would imply the four-colour theorem, whose existing proof many philosophers and mathematicians believe to be not entirely satisfactory, for it is not a “human” proof but one unavoidably mediated by computers, since the necessary calculations fundamentally exceed human capabilities. It is, furthermore, rather unsatisfactory because it consists in enumerating and proving all cases one by one. Sometimes a more general theorem turns out to be much easier to prove, admitting a general “human” method, with the particular theorem that is too difficult to prove directly following as a corollary under certain simple conditions. The same approach is to be followed for the four-colour theorem, i.e. it is to be deduced more or less trivially from the “four-letter theorem”, once the latter is proved. References are only to classical and thus very well-known papers; their complete bibliographic descriptions are omitted.
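For finite data, at least, the bare encoding claim is uncontroversial: any byte string can be rewritten over a four-letter alphabet at two bits per letter, as with DNA's A/C/G/T. The following is my own minimal illustration of that finite case, not the paper's proposed construction.

```python
# Any byte string can be rewritten unambiguously over a four-letter
# alphabet (two bits per letter), as with DNA's A/C/G/T. My own
# minimal illustration of the finite case, not the paper's proposed
# "four-letter theorem".
ALPHABET = 'ACGT'

def encode(data: bytes) -> str:
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit chunks per byte
            out.append(ALPHABET[(byte >> shift) & 0b11])
    return ''.join(out)

def decode(text: str) -> bytes:
    out = bytearray()
    for i in range(0, len(text), 4):
        byte = 0
        for ch in text[i:i + 4]:
            byte = (byte << 2) | ALPHABET.index(ch)
        out.append(byte)
    return bytes(out)

assert decode(encode(b'map')) == b'map'     # round-trip is lossless
```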
This paper aims at bringing a new philosophical perspective to the current debate on the death penalty through a discussion of peculiar kinds of uncertainties that surround the death penalty. I focus on laying out the philosophical argument, with the aim of stimulating and restructuring the death penalty debate. I will begin by describing views about punishment that argue in favour of either retaining the death penalty (‘retentionism’) or abolishing it (‘abolitionism’). I will then argue that we should not ignore the so-called “whom-question”, i.e. “To whom should we justify the system of punishment?” I identify three distinct chronological stages to address this problem, namely, “the Harm Stage”, “the Blame Stage”, and “the Danger Stage”. I will also identify four problems arising from specific kinds of uncertainties present in current death penalty debates: (1) uncertainty in harm, (2) uncertainty in blame, (3) uncertainty in rights, and (4) uncertainty in causal consequences. In the course of examining these four problems, I will propose an ‘impossibilist’ position towards the death penalty, according to which the notion of the death penalty is inherently contradictory. Finally, I will suggest that it may be possible to apply this philosophical perspective to the justice system more broadly, in particular to the maximalist approach to restorative justice.
“The truth,” Quine says, “is that you can bathe in the same river twice, but not in the same river stage. You can bathe in two river stages which are stages of the same river, and this is what constitutes bathing in the same river twice. A river is a process through time, and the river stages are its momentary parts.” (Quine 1953, p. 65) Quine’s view is four-dimensionalism, and that is what Theodore Sider’s book is about. In Sider’s usage, four-dimensionalism is the view that, necessarily, anything in space and time has a distinct temporal part, or stage, corresponding to each time at which it exists (p. 59).
There is, among some scientists and philosophers, the idea that any theory that would allow time travel would introduce causal issues. These temporal paradoxes can be avoided by the Novikov self-consistency principle or by a variant of the many-worlds interpretation with interacting worlds. The world in which we live has, according to David Lewis, a Parmenidean ontology: "a manifold of events in four dimensions," and the occupants of the world are the 4-dimensional aggregates of stages - "temporal lines". The causal loops in backwards time travel involve events that appear to "come from nowhere," paradoxical "self-existent" objects or information, resulting in a bootstrap paradox. Many believe that causal loops are not impossible or unacceptable, but only inexplicable. DOI: 10.13140/RG.2.2.28792.70407.
This article analyses letters to the editor written on or about Muslims and printed in a British broadsheet newspaper. The pragma-dialectical theory of argumentation is applied as a model for explaining and understanding the arguments employed in the sampled letters. Our presentation of pragma-dialectical theory focuses on argumentative reasonableness. More specifically, we introduce the four dialectical stages through which any argument must pass and explain the ten rules of critical discussion that participants must follow throughout if they are to resolve the argument. The article focuses in particular on the letter writers' use of argument schemes, that is, the manner in which these writers use arguments to support their standpoints. We conclude by highlighting the role that unreasonable arguments can play in perpetuating racialized inequalities, and hence the importance of analysing argumentation.
‘Judgment’ is Brentano’s term for any mental state liable to be true or false. This includes not only the products of conceptual thought, such as belief, but also perceptual experiences, such as seeing that the window was left open. ‘Every perception counts as a judgment,’ writes Brentano (1874: II, 50/1973a: 209). Accordingly, his theory of judgment is not exactly a theory of the same phenomenon we today call ‘judgment,’ but of a larger class of phenomena, one (perhaps the main) species of which is what we call judgment. Even if we keep this in mind, though, the profound heterodoxy of Brentano’s theory of judgment is still striking. Brentano develops this heterodox theory in some detail already in the Psychology from an Empirical Standpoint (Brentano 1874/1973a). But he continued to work out its details, and various aspects of it, until his death. Many of the relevant articles, notes, and fragments were collected by Oskar Kraus in 1930 and published under the title Truth and Evidence (Brentano 1930/1966b). Kraus prefaces this volume with an elaborate reconstruction, of dubious plausibility, according to which Brentano’s accounts of judgment and truth went through four distinct stages. In reality, there is a unified underlying conviction underwriting Brentano’s work on both judgment and truth (see CHAP. 20 on the latter). Here I present the unified core of this highly original theory of judgment, which can be captured in terms of three main theses. The first is that, contrary to appearances, all judgments are existential judgments (§1). The second is that the existential force of judgment is indeed a force, or mode, or attitude – it does not come from the judgment’s content (§2). The third is that judgment is not a propositional attitude but an ‘objectual’ attitude (§3).
I propose a relevance-based independence axiom on how to aggregate individual yes/no judgments on given propositions into collective judgments: the collective judgment on a proposition depends only on people’s judgments on propositions which are relevant to that proposition. This axiom contrasts with the classical independence axiom, on which the collective judgment on a proposition depends only on people’s judgments on that same proposition. I generalize the premise-based rule and the sequential-priority rule to an arbitrary priority order over the propositions, instead of a dichotomous premise/conclusion order and a linear priority order, respectively. I prove four impossibility theorems on relevance-based aggregation. One theorem simultaneously generalizes Arrow’s Theorem (in its general and indifference-free versions) and the well-known Arrow-like theorem in judgment aggregation.
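The premise-based rule that the abstract generalizes can be sketched on the classic discursive-dilemma case (the votes below are my own toy numbers, not the paper's framework): take majority votes on the premises p and q, then derive the conclusion p ∧ q, rather than voting on the conclusion directly.

```python
# Toy sketch of the premise-based rule on the classic discursive
# dilemma (my own illustrative votes, not the paper's framework):
# three judges vote on premises p and q; the conclusion is p AND q.
votes = [                       # (p, q) for each individual
    (True,  True),              # judge 1: accepts p and q
    (True,  False),             # judge 2: accepts p, rejects q
    (False, True),              # judge 3: rejects p, accepts q
]

def majority(values):
    return sum(values) > len(values) / 2

p = majority([v[0] for v in votes])                       # True (2 of 3)
q = majority([v[1] for v in votes])                       # True (2 of 3)
premise_based = p and q                                   # True

conclusion_based = majority([a and b for a, b in votes])  # False (1 of 3)

assert premise_based != conclusion_based  # the two rules disagree
```

The relevance-based axiom lets the collective verdict on the conclusion depend on the judgments on p and q, exactly as the premise-based rule does; the classical independence axiom forbids this.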
We present a theory of human artistic experience and the neural mechanisms that mediate it. Any theory of art ideally has to have three components: the logic of art (whether there are universal rules or principles); the evolutionary rationale (why these rules evolved and why they have the form that they do); and the brain circuitry involved. Our paper begins with a quest for artistic universals and proposes a list of ‘Eight laws of artistic experience’ -- a set of heuristics that artists either consciously or unconsciously deploy to optimally titillate the visual areas of the brain. One of these principles is a psychological phenomenon called the peak shift effect: if a rat is rewarded for discriminating a rectangle from a square, it will respond even more vigorously to a rectangle that is longer and skinnier than the prototype. We suggest that this principle explains not only caricatures but many other aspects of art. For example, an evocative sketch of a female nude may be one which selectively accentuates those feminine form-attributes that allow one to discriminate it from a male figure; a Boucher, a Van Gogh, or a Monet may be a caricature in ‘colour space’ rather than form space. Even abstract art may employ ‘supernormal’ stimuli to excite form areas in the brain more strongly than natural stimuli would. Second, we suggest that grouping is a very basic principle. The different extrastriate visual areas may have evolved specifically to extract correlations in different domains, and discovering and linking multiple features into unitary clusters -- objects -- is facilitated and reinforced by direct connections from these areas to limbic structures. In general, when object-like entities are partially discerned at any stage in the visual hierarchy, messages are sent back to earlier stages to alert them to certain locations or features, in order to look for additional evidence for the object.
Finally, given constraints on the allocation of attentional resources, art is most appealing if it produces heightened activity in a single dimension rather than redundant activation of multiple modules. This idea may help explain the effectiveness of outline drawings and sketches, the savant syndrome in autistic individuals, and the sudden emergence of artistic talent in fronto-temporal dementia. In addition to these three basic principles, we propose five others, constituting a total of ‘eight laws of aesthetic experience’.
This paper examines the evolution of Husserl’s philosophy of non-intuitive intentions. The analysis has two stages. First, I expose a mistake in Husserl’s account of non-intuitive acts from his 1901 Logical Investigations. I demonstrate that Husserl employs the term “signitive” too broadly, as he concludes that all non-intuitive acts are signitive. He states that not only meaning acts, but also the contiguity intentions of perception, are signitive acts. Second, I show how Husserl, in his 1913/14 Revisions to the Sixth Logical Investigation, amends his 1901 theory of non-intuitive acts, which he now calls “empty” intentions. There he accurately distinguishes empty meaning acts from the empty intentions of perception. In the conclusion, I reveal how Husserl’s alterations to his theory of non-intuitive intentions can inform our understanding of a larger shift in his philosophy.
Although the theory of the assertoric syllogism was Aristotle’s great invention, one which dominated logical theory for the succeeding two millennia, accounts of the syllogism evolved and changed over that time. Indeed, in the twentieth century, doctrines were attributed to Aristotle which lost sight of what Aristotle intended. One of these mistaken doctrines concerns the very form of the syllogism: that a syllogism consists of three propositions containing three terms arranged in four figures. Another was that a syllogism is a conditional proposition deduced from a set of axioms. There is even unclarity about what the basis of syllogistic validity consists in. Returning to Aristotle’s text, and reading it in the light of commentary from late antiquity and the middle ages, we find a coherent and precise theory which shows all these claims to be based on a misunderstanding and misreading.
Judgment aggregation theory, or rather, as we conceive of it here, logical aggregation theory, generalizes social choice theory by having the aggregation rule bear on judgments of all kinds instead of merely preference judgments. It derives from Kornhauser and Sager’s doctrinal paradox and List and Pettit’s discursive dilemma, two problems that we distinguish emphatically here. The current theory has developed from the discursive dilemma, rather than the doctrinal paradox, and the final objective of the paper is to give the latter its own theoretical development along the lines of recent work by Dietrich and Mongin. However, the paper also aims at reviewing logical aggregation theory as such, and it covers impossibility theorems by Dietrich, Dietrich and List, Dokow and Holzman, List and Pettit, Mongin, Nehring and Puppe, Pauly and van Hees, providing a uniform logical framework in which they can be compared with each other. The review goes through three historical stages: the initial paradox and dilemma, the scattered early results on the independence axiom, and the so-called canonical theorem, a collective achievement that provided the theory with its specific method of analysis. The paper goes some way towards philosophical logic, first by briefly connecting the aggregative framework of judgment with the modern philosophy of judgment, and second by thoroughly discussing and axiomatizing the ‘general logic’ built in this framework.
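The doctrinal paradox that this literature starts from can be made concrete with a small sketch (illustrative only; the judge profile and variable names are my own, not drawn from the paper): proposition-wise majority voting over individually consistent judgment sets on two premises and their conjunction can yield a logically inconsistent collective judgment set, while the premise-based rule restores consistency by aggregating only the premises and deriving the conclusion.

```python
# Three judges each hold a logically consistent judgment set on
# premises p, q and the conclusion (p and q).
judges = [
    {"p": True,  "q": True,  "p_and_q": True},
    {"p": True,  "q": False, "p_and_q": False},
    {"p": False, "q": True,  "p_and_q": False},
]

def majority(prop):
    """Collective judgment on prop by simple majority."""
    yes = sum(j[prop] for j in judges)
    return yes > len(judges) / 2

# Proposition-wise majority: accepts p and q but rejects (p and q),
# so the collective judgment set is logically inconsistent.
collective = {prop: majority(prop) for prop in ["p", "q", "p_and_q"]}
print(collective)  # {'p': True, 'q': True, 'p_and_q': False}

# Premise-based rule: aggregate only the premises, then derive
# the conclusion logically -- the collective set is now consistent.
premise_based_conclusion = majority("p") and majority("q")
print(premise_based_conclusion)  # True
```

The inconsistency under proposition-wise majority is exactly the tension between the independence axiom and collective rationality that the impossibility theorems surveyed above formalize.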
I here defend a theory consisting of four claims about ‘property’ and properties, and argue that they form a coherent whole that can solve various serious problems. The claims are: (1) ‘property’ is defined by the principles (PR): ‘F-ness/Being F/etc. is a property of x iff F’ and (PA): ‘F-ness/Being F/etc. is a property’; (2) the function of ‘property’ is to increase the expressive power of English, roughly by mimicking quantification into predicate position; (3) property talk should be understood at face value: apparent commitments are real and our apparently literal use of ‘property’ is really literal; (4) there are no properties. In virtue of (1)–(2), this is a deflationist theory, and in virtue of (3)–(4), it is an error theory. (1) is fleshed out as a claim about understanding conditions, and it is argued at length, by going through a number of examples, that it satisfies a crucial constraint on meaning claims: all facts about ‘property’ can be explained, together with auxiliary facts, on its basis. Once claim (1) has been expanded upon, I argue that the combination of (1)–(3) provides the means for handling several problems: they help give a happy-face solution to what I call the paradox of abstraction, they form part of a plausible account of the correctness of committive sentences, and, most importantly, they help respond to various indispensability arguments against nominalism.
If ordinary particulars are bundles of properties, and if properties are said to be universals, then three well-known objections arise: no particular can change, all particulars have all of their properties essentially (even the most insignificant ones), and there cannot be two numerically distinct but qualitatively indiscernible particulars. In this paper, I try to make a little headway on these issues and see how the objections can be met, if one accepts a certain view about persistence through time and across possible worlds – namely, four-dimensionalism and its modal analogue. The paper is especially devoted to the second and third of the three objections.
A RELATIVISTIC THEORY OF PHENOMENOLOGICAL CONSTITUTION: A SELF-REFERENTIAL, TRANSCENDENTAL APPROACH TO CONCEPTUAL PATHOLOGY. (Vol. I: French; Vol. II: English)

Steven James Bartlett

Doctoral dissertation director: Paul Ricoeur, Université de Paris. Other doctoral committee members: Jean Ladrière and Alphonse de Waehlens, Université Catholique de Louvain. Defended publicly at the Université Catholique de Louvain, January, 1971.

Université de Paris X (France), 1971. 797pp.

The principal objective of the work is to construct an analytically precise methodology which can serve to identify, eliminate, and avoid a certain widespread _conceptual fault_ or _misconstruction_, called a "projective misconstruction" or "projection" by the author. It is argued that this variety of error in our thinking (i) infects a great number of our everyday, scientific, and philosophical concepts, claims, and theories, (ii) has largely been undetected, and (iii), when remedied, leads to a less controversial and more rigorous elucidation of the transcendental preconditions of human knowledge than has traditionally been possible. The dissertation identifies, perhaps for the first time, a _projective_ variety of self-referential inconsistency, and proposes an innovative, self-reflexive approach to transcendental argument in a logical and phenomenological context. The strength of the approach lies, it is claimed, in the fact that a rejection of the approach is possible only on pain of self-referential inconsistency. The argument is developed in the following stages: a general introduction identifies the central theme of the work, defines the scope of applicability of the results reached, and sketches the direction of the studies that follow. The preliminary discussion culminates in a recognition of the need for a _critique of impure reason_.
The body of the work is divided into two parts. Section I seeks to develop a methodology, on a purely formal basis, which is, on the one hand, capable of being used to study the transcendental foundations of the special sciences, including its own proper transcendental foundation. On the other hand, the methodology proposed is intended as a diagnostic and therapeutic tool for dealing with _projective_ uses of concepts. The approach initiates an analysis of concepts from a perspective which views _knowledge as coordination_. Section I describes formal structures that possess the status of preconditions in such a coordinative account of knowledge. Special attention is given to the preconditions of _identifying reference_ to logical particulars. The first section attempts, then, to provide a self-referential, transcendental methodology which is essentially revisionary in that it is motivated by a concern for conceptual error-elimination. Phenomenology, considered in its unique capacity as a self-referential, transcendental discipline, is of special relevance to the study. Section II accordingly examines a group of concepts which come into question in connection with the central theme of _phenomenological constitution_. The "_de-projective methodology_" developed in Section I is applied to these concepts that have a foundational importance in transcendental phenomenology. A translation is, in effect, proposed from the language of consciousness to a language in which preconditions of referring are investigated. The result achieved is the elimination of self-defeating, projective concepts from a rigorous, phenomenological study of the constitutive foundations of science. The dissertation was presented in a two-volume, double-language format for the convenience of French and English researchers. Each volume contains an analytical index.
In a recent publication in this journal, Asle Kiran and Peter-Paul Verbeek (hereafter K&V) argue that extension theory and the notion of trust it implies are flawed. In this commentary, I defend extension theory against their critique. I first briefly introduce extension theory, then reconstruct K&V’s five arguments against extension theory and demonstrate that four of their five arguments are misplaced.
Algorithmic systems and predictive analytics play an increasingly important role in various aspects of modern life. Scholarship on the moral ramifications of such systems is in its early stages, and much of it focuses on bias and harm. This paper argues that in understanding the moral salience of algorithmic systems it is essential to understand the relation between algorithms, autonomy, and agency. We draw on several recent cases in criminal sentencing and K–12 teacher evaluation to outline four key ways in which issues of agency, autonomy, and respect for persons can conflict with algorithmic decision-making. Three of these involve failures to treat individual agents with the respect they deserve. The fourth involves distancing oneself from a morally suspect action by attributing one’s decision to take that action to an algorithm, thereby laundering one’s agency.
So-called theories of well-being (prudential value, welfare) are under-represented in discussions of well-being. I do four things in this article to redress this. First, I develop a new taxonomy of theories of well-being, one that divides theories in a more subtle and illuminating way. Second, I use this taxonomy to undermine some misconceptions that have made people reluctant to hold objective-list theories. Third, I provide a new objective-list theory and show that it captures a powerful motivation for the main competitor theory of well-being (the desire-fulfilment theory). Fourth, I try to defuse the worry that objective-list theories are problematically arbitrary and show how the theory can and should be developed.