I argue that medieval solutions to the limit decision problem imply four-dimensionalism, i.e. the view according to which substances that persist through time are extended through time as well as through space, and have different temporal parts at different times.
The paper discusses the version of entity realism presented by Ian Hacking in his book, Representing and Intervening. Hacking holds that an ontological form of scientific realism, entity realism, may be defended on the basis of experimental practices which involve the manipulation of unobservable entities. There is much to be said in favour of the entity realist position that Hacking defends, especially the pragmatist orientation of his approach to realism. But there are problems with the position. The paper explores two issues that reflect negatively on Hacking's version of the entity realist position. The first issue relates to the role of description in fixing the reference of theoretical terms. The second issue relates to Hacking's claim that the argument for entity realism based on experiment is a different kind of argument from the standard argument for scientific realism based on the success of science.
We steer our lives permanently by YES/NO decisions, and even if we most often no longer distinguish the elementary intermediary steps of such decisions, they form stereotyped chains that, once triggered, run unconsciously, facilitating our daily activities. We actually lead our lives by conscious decisions, each of which establishes our future trajectory. The YES/NO dipole is also the elemental evaluation and decision unit in informational transmission/reception equipment and lines and in computers, respectively. Based on a binary probabilistic system, it is defined as the unit of information (bit). We therefore operate as an informational system, and we actually live in a bipolar universe which is fundamentally informational. Indeed, the laws of nature and its equilibrium or steady-state conditions are based on bipolar units with opposite characteristics, such as action/reaction, attraction/rejection, gravity/anti-gravity, matter/antimatter, entropy/anti-entropy, to enumerate just a few examples. As part of this bipolar universe, we are also bipolar entities connected to information and matter. Starting from the informational features of the human being, seven informational components are identified, forming the informational system of the human body and distinguished by their different functions, reflected at the conscious level through the centers Iknow (memory, including whole life experience), Iwant (decision center), Ilove (emotions), Iam (body status), Icreate (informational genetic transmitter), Icreated (genetic generator inherited from parents) and Ibelieve, which is the gateway to the antientropic component, favorable to maintaining the structure and functioning of life. Taking into account the characteristics of these centers, the life cycle is discussed and suitable conclusions are deduced concerning an optimal, active lifestyle that would contribute to a successful life, aging and destiny.
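For reference, and as a standard definition rather than something spelled out in the abstract, the information carried by a binary YES/NO alternative is measured by the Shannon entropy, which reaches exactly one bit when the two outcomes are equally likely:

```latex
H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p), \qquad H\!\left(\tfrac{1}{2}\right) = 1 \text{ bit}.
```

An unbiased YES/NO decision thus carries one bit, which is presumably the sense in which the abstract treats the YES/NO dipole as the elemental informational unit.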
This article defends Marjorie Suchocki's position against two main objections raised by David E. Conner. Conner objects that God as a single actual entity must be temporal because there is succession in God's experience of the world. The reply is that time involves at least two successive occasions separated by perishing, but in God nothing ever perishes. Conner also objects that Suchocki's personalistic process theism is not experiential but is instead theoretical and not definitive. The reply is that his dismissal of Part V of PR is arbitrary, the interpretation of all experience is theoretical, and no metaphysical interpretations are absolutely definitive, including PR as a whole. Also, Conner ignores religious experience.
In recent years we may observe increasing interest in the development of social innovation, both in theory and in the practice of responding to social problems and challenges. One of the crucial challenges at the beginning of the 21st century is population ageing. Various new and innovative initiatives, programs, schemes, and projects responding to the negative consequences of this demographic process are emerging around the world. However, social theories related to ageing are still insufficiently combined with these new practices, social movements, organisational models, and institutions. Many scholars are still using notions and tools from classical theories of social gerontology or the sociology of ageing, such as disengagement theory, activity theory, and successful and productive ageing. Such theories do not sufficiently explain ageing in the context of, for example, the broad use of information and communications technologies including robotics and automation, new healthcare and long-term care models, advancements in the development and governance of age-friendly environments, and the public engagement of older adults in the co-production of services delivered by public, private, non-governmental and non-formal entities.
Physicalism as a metaphysical or ontological concept has maintained a dominant position from the second half of the last century to the present day. The claim that everything is physically constituted often accompanies microphysical reductionism, which assumes the existence of fundamental laws to which everything is reducible. In this context, a question arises regarding the status and possible autonomy of the laws of the special sciences. The article focuses on the basic philosophical discussions among strong, weak, and non-reductive physicalism, which treat the laws of the special sciences in different ways, but none of which can be considered sufficiently convincing and successful. The article seeks to prove the existence of a universal mechanism that leads to the emergence of new and complex entities and of regularities governing their behaviour, thus justifying the autonomous status of the special sciences and their laws.
Is vision merely a state of the beholder's sensory organ which can be explained as an immediate effect caused by external sensible objects? Or is it rather a successive process in which the observer, actively scanning the surrounding environment, plays a major part? These two general attitudes towards visual perception were both already developed by ancient thinkers. The former is embraced by natural philosophers (e.g., atomists and Aristotelians) and is often labelled "intromissionist", based on their assumption that vision is an outcome of the causal influence exerted by an external object upon a sensory organ receiving an entity from the object. The latter attitude, to vision as a successive process, is rather linked to the "extramissionist" theories of the proponents of geometrical optics (such as Euclid or Ptolemy), who suggest that an entity – a visual ray – is sent forth from the eyes to the object. The present paper focuses on the contributions to this ancient controversy proposed by some 13th-century Latin thinkers. [...]
Arthur Kaufmann is one of the most prominent figures among contemporary philosophers of law in German-speaking countries. For many years he was the director of the Institute of Philosophy of Law and Computer Sciences for Law at the University of Munich; presently, he is a retired professor of this university. Rare in contemporary legal thought, Arthur Kaufmann's philosophy of law is one with the highest ambitions — it aspires to pinpoint the ultimate foundations of law by explicitly proposing an ontology, a general theory of knowledge and a concept of a person. Kaufmann's work derives, first of all, from the thinking of Gustav Radbruch, his teacher, and then from the ideas of Karl Engisch and Hans-Georg Gadamer. The philosophy undertakes to pursue the ultimate foundation of law, law being understood by Kaufmann, first of all, as a "concrete judgement", that is, what is right in a concrete situation. Justice belongs to the essence of law, and "unjust law" is a contradictio in adiecto. Kaufmann opposes all those theories which adopt legal norms (Gesetz) as the only foundation for establishing just law (Recht). In Kaufmann's opinion, such theories are powerless in the face of all types of distortions of law rendered by political forces. He suggests that the basic phenomenon which needs to be explained, and which cannot be disregarded by a philosopher of law, is so-called "legal lawlessness" ("Gesetzliches Unrecht"), which formed a part of life experience for the people of twentieth-century totalitarian states. It proved "with the accuracy of scientific experiment" that the reality of law consists of something more than bare conformity with legal norms. The existence of lex corrupta indicates that law contains something "non-dispositive" which requires acknowledgment by both law-maker and judge. Kaufmann, accepting the convergent concept of truth and cognition, assumes that "non-dispositive" content, emerging as the conformity of a number of cognitive acts of different subjects (inter-subjective communicativeness and verifiability), indicates the presence of being in this cognition. The questions "What is law?" and "What are the principles of a just solution?" lead straight to the ontology of law, to the question about the ontological foundations of law. Kaufmann discerns the ontological foundations of law in the specifically understood "nature of things" and, ultimately, in a "person". He proposes a procedural theory of justice founded on a "person". In my work, I undertake to reconstruct the train of thought which led Kaufmann to the recognition of a "person" as the ontological foundation of law. In the first part, the conception of philosophy adopted by Kaufmann, the initial characteristics of law (the reality which is the subject of analysis), as well as the requirements for a proper philosophical explanation of law posed by Kaufmann are introduced. In the second, Kaufmann's reconstruction of the process of the realisation of law is presented. Next, the conception of analogy which Kaufmann uses when explaining law is analyzed. In the fourth part, Kaufmann's conception of the ontological foundations of law is discussed. A critical analysis is carried out in which I demonstrate that the theory of the ontological foundation of law proposed by Kaufmann, and the concept of a person included in it, do not allow a satisfactory explanation of the phenomenon of "legal lawlessness" and lead to a number of difficulties in the philosophical explanation of law.
Finally, the perspectives for a proper formulation of the issue of the ontological foundations of law are drafted in the context of the analyzed theory. My interest is centered on the conception of philosophy adopted by Kaufmann, according to which the existence of reality is inferred on the basis of a certain configuration of the content of consciousness, whereas at the point of departure of the philosophy of law the data to be explained is a certain process, basically a process of cognition, while reality appears only as a condition for the possibility of the occurrence of that process. I wish to argue that the difficulties which appear in the explanation of law are a consequence of the assumed fundamental philosophical solutions, which seem to be characteristic, though usually not assumed explicitly, of the philosophy and theory of law dominant at present in continental Europe. Thereby, I wish to show the significance of ontological and epistemological solutions for the possibility of a proper formulation of the problems posed by the philosophy and theory of law. Kaufmann proclaims himself in favour of a philosophy which poses questions about the ultimate foundations of the understanding of reality. In epistemology, he assumes that answers to the questions "What is reality like?" and, ultimately, "What is real?" are inferred on the basis of the uniformity of the cognitive acts of different subjects. Cognition of reality is accomplished exclusively through the content of conceptual material. The two fundamental questions posed by the philosophy of law are "What is just law?" and "How is just law enacted?" The latter is a question about the process of reaching a solution to a concrete case. Since, in Kaufmann's opinion, law does not exist apart from the process of its realisation, an answer to the question about the manner of realisation of law is of fundamental significance for answering the question "What is law?" and for the explanation of the question about the ontological grounding of law, which is, as well, the foundation of justice. The proper solution has to take into account the moment of "non-dispositive" content of law and its positiveness understood as reality, and, at the same time, it has to point to the principles of the historical transformation of that content. Law, in the primary meaning of the word, always pertains, in Kaufmann's opinion, to a concrete case. A legal norm is solely the "possibility" of law, and the entirely real law is ipsa res iusta, that which is just in a given situation. Determination of what is just takes place in a certain type of process performed by a judge (or by a man confronted with a choice). Kaufmann aims to reconstruct this process. A question about the ontological foundation of law is a question about the ontological foundations of this process. In the analyzed theory it is formulated as a question about the transcendental conditions necessary for the possibility of the occurrence of the process: how reality should be thought of in order to make possible the reconstructed process of the realisation of law. Kaufmann rejects the model of finding a concrete solution based on simple subsumption and proposes a model in which concrete law ensues, based on inference by analogy, through the process of "bringing to conformity" that which is normative with that which is factual. Kaufmann distinguishes three levels in the process of the realisation of law.
On the highest level there are the fundamental legal principles, on the second, legal norms, and on the third, concrete solutions. The fundamental principles of law are general inasmuch as they cannot be "applied" directly to concrete conditions of life; however, they play an important part in establishing norms. A judge encounters a concrete situation and a system of legal norms. A life situation and norms are situated on inherently different levels of factuality and normativeness. In order to arrive at definite law, both a norm (or system of norms) and a life situation (Lebenssachverhalt) should undergo a kind of "treatment" which allows them to be brought into mutual conformity. Legal norms and definite conditions of life come together in the process of analogical inference, in which the "factual state" ("Tatbestand"), which represents a norm, and the "state of things" ("Sachverhalt"), which represents a specific situation, are constructed. A "factual state" is a sense interpreted from a norm with respect to specific conditions of life. The "state of things" is a sense constructed on the basis of concrete conditions of life with respect to norms (a system of norms). Legal norms and concrete conditions of life meet in one common sense established during the process of the realisation of law. Mutatis mutandis, the same applies to the process of composition of legal norms: as the acquisition of concrete law consists in a mutual "synchronization" of norms and concrete conditions of life, so the acquisition of legal norms consists in bringing to conformity fundamental principles and possible conditions of life. According to Kaufmann, both of these processes are based on inference through analogy. As this inference is the heart of these processes, it is simultaneously the foundation for finding just law and justice. How does Kaufmann understand such an inference? As the basis of all justice he assumes a specifically interpreted distributive justice grounded on proportionality. Equality of relations is required between life conditions and their normative qualification. Concrete conditions of life are ascribed normative qualification not through the simple application of a general norm. Rather, when we look for a solution we go from one concrete, normatively qualified case to another, through already known "applications" of norms to a new "application". The relation between life conditions and their normative qualification has to be proportional to other, earlier or possible (thought-of) assignments of that which is factual to that which is normative. Law as a whole does not consist of a set of norms, but only of a unity of relations. Since law is a relative unity of a norm and conditions of life based on proportion, in order to explain law in a philosophical manner the question about the ontological basis of this unity has to be asked. What is it that makes the relation between a norm and conditions of life "non-dispositive"? What is the basis for such an interpretation of a norm and a case which makes it possible to bring a norm and conditions of life into mutual "conformity"? This is a question about a third thing (next to norms and conditions of life) with respect to which the relative identity between a norm and conditions of life occurs, about the intermediary between that which is normative and that which is factual, which provides for the process of establishing norms as well as finding solutions.
It is the "sense" in which the idea of law or legal norm and conditions of life have to be identical to be brought to mutual "conformity". In Kaufmann's opinion such a sense is nothing else but the "nature of things" which determines the normative qualification of the reality. Since establishment of this "sense" appears to be "non-dispositive" and controlled inter-subjectively (namely, other subjects will reach a similar result) so, in conformity with the convergent concept of truth, the "nature of things" must be assigned a certain ontological status. According to Kaufmann this is a real relation which occurs between being and obligation, between the conditions of life and normative quality. However, it should be underlined that from the point of view of the analyzed system the "nature of things" is a correlate of constructed sense, a result of a construction which is based on the principle of consistent understanding of senses ("non-normative" and "normative") and is not a reality which is transcendent against the arrangement of senses. In Kaufmann's theory, inference from analogy appears to be a process of reshaping the concepts (senses) governed by tendency to understand the contents appearing in relations between that which is factual and that which is normative in a consistent way. The analogical structure of language (concepts) and recognition of being as composed of an essence and existence is an indispensable requirement for the possibility of the realisation of law, based on specifically understood inference from analogy. It is necessary to assume a moment of existence without content which ensures unity of cognition. Existence emerges thus as a condition of the possibility of cognition. According to Kaufmann, the "nature of things" is the heart of inference through analogy and the basis for establishment of finding of law. Inference from the "state of things" to a norm or from a norm to the "state of things" always means inference through the "nature of things". The "nature of things" is the proper medium of objective legal sense sought in every cognition of law. In Kaufmann's view, the question whether the "nature of things" is ultima ratio of interpretation of law or is only a means of supplement gaps in law or whether it is one of the sources of law, is posed wrongly. The "nature of things" serves neither to supplement the gaps nor is it a source of law as, for example, a legal norm may be. It is a certain kind of "catalyst" necessary in every act of making law and solving a concrete case. Owing to "nature of things" it is possible to bring to a mutual conformity the idea of law and possible conditions of life or legal norms and concrete conditions of life. In Kaufmann's conception the "nature of things" is not yet the ultimate basis for understanding the "non-dispositiveness" of law. The relation between obligation and being is determined in the process of the realisation of law. Both the process itself and that which is transformed in this process are given. A question about the ontological bases of "material" contents undergoing "treatment" in the process of the realisation of law and about being which is the basis of regularity of the occurrence of the process arises. Only this will allow an explanation that the result of the process is not optional. Thus, a question about reality to which law refers and about the subject realising the law has to be formed. 
To this, Kaufmann gives the following answer: that which is missing is man, not "empirical man" but man as a "person", understood as a set of relations between man and other people and things. A "person" is the intermediary between those things which are different — norm and case — which are brought to conformity. A "person" is that which is given and permanent in the process of the realisation of law. It determines the content of law and is the "subject" of law; this aspect is described by Kaufmann as the "what" of the process of the realisation of law. A "person" consists of precisely those relations which undergo "treatment" in the process. On the other hand, a "person" is "a place" in which the processes of the realisation of law occur; it is the "how" of normative discourse, that which determines the procedure of the process while being "outside" of it. This aspect of a "person" is connected with the formal moment of law. A "person", being at the same time the "how" and the "what" of the process of the realisation of law, is also, to put it differently, a structural unity of relation and that which constitutes this relation (a unity of relatio and relata). According to this approach a "person" is neither an object nor a subject. It exists only "in between". It is not a substance. Law is the relation between being and obligation. That which is obligatory is connected with that which is general. That which is general does not exist on its own; it is not completely real. Accordingly, a "person" as such is also not real. It is relational, dynamic and historical. A "person" is not a state but an event. In Kaufmann's opinion, such a concept of a "person" helps to avoid the difficulties connected with the fungibility of law in classical legal positivism. A "person" is that which is given, which is not at free disposal and which secures the moment of "non-dispositiveness" of law. Kaufmann concludes: "The idea (»nature«) of law is either the idea of a personal man or is nothing". The theory points to the structure of realising law and explains the process of adapting general legal norms to a concrete situation. The analysis has shown, however, that this theory does not give a satisfactory answer to the question about the ultimate foundations of law. It seems that in the analyzed theory the understanding of the human being takes place through the understanding of law. What is good for man as a "person", what is just, what a "person" deserves, may be determined only against the existing system of law. A "person" adopted as the basis of law is a reality postulated in the analysis of the process of the realisation of law. It is a condition of the possibility of this process (explaining, on the one hand, its unity and, on the other, the non-dispositive moments stated in this process). A "person" in the discussed theory is entirely defined by the structure of law; it can be nothing more than that which is given in law, what law refers to, what law is about. The being which is a "person" is constituted by relations between people and objects, relations based on fundamental links between norms and conditions of life established in the process of bringing them to conformity. It has to be assumed that man as a "person" is a subject of law only insofar as the realisation of law "treats" given senses according to their current configuration. The system of law is the starting point, and it describes in content what man is as a "person". Moreover, being a "person" is the condition for entering legal relations.
Consistently, Kaufmann writes that "empirical man" is not the subject of law; man is not by nature a "person". People become "persons" due to the fact that they acknowledge each other as "persons" — acknowledging, at the same time, law. This acknowledgement is a condition of the existence and possibility of the occurrence of the process of the realisation of law and of the constitution of legal relations which ultimately constitute a "person". Kaufmann assumes that law tends towards a moral aim: it may and must create an external freedom, without which the internal freedom to fulfil moral obligations cannot develop. However, this postulate is not based on the necessary structure of the human being. From the point of view of his system, it is nothing more than a condition for the possibility of the occurrence of the process of the realisation of law — lack of freedom would destroy the "how" of this process. Thus, the postulate to protect the freedom of personal acts has to be interpreted, in accordance with the analyzed theory, as a postulate whose fulfilment aims ultimately at the accomplishment of the very process of the realisation of law itself, and not at the realisation of a given man. Kaufmann considers a "person" to be the element which unites the system of law as a whole. Law is a structure of relations which are interdependent and inter-contingent. Consequently, a "person" which is to form the ontological basis of law has to be an entity consisting of all relations. Being also the "how" of the process of the realisation of law, if a "person" is to warrant its unity, it has to be a common source for all procedures. Hence, a single "person" would constitute the subject of law. Man appears to be only a moment of a certain entirety, the realisation of which should be the aim of his actions. Law, creating a "person" as the object and subject of law, becomes the primary entity. In the analyzed theory, the basis for the determination of the aims which law sets for man is not the allocation of man-as-subject to something which improves him; rather, such a relation is only just constituted by law. A question appears: why should the aims set in law also be the aims of "empirical man"? Why is this "empirical man" to be punished in the name of a "person" understood in such a way? If, however, it is assumed that what man is is determined by a system which is superior to him, then man has to be understood only as a part of a whole and there are no grounds to prohibit the instrumental treatment of man, and so the road to all aspects of totalitarianism might be opened. A problem arises concerning the application of the created theory to the reality which it purports to explain. Ultimately, in his theory Kaufmann does not give any systemic grounds for a radical questioning of the validity of any legal norms. Every new norm becomes an equal part of the system of norms. It is only its interpretation and application to given conditions of life that may be disputable; however, this applies to all norms without exception. A cohesive interpretation of norms and applications is necessary and sufficient for the acquisition of just law. New norms have to be interpreted in the light of others; correspondingly, the other norms require reinterpretation in the light of the new ones. A contradiction in the interpretation of a norm does not form a basis for questioning norms, but may serve only to question the manner of their interpretation (understanding).
Therefore, no grounds exist to judge any legal norm as criminal or unjust and, in consequence, to denounce as "legal lawlessness" any consistently realised system based on formally, properly established norms. As law and a "person" do not exist without the process of the realisation of law, the role of legal safety becomes crucial as the condition for the possibility of the occurrence of that process. Denying legal safety would be tantamount to the negation of law in general (also of moral law), as the negation of safety takes away, at the same time, the basis for the occurrence of the process of the realisation of law. Moreover, any lack of legal safety would also mean the lack of a basis for the existence of man as a "person". Kaufmann's thesis that civil disobedience is legitimized only when it has a chance of success, while consistent with his concept of the foundations of law, seems to point directly to conclusions which deny the facts under consideration and, doubtless, Kaufmann's own intentions, since it would have to be assumed that there are accordingly no grounds to question a legal system in force based on violence which secures its operation. Force, finally, seems to determine which of the mutually irreconcilable normative systems constitutes law and which does not. A legitimate position is one which leads to success; it is the weaker system which is negated. If so, then basically the violent imposition of law is not an act directed against the law in force but, on the contrary, a realisation of law. In the context of the new system, the former system of law may be spoken of as unjust solely in the sense of being incapable of being consistently united with the new one. At the base, however, ultimately lies force, which reaffirms differences and excludes certain norms and their interpretations from the process of the realisation of law. Kaufmann was aiming at grounding that which is "non-dispositive" in a certain given framework of interpretation. Nevertheless, he does not provide foundations for the understanding of the phenomena which he undertakes to explain at the point of departure. Instead of explaining them, the theory negates the possibility of their existence. The reality postulated with regard to the "non-dispositive" moments of the reconstructed process of acquiring law consists of a specifically understood "person", which appears in Kaufmann's conception as a condition of the possibility of the realisation of law. On this approach, the understanding of a "person" can only be a function of law. To understand "legal lawlessness" and the foundations of justice it is necessary to look for a theory of law in which the understanding of man as a "person" and of being is not a function of the understanding of law (in which a "person" is not only a condition for the possibility of the reconstructed process of the realisation of law, or for the possibility of cognitive processes). It seems necessary to start from a theory of being and of a "person" based on broader experience than that assumed by Kaufmann, and to reconstruct the ontological foundations of the process of the realisation of law only in such a perspective. Kaufmann points out that that to which law refers is ipsa res iusta, a concrete relation of man to other people and things. This relation, in his theory, appears to be basically only just constituted by law (normative senses "applied" to conditions of life).
Therefore, understanding the relation between a given man and the other people and things which constitute the aim of his actions, that is, the understanding of good, is enacted against the background of the constitution of senses, a constitution which is the result of a process aiming towards a consistent understanding of particular contents (of normative and non-normative senses). "Being" is secondary to constructed senses; it is only their correlate. The primary relation is the relation of a man to law (a system of norms), while the secondary relation is that of man to something which is the aim of his action (the relation between man and good). On such an approach it is difficult to envision a satisfying answer to the fundamental question: why does law put a concrete man under any obligation to obey it? The source of this problem can be seen in the reduction of the basis for understanding good to the content of obligation formulated in auto-reflection. Such a reduction seems to be a consequence of Kaufmann's adoption of the "convergent concept of truth" and, in consequence, of his recognition of the indirect, essentialistic grasp of reality formulated in concepts as the basic and only foundation of the theory of being and of law. In view of such an approach, the analogy of law, concepts and being is the condition for the possibility of the process of the transformation of senses which aims at a consistent interpretation of all law. Existence is postulated with respect to the possibility of the unity of experience and cognition. However, a different approach to the problem of being and good is also possible. In spontaneous cognition, being is affirmed, first of all, not as a certain non-contradictory, determined content, but as something existing. Together with a certain content (conveyed indirectly through notions), the existence of being is co-given. The basis for the unity of being is not formed by the consistency of content, as in the case of theories departing from the analysis of cognitive processes, but by an act of existence realising content (essence). Such an approach also makes it possible to go beyond the convergent concept of truth. It is worth mentioning that the allocation of an agent to good is realised not only by the content of duty. The statement that something is good is primary with respect to the determination of this good in content. The recognised good always bears some content; however, there are no reasons to base the concept of good exclusively on indirect cognition formulated in concepts. What can be adopted as primary is the relation of man to good, and not of man to law. Determination in content appears to be only an articulation of the aspectual cognition of being as an object of action. In such a case the basis for the relative unity of a norm and conditions of life is not the "nature of things" understood as a correlate of sense, but the relation to good based on the internal constitution of man as a potential, not self-sufficient, being. This does not mean that the moments of the process of the realisation of law singled out by Kaufmann are unimportant for the determination of what is just. He quite rightly points to the significant role played by norms in the evaluation of concrete situations, in man's search for a closer specification in content of the good innate to him. The structure of the process of determining law for a concrete situation corresponds, to a great degree, to processes of determining law which take place not only in the legal sciences.
Kaufmann's analyses of the process of the realisation of law show the complexity of the structure of these processes and point towards important moments allowing a better understanding of law and man. Nevertheless, these analyses cannot be a basis for the construction of a philosophical theory of law, a theory which hopes to point out the ultimate, ontological foundations for understanding law. Kaufmann's results may become fully valid only in a more general perspective that includes broader experience at the point of departure.
A common and enduring early modern intuition is that materialists reduce organisms in general and human beings in particular to automata. Wasn't a famous book of the time entitled L'Homme-Machine? In fact, the machine is employed as an analogy, and there was a specifically materialist form of embodiment, in which the body is not reduced to an inanimate machine, but is conceived as an affective, flesh-and-blood entity. We discuss how mechanist and vitalist models of organism exist in a more complementary relation than hitherto imagined, with conceptions of embodiment resulting from experimental physiology. From La Mettrie to Bernard, mechanism, body and embodiment are constantly overlapping, modifying and overdetermining one another; embodiment came to be scientifically addressed under the successive figures of vie organique and then milieu intérieur, thereby overcoming the often lamented divide between scientific image and living experience.
We propose a modular ontology of the dynamic features of reality. This amounts, on the one hand, to a purely spatial ontology supporting snapshot views of the world at successive instants of time and, on the other hand, to a purely spatiotemporal ontology of change and process. We argue that dynamic spatial ontology must combine these two distinct types of inventory of the entities and relationships in reality, and we provide characterizations of spatiotemporal reasoning in the light of the interconnections between them.
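To make the two kinds of inventory concrete, here is a minimal illustrative sketch in Python; the class and field names are my own, not the authors', and it is meant only to show how a snapshot inventory (entities indexed to an instant) and a spatiotemporal inventory (processes with temporal extent) can be combined in a single query.

```python
from dataclasses import dataclass
from typing import List

# Snapshot inventory: entities as they exist at a single instant (purely spatial view).
@dataclass
class SnapshotEntity:
    name: str
    location: str   # spatial region occupied at the instant
    instant: float  # time at which the snapshot is taken

# Spatiotemporal inventory: processes extended over an interval of time.
@dataclass
class Process:
    name: str
    participants: List[str]  # names of entities taking part in the process
    start: float
    end: float

def participates_at(entity: SnapshotEntity, process: Process) -> bool:
    """Combined query: does this snapshot entity take part in the process
    at the instant of the snapshot? This is where the two inventories meet."""
    return (entity.name in process.participants
            and process.start <= entity.instant <= process.end)

# Usage example
john_at_noon = SnapshotEntity(name="John", location="kitchen", instant=12.0)
cooking = Process(name="cooking", participants=["John"], start=11.5, end=12.5)
print(participates_at(john_at_noon, cooking))  # True
```

The point of the sketch is only that neither inventory alone answers the combined question: the snapshot knows nothing about change, and the process inventory by itself says nothing about what the world looks like at a given instant.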
The Belt and Road Initiative (BRI) is a global infrastructure development project that ambitiously aims to connect Asia with the European and African continents through land and sea corridors. China adopted this gigantic, game-changing master plan in 2013, spurring much speculation among scholars and policymakers worldwide. This article investigates the development of the project through the lens of global political geography and economy. From an international relations perspective, the authors consult the relevant literature and focus on the international issues and events concerning the development of the project, using the concepts of ideas, interests, and institutions within the scope of geopolitics and political economy. The analysis is performed by reviewing critical events and arguments related to the ideas, interests and institutions evolving around the implementation of the BRI. Drawing on the analysis, the authors argue that the rise of China as a dominant global superpower largely depends on the success of the BRI, and that the initiative will continue to generate politics among international actors, multinational entities, and institutions. Despite widespread speculation, the project poses a substantive threat to the USA's global dominance and is likely to create more global development cooperation under Chinese leadership and vision.
The Razor says: do not multiply entities without necessity! The Laser says: do not multiply fundamental entities without necessity! Behind the Laser lies a deep insight: a distinction between the costs and the commitments of a theory. According to the Razor, every commitment is a cost. Not so according to the Laser. According to the Laser, derivative entities are an ontological free lunch: that is, they are a commitment without a cost. Jonathan Schaffer (2015) has argued that the Laser should replace the Razor. In Sections 2-4 we shall discuss and argue against Schaffer's arguments for replacing the Razor with the Laser. Schaffer considers several objections to his views, and in Sections 5-7 we shall argue that Schaffer does not deal successfully with two of them. In Section 8 we shall present a probabilistic argument for the Laser. However, the argument has a limitation and does not support the replacement of the Razor with the Laser. Indeed, it supports only the claim that, given certain assumptions, the multiplication of explanatorily relevant derivative entities does not matter; but, as we argue in the same section, there is an argument that multiplying explanatorily superfluous derivative entities does make a theory less rationally acceptable. Our conclusion is that the Laser cannot replace the Razor and that derivative entities are not an ontological free lunch.
What are we? Despite much discussion in historical and contemporary philosophy, we have not yet settled on an answer. A satisfactory personal ontology, an account of our metaphysical nature, will be informed by issues in the metaphysics of material objects. In the dissertation, I target two prominent materialist ontologies: animalism, the view that we are numerically identical to human organisms, and constitutionalism, the view that we are constituted by, but not identical to, human organisms. Because of the problems that arise from endorsing these ontologies, I instead advance immaterialism, the view that we are essentially immaterial. In Chapter 2, I discuss how animalists must respond to a widely-discussed metaphysical puzzle, the problem of the many. This puzzle prompts some to endorse revisionary ontologies of material objects, and I argue that the animalist cannot appeal to these revisionary ontologies to solve the puzzle as it arises for personal ontology. In addition, solutions that don't involve a commitment to revisionary ontology will be unavailable to the animalist: I argue that if animalists make use of non-revisionary solutions to the problem, they must abandon the most successful argument for their view. Absent their most successful argument, animalists will need to motivate the view in some other way. Some new arguments for animalism have been proposed, and I argue that they fail to give us reason to endorse animalism over competing ontologies. Without a strong argument, we should not prefer animalism over the other, more attractive, views. In Chapter 3, I show how constitutionalists face a different problem: explaining how the person is not the very same thing as the human organism, despite sharing the very same parts and occupying the very same physical space. We think that the person and the organism are different things because they have different modal profiles – the human organism can survive permanent loss of psychological life, but the person, presumably, cannot. Constitutionalists must then explain what grounds the difference in modal profiles, but such an explanation is hard to come by. This is an instance of the grounding problem, which is notoriously intractable. While the grounding problem is a well-known challenge to constitutional accounts of objects, I demonstrate that this puzzle is even more threatening when applied to persons. Some "solutions" to the problem fail to solve it at all, and solutions that might get the right result for ordinary objects require accepting that there are a multitude of persons where we ordinarily take there to be only one. We should not accept a personal ontology that requires a commitment to that multitude. I argue that the threat of the grounding problem is so great that we must reject the constitutionalist personal ontology. We will see from these puzzles in personal ontology that materialist solutions are either unsuccessful or yield unacceptable consequences. This should prompt us toward considering, instead, immaterialism. According to immaterialism, persons are not material objects, and the immaterialist can then provide solutions to the puzzles that threatened materialist ontologies. In Chapter 4, I outline these immaterialist solutions and show that the puzzles cannot be reinstantiated successfully against the immaterialist. I then discuss different available varieties of immaterialism and argue in defense of my preferred version.
Ultimately, I argue that we are simple, immaterial entities that come into existence at the proper functioning of the brain. Endorsing this view of personal ontology permits us to adequately respond to metaphysical puzzles and retain judgments about persons that should be most important to us. In particular, the immaterialist has the resources to avoid the problem of too many thinkers and retain the judgment that there is exactly one person in circumstances where we take there to be just one. The immaterialist also has the resources to plausibly analyze thought experiments, such as cerebrum-swap cases, that threaten materialist ontologies. All things considered, it remains to be seen which personal ontology has the most evidence in its favor. In the context of debates that arise from material object metaphysics, however, evidence weighs in favor of immaterialism. Materialist personal ontologies are saddled with unacceptable responses to metaphysical puzzles, and endorsing materialism about persons requires taking on a very high cost: Either there are far more of us than we ordinarily take there to be, or there are no persons – far fewer of us than we ordinarily take there to be. Some might argue that these are the only acceptable options, so cost be damned. But we cannot afford to be so cavalier about our personal ontology. Instead, I advance immaterialist solutions to puzzles in personal ontology and propose that, in the interest of saving ourselves and everyone we love, we should seriously consider accounts according to which we are immaterial entities.
How is it that two organisms can interact with each other, or that we can comprehend what the other experiences? Theories of embodiment, intersubjectivity or empathy have repeatedly taken as their starting point an individualistic assumption (the comprehension of the other comes after self-comprehension) or a cognitivist one (the affective dimension follows the cognitive process). The thesis of this book is that there are not two isolated entities at the origin which subsequently interact with each other. There is, rather, an impersonal stratum – the original affectivity (Gefühlsdrang) – which lets living organisms be constitutively attuned to the expressive dimension of life from the very beginning. The book aims to rethink the issue of corporality on the basis of a biosemiotics of the interaction between lived body (Leib) and environment (Umwelt). Human emotions reveal themselves as devices that experiment with further levels of attunement and expose the human being – because of her ex-centricity – to alienation in the various forms of psychopathological existence. This is an unprecedented perspective that turns to psychopathology to reread, against the light, the watermark woven into the structure of personal singularity. What emerges is an intermediate territory bordering both on philosophy and on psychiatry: the psychopathology of ordo amoris. The author advances the project of a new psychopathology of ordo amoris, engaging not only with the 19th-century tradition of psychiatry and phenomenological psychopathology, but also with contemporary disputes on intersubjectivity in phenomenology and with those on schizophrenia as a disruption of aida or as a disembodiment process in psychiatry. What results from this project is the first systematic study at the international level to show the relevance of the Schelerian concepts of body schema (Leibschema) and order of feeling (ordo amoris) for understanding disruptions of emotional regulation in the psychopathological dimension as well as in the formation process of singularity.
Turning away from entities and focusing instead exclusively on ‘structural’ aspects of scientific theories has been advocated as a cogent response to objections levelled at realist conceptions of the aim and success of science. Physical theories whose (predictive) past successes are genuine would, in particular, share with their successors structural traits that would ultimately latch on to ‘structural’ features of the natural world. Motives for subscribing to Structural Realism are reviewed and discussed. It is argued that structural retention claims lose their force if one gives up merely historical readings of the transition from Galilean-relativistic classical mechanics to the ‘special’ theory of relativity, heeding instead basic requirements that lead to their common derivation. Further cause for scepticism is found upon realising that the basic mathematical framework of quantum theory essentially reflects its predictive purpose, without any necessary input, be it of a ‘structural’ kind, from the physical world.
It is common to think of essence along modal lines: the essential truths, on this approach, are a subset of the necessary truths. But Aristotle conceives of the necessary truths as being distinct and derivative from the essential truths. Such a non-modal conception of essence also constitutes a central component of the neo-Aristotelian approach to metaphysics defended over the last several decades by Kit Fine. Both Aristotle and Fine rely on a distinction between what belongs to the essence proper of an object and what merely follows from the essence proper of an object. In order for this type of approach to essence and modality to be successful, we must be able to identify an appropriate consequence relation which in fact generates the result that the necessary truths about objects follow from the essential truths. I discuss some proposals put forward by Fine and then turn to Aristotle's account: Aristotle's central idea, to trace the explanatory power of definitions to the causal power of essences, has the potential to open the door to a philosophically satisfying response to the question of why certain things are relevant, while others are irrelevant, to the nature or essence of entities. If at all possible, it would be desirable, for example, to have something further to say by way of explanation in response to such questions as ‘Why is the number 2 completely irrelevant to the nature of camels?’.
The contemporary debate between scientific realism and anti-realism is conditioned by a polarity between two opposing arguments: the realist’s success argument and the anti-realist’s pessimistic induction. This polarity has skewed the debate away from the problem that lies at the source of the debate. From a realist point of view, the historical approach to the philosophy of science which came to the fore in the 1960s gave rise to an unsatisfactory conception of scientific progress. One of the main motivations for the scientific realist appeal to the success of science was the need to provide a substantive account of the progress of science as an increase of knowledge about the same entities as those referred to by earlier theories in the history of science. But the idea that a substantive conception of progress requires continuity of reference has faded from the contemporary debate. In this paper, I revisit the historical movement in the philosophy of science in an attempt to resuscitate the original agenda of the debate about scientific realism. I also briefly outline the way in which the realist should employ the theory of reference as the basis for a robust account of scientific progress which will satisfy realist requirements.
Moral reasoning traditionally distinguishes two types of evil: moral (ME) and natural (NE). The standard view is that ME is the product of human agency and so includes phenomena such as war, torture and psychological cruelty; that NE is the product of nonhuman agency, and so includes natural disasters such as earthquakes, floods, disease and famine; and finally, that more complex cases are appropriately analysed as a combination of ME and NE. Recently, as a result of developments in autonomous agents in cyberspace, a new class of interesting and important examples of hybrid evil has come to light. In this paper, it is called artificial evil (AE) and a case is made for considering it to complement ME and NE to produce a more adequate taxonomy. By isolating the features that have led to the appearance of AE, cyberspace is characterised as a self-contained environment that forms the essential component in any foundation of the emerging field of Computer Ethics (CE). It is argued that this goes some way towards providing a methodological explanation of why cyberspace is central to so many of CE's concerns; and it is shown how notions of good and evil can be formulated in cyberspace. Of considerable interest is how the propensity for an agent's action to be morally good or evil can be determined even in the absence of biologically sentient participants, thus allowing artificial agents not only to perpetrate evil (and, for that matter, good) but conversely to `receive' or `suffer from' it. The thesis defended is that the notion of entropy structure, which encapsulates human value judgement concerning cyberspace in a formal mathematical definition, is sufficient to achieve this purpose and, moreover, that the concept of AE can be determined formally, by mathematical methods. A consequence of this approach is that the debate on whether CE should be considered unique, and hence developed as a Macroethics, may be viewed, constructively, in an alternative manner. The case is made that whilst CE issues are not uncontroversially unique, they are sufficiently novel to render inadequate the approach of standard Macroethics such as Utilitarianism and Deontologism and hence to prompt the search for a robust ethical theory that can deal with them successfully. The name Information Ethics (IE) is proposed for that theory. It is argued that the uniqueness of IE is justified by its being non-biologically biased and patient-oriented: IE is an Environmental Macroethics based on the concept of data entity rather than life. It follows that the novelty of CE issues such as AE can be appreciated properly because IE provides a new perspective (though not vice versa). In light of the discussion provided in this paper, it is concluded that Computer Ethics is worthy of independent study because it requires its own application-specific knowledge and is capable of supporting a methodological foundation, Information Ethics.
This chapter develops the idea that the germ-soma split and the suppression of individual fitness differences within the corporate entity are not always essential steps in the evolution of corporate individuals. It illustrates some consequences for multilevel selection theory. It presents evidence that genetic heterogeneity may not always be a barrier to successful functioning as a higher-level individual. This chapter shows that levels-of-selection theorists are wrong to assume that the central problem in transitions is always that of minimizing within-group competition. Evidence of intralevel conflict does not qualify as evidence against the existence of a higher level of selection.
We introduce the notion of complexity, first at an intuitive level and then in relatively more concrete terms, explaining the various characteristic features of complex systems with examples. There exists a vast literature on complexity, and our exposition is intended to be an elementary introduction, meant for a broad audience.

Briefly, a complex system is one whose description involves a hierarchy of levels, where each level is made of a large number of components interacting among themselves. The time evolution of such a system is of a complex nature, depending on the interactions among subsystems in the next level below the one under consideration and, at the same time, conditioned by the level above, which sets the context for the evolution. Generally speaking, the interactions among the constituents of the various levels lead to a dynamics characterized by numerous characteristic scales, each level having its own set of scales. What is more, a level commonly exhibits ‘emergent properties’ that cannot be derived from considerations relating to its component systems taken in isolation or to those in a different contextual setting. In the dynamic evolution of some particular level, there occurs a self-organized emergence of a higher level, and the process is repeated at still higher levels.

The interaction and self-organization of the components of a complex system follow the principle commonly expressed by saying that the ‘whole is different from the sum of the parts’. In the case of systems whose behavior can be expressed mathematically in terms of differential equations, this means that the interactions are nonlinear in nature.

While all of the above features are not universally exhibited by complex systems, they are nevertheless indicative of a broad commonness relative to which individual systems can be described and analyzed. There exist measures of complexity which, once again, are not of universal applicability, being more heuristic than exact. The present state of knowledge and understanding of complex systems is itself an emerging one. Still, a large number of results on various systems can be related to their complex character, making complexity an immensely fertile concept in the study of natural, biological, and social phenomena.

All this puts a very definite limitation on the complete description of a complex system as a whole, since such a system can be precisely described only contextually, relative to some particular level, where emergent properties rule out an exact description of more than one level within a common framework.

We discuss the implications of these observations in the context of our conception of the so-called noumenal reality that has a mind-independent existence and is perceived by us in the form of the phenomenal reality. The latter is derived from the former by means of our perceptions and interpretations, and our efforts at sorting out and making sense of the bewildering complexity of reality take the form of incessant processes of inference that lead to theories. Strictly speaking, theories apply to models that are constructed as idealized versions of parts of reality, within which inferences and abstractions can be carried out meaningfully, enabling us to construct the theories.

There exists a correspondence between the phenomenal and the noumenal realities in terms of events and their correlations, where these are experienced as the complex behavior of systems or entities of various descriptions.
The infinite diversity of behavior of systems in the phenomenal world is explained within specified contexts by theories. The latter are constructs generated in our ceaseless attempts at interpreting the world, and the question arises as to whether these are reflections of ‘laws of nature’ residing in the noumenal world. This is a fundamental concern of scientific realism, within the fold of which there exists a trend towards the assumption that theories express truths about the noumenal reality. We examine this assumption (referred to as a ‘point of view’ in the present essay) closely and indicate that an alternative point of view is also consistent within the broad framework of scientific realism. This is the view that theories are domain-specific and contextual, and that these are arrived at by independent processes of inference and abstraction in the various domains of experience. Theories in contiguous domains of experience dovetail and interpenetrate with one another, and bear the responsibility of correctly explaining our observations within these domains. -/- With accumulating experience, theories get revised and the network of our theories of the world acquires a complex structure, exhibiting a complex evolution. There exists a tendency within the fold of scientific realism to interpret this complex evolution in rather simple terms, where one assumes (this, again, is a point of view) that theories tend more and more closely to truths about Nature and, what is more, progress towards an all-embracing ‘ultimate theory’ -- a foundational one in respect of all our inquiries into nature. We examine this point of view closely and outline the alternative view -- one broadly consistent with scientific realism -- that there is no ‘ultimate’ law of nature, that theories do not correspond to truths inherent in reality, and that successive revisions in theory do not lead monotonically to some ultimate truth. Instead, the theories generated in succession are incommensurate with each other, testifying to the fact that a theory gives us a perspective view of some part of reality, arrived at contextually. Instead of resembling a monotonically converging series, successive theories are analogous to asymptotic series. -/- Before we summarize all the above considerations, we briefly address the issue of the complexity of the human mind -- one as pervasive as the complexity of Nature at large. The complexity of the mind is related to the complexity of the underlying neuronal organization in the brain, which operates within a larger biological context, its activities being modulated by other physiological systems, notably the one involving a host of chemical messengers. The mind, with no materiality of its own, is nevertheless emergent from the activity of interacting neuronal assemblies in the brain. As in the case of reality at large, there can be no ultimate theory of the mind, from which one can explain and predict the entire spectrum of human behavior, which is an infinitely rich and diverse one. (shrink)
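The hierarchy-of-levels picture sketched in the abstract above can be made concrete with a toy model. The following minimal Python sketch is illustrative only and is not drawn from the paper: it uses a ring of diffusively coupled logistic maps, in which each component obeys a simple nonlinear local rule, while an aggregate "mean field" observable at the level above behaves in a way that cannot be read off from any single component taken in isolation.

    # Illustrative toy model (not from the paper): a ring of coupled logistic maps.
    # Each site follows a simple nonlinear local rule; the lattice-wide mean field
    # is a crude "higher-level" observable emerging from the coupled dynamics.
    import random

    def step(lattice, r=3.9, eps=0.2):
        n = len(lattice)
        f = [r * x * (1.0 - x) for x in lattice]   # local nonlinear update
        # diffusive coupling with the two nearest neighbours on the ring
        return [(1 - eps) * f[i] + eps * 0.5 * (f[(i - 1) % n] + f[(i + 1) % n])
                for i in range(n)]

    lattice = [random.random() for _ in range(100)]
    for _ in range(500):
        lattice = step(lattice)

    mean_field = sum(lattice) / len(lattice)       # observable at the level above
    print(f"mean field after 500 steps: {mean_field:.3f}")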
Moral reasoning traditionally distinguishes two types of evil: moral evil (ME) and natural evil (NE). The standard view is that ME is the product of human agency and so includes phenomena such as war, torture and psychological cruelty; that NE is the product of nonhuman agency, and so includes natural disasters such as earthquakes, floods, disease and famine; and finally, that more complex cases are appropriately analysed as a combination of ME and NE. Recently, as a result of developments in autonomous agents in cyberspace, (...) a new class of interesting and important examples of hybrid evil has come to light. In this paper, it is called artificial evil (AE) and a case is made for considering it to complement ME and NE to produce a more adequate taxonomy. By isolating the features that have led to the appearance of AE, cyberspace is characterised as a self-contained environment that forms the essential component in any foundation of the emerging field of Computer Ethics (CE). It is argued that this goes some way towards providing a methodological explanation of why cyberspace is central to so many of CE’s concerns; and it is shown how notions of good and evil can be formulated in cyberspace. Of considerable interest is how the propensity for an agent’s action to be morally good or evil can be determined even in the absence of biologically sentient participants and thus allows artificial agents not only to perpetrate evil but conversely to ‘receive’ or ‘suffer from’ it. The thesis defended is that the notion of entropy structure, which encapsulates human value judgement concerning cyberspace in a formal mathematical definition, is sufficient to achieve this purpose and, moreover, that the concept of AE can be determined formally, by mathematical methods. A consequence of this approach is that the debate on whether CE should be considered unique, and hence developed as a Macroethics, may be viewed, constructively, in an alternative manner. The case is made that whilst CE issues are not uncontroversially unique, they are sufficiently novel to render inadequate the approach of standard Macroethics such as Utilitarianism and Deontologism and hence to prompt the search for a robust ethical theory that can deal with them successfully. The name Information Ethics (IE) is proposed for that theory. It is argued that the uniqueness of IE is justified by its being non-biologically biased and patient-oriented: IE is an Environmental Macroethics based on the concept of data entity rather than life. It follows that the novelty of CE issues such as AE can be appreciated properly because IE provides a new perspective. In light of the discussion provided in this paper, it is concluded that Computer Ethics is worthy of independent study because it requires its own application-specific knowledge and is capable of supporting a methodological foundation, Information Ethics. (shrink)
A “conceptual spaces” approach is used to formalize Aristotle’s main intuitions about time and change, and other ideas about temporal points of view. That approach has been used in earlier studies about points of view. Properties of entities are represented by locations in multidimensional conceptual spaces; and concepts of entities are identified with subsets or regions of conceptual spaces. The dimensions of the spaces, called “determinables”, are qualities in a very general sense. A temporal element is introduced by (...) adding a time variable to state functions that map entities into conceptual spaces. That way, states may have some permanency or stability around time instants. Following Aristotle’s intuitions, changes and events need not be instantaneous phenomena; instead, they may be processual and interval-dependent. Change is defined relative to the interval during which the change is taking place. Time intervals themselves are taken to represent points of view. To have a point of view is to look at the world as it is in the selected interval. Many important concepts are relativized to intervals, for instance change, events, identity, ontology and potentiality. The definition of points of view as intervals makes it possible to compare points of view with respect to all these concepts. The conceptual space approach has an immediate semantic and structural character, but it is tempting to develop logics to describe such spaces as well. A formal language is introduced to show how this could be done. (shrink)
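To give a concrete flavour of the formal apparatus just summarised, here is a minimal Python sketch under stated assumptions: the names (state, in_region, changed_over) and the toy determinables are hypothetical illustrations, not the paper's own formalism. A time-indexed state function maps an entity to a point in a conceptual space, a concept is a region of that space, and change is judged relative to a time interval.

    # Minimal sketch (names are hypothetical, not the paper's formalism): entities
    # are mapped by a time-indexed state function into a conceptual space whose
    # dimensions ("determinables") are qualities; a concept is a region of that
    # space, and change is defined relative to a time interval.
    from typing import Callable, Dict, Tuple

    Point = Dict[str, float]                  # location on each determinable
    StateFn = Callable[[str, float], Point]   # (entity, time) -> point in the space

    def state(entity: str, t: float) -> Point:
        # toy state function: a cooling cup of coffee described by two determinables
        return {"temperature": 90.0 - 5.0 * t, "volume": 0.25}

    def in_region(p: Point, region: Dict[str, Tuple[float, float]]) -> bool:
        # a concept as a region: the point must fall inside every interval
        return all(lo <= p[d] <= hi for d, (lo, hi) in region.items())

    def changed_over(entity: str, interval: Tuple[float, float],
                     region: Dict[str, Tuple[float, float]], f: StateFn = state) -> bool:
        # interval-relative change: the entity falls under the concept at one
        # endpoint of the interval but not at the other
        t0, t1 = interval
        return in_region(f(entity, t0), region) != in_region(f(entity, t1), region)

    hot = {"temperature": (70.0, 100.0)}
    print(changed_over("coffee", (0.0, 6.0), hot))   # True: over that interval it ceases to be hot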
At the heart of the Stoic theory of modality is a strict commitment to bivalence, even for future contingents. A commitment to both future truth and contingency has often been thought paradoxical. This paper argues that the Stoic retreat from necessity is successful. It maintains that the Stoics recognized three distinct senses of necessity and possibility: logical, metaphysical and providential. Logical necessity consists of truths that are knowable a priori. Metaphysical necessity consists of truths that are knowable a posteriori, a (...) world order according to certain metaphysical principles and natures that god crafts within the constraints of matter. Finally, what is providentially necessary is what occurs according to the chain of fate, but only once it is in process or past. -/- The method of the paper is a close reading of Diogenes Laertius 7.75, adducing broad textual evidence along the way, to show that the Stoic theory of modality embraces Philonian possibility, both that which is capable of being true as a matter of logical consistency, and that which is possible according to the bare fitness of the entity. What differentiates the Stoics from Philo is their additional commitment to possibility as opportunity, resisting the collapse of determinism into necessity. (shrink)
Cosmopolis A Review of Cosmopolitics -/- 2015/3-4 -/- Editorial Dominique de Courcelles & Paul Ghils -/- This issue addresses the general concept of “spirituality” as it appears in various cultural contexts and timeframes, through contrasting ideological views. Without necessarily going back to the artistic and religious remains of primitive men, which unquestionably show pursuits beyond the biophysical dimension and illustrate practices seeking to unveil the hidden significance of life and death, the following papers deal with a number of interpretations covering a (...) wide field extending from belief to theory, from emotions to concepts, from the wisdom of personal experience to the most sophisticated doctrines. Spirit and spirituality are indeed many-faceted notions. They may refer to the intricate world of the interacting spirits which inhabit living beings in animistic traditions, without excluding a “grand force” linking human beings within a dynamic whole on which their very existence relies. They also bear upon the more atomistic, either/or approaches of Western philosophy, which have become embodied in Cartesian dualism against a monotheist background, to the point of freezing the essence of individuals and culminating in the extreme individualism that characterizes our contemporaries. However, this equally refers to the opposite conception of materialism, across times and cultures, from ancient India and Greece (Cārvāka, originally known as Lokāyata, or some Buddhist doctrines for the former, Democritus or Lucretius for the latter) to more contemporary materialistic schools, whether modern or postmodern. -/- The following papers look at the contrasting forms of the philosophy and spirit of the human factor set into a whole, with no artificial disjunction between the psychical and the physical levels, as Wittgenstein put it: “And how can a body have a soul?”. This approach is not unrelated to the notion of the anthropocene examined in a recent issue of Cosmopolis, which provides another comprehensive framework open to a spiritual life emerging from the very environment that generated it. -/- *** The first section of this issue was edited by Dominique de Courcelles, director at the National French Research Centre (CNRS), whom we wish to thank for collecting relevant studies relating to religious and political questions, with a view to focusing on the war of ideas inevitably waged behind images, concepts and perceptions, taking an asymmetrical approach. To the extent that they are mindful of global/local interactions and include representations, opinions and beliefs, such disciplines as philosophy, philology, history and the social sciences can provide useful studies accounting for new practices in geopolitics and a fair diplomacy. -/- In her introduction, Dominique de Courcelles first poses the question of how the religious and political spheres interrelate, with their corresponding religious demands and humanistic values. She then suggests that the right course today may be to break with the philosophy of human rights concerned with the defense of human beings against the hazards of arbitrary politics or the instrumental use of religion, in favour of a fair philosophy of humankind, a new humanism. This would consist in recognizing a common loyalty of all towards one interhuman, not only interstate, community, to protect it from both the autonomy demanded by individuals and the instrumental use of minorities.
-/- Considering the fact of diversity, so important today in terms of both politics and religion, Abdelhai Azarkan looks at the conditions under which tolerance could obtain the double status of right and duty. He revisits two philosophers, John Locke and Voltaire, who thought about it from the historical reality of religious wars. The former made tolerance into a right, basing his analysis on the political-legal level, while the latter saw tolerance as a duty, from an analysis based on ethical-political criteria. -/- Mathieu Guidère examines what he calls semantic denominationalism, a term which implies religious attributes and identities, whatever national loyalties or personal belonging they may have at the same time. Since the early 2000s, this phenomenon has expanded tremendously, compounded by the “war on terror” and the intense media coverage of terrorist actions. Denominational expressions act as formal names for ordinary and high-profile players in the domestic and foreign policies of democratic states. These systems reveal a receding secularization, while the powerful comeback of religious identities signals the failure of nation-states and the weakening of the humanist spirit. -/- Barbara De Poli retraces the history of a contemporary jihadism claiming its Islamic essence and asserting the truth of genuine Quranic principles via the war on infidels, with a view to restoring the Caliphate. After defining the term jihad, she shows that even if this contemporary jihadism is spreading in the Muslim world, it radically departs from Islamic law and the received use of the term jihad, in so far as it is rooted in the early radical thinking of Islamic ideologues in the 20th century, starting with Hassan al-Banna, the founder of the Muslim Brotherhood. This current has been fueled by international conflicts since the outbreak of the war in Afghanistan, in which the so-called Western countries bear a major responsibility. -/- Abderrazak Sayadi starts from the Tunisian experience to ask the question of humanist values and democracy within the relationships between the religious and political spheres. As a historian of religion, he is led to demystify certain Islamic principles and to pay attention to the reform of law, seeing the separation of religion and politics as a precondition to a successful democratic gamble and the establishment of a renewed humanism. -/- Dominique de Courcelles reminds us that a better knowledge of narrations and words makes it possible to better understand how logical and rhetorical thinking works for those who wage an asymmetrical war, re-enchanting and mystifying the world to better take control. As early as 1932, an exchange of letters between Einstein and Freud made it clear that, in order to free man from fatality and war, education understood as culture was fundamental. Such illustrations as the execution of Osama bin Laden and the Caliph’s speech in Mosul show that a preliminary analysis of images and words is essential to a fair diplomacy conducted by people from civil society, whose culture and wisdom allow justice and force to speak together and better resist war. -/- Marcel Boisard thinks that on the day the guns fall silent, exhausted by war, we will not return to the state borders that have prevailed for a century as an outcome of the Sykes-Picot agreement of 1916. It is time to prepare the “day after”, which will be a huge challenge.
To this end, a summit of Middle-East nations is urgently needed to decide globally the fate of those peoples, on the condition that we know who the enemy is and accept to name it, that we understand the history of the countries, groups and alliances, and that we question any false or self-interested sense of certainty. -/- *** In the second section, Paul Schafer draws on the author’s experiences to explain how culture, from the artistic to the biological, has the power to open the doors to spirituality, from the inner self to the global environment. He asks whether a relative permanence of spirituality can arise from the specific moments that characterize it. Laurent Ledoux synthesizes the conclusions of a symposium held on 22 January 2015 on the links between philosophy and management, on the basis of the spiritual dimension conceived as “natural” and the answers it may suggest to the issues that face organizations in a “contemporaniversal” world. Jacques Rifflet addresses that question from a secular perspective, based on the wellsprings of personal commitment before it can be caught by any religious creed or scientific theory. In this sense spirituality, in alliance with reason, both inspires human consciousness and illuminates its destiny. Sami Aldeeb asks whether Islam can be reconciled with human rights. Caught between the belief in an absolute and final Word descended from the sky, and evidence showing that any religion is the creation of a given culture and a society situated in time and space, the Meccan and Medinan contexts call for differentiated, if not opposite, answers and exegeses. Bernard Carmona provides the outline of a dialogical framework, which is known to be a feature of debates between the various philosophical schools of classical India, exemplified here by the transdisciplinary perspective of debates within a Buddhist context. *** The articles not focused on the previous topic include a study by Landry Signé on China’s strategy, competing with the United States to control African resources. The author deals with the specific case of China’s rapprochement with South Sudan since the partition of Sudan. In the last paper, Goran Fejić and Rada Iveković return to the essential role that women should play, and comment upon the role of some international legal instruments related in particular to the elimination of all forms of discrimination. The perspective is transnational and transethnic and is based on secular criteria, as regards nation-building and more generally society-building. Considering the persistence of widespread violence, whether in times of armed conflict or in times of peace, the question remains whether it is possible to fully implement rights and justice instruments. (shrink)
Psillos, Kitcher, and Leplin have defended convergent scientific realism against the pessimistic meta-induction by arguing for the divide et impera (DEI) strategy. I argue that DEI faces a problem more serious than the pessimistic meta-induction: the problem of accretion. When empirically successful theories and principles are combined, they may no longer make successful predictions or allow for accurate calculations, or the combination may otherwise be an empirical failure. The shift from classical mechanics to the new quantum theory does not reflect the (...) discarding of “idle wheels.” Instead, scientists had to contend with new principles that made classical calculations difficult or impossible, and with new results that were inconsistent with classical theorems and that suggested a new way of conceiving of atomic dynamics. In this shift, reference to atoms and to electrons was preserved, but the underlying causal explanations and descriptions of atoms and electrons changed. I propose that the emphasis on accurate description of causal agents as a virtue of background theory be replaced with Ruetsche’s advocacy of pragmatic, modal resourcefulness. (shrink)
According to contemporary ‘process ontology’, organisms are best conceptualised as spatio-temporally extended entities whose mereological composition is fundamentally contingent and whose essence consists in changeability. In contrast to the Aristotelian precepts of classical ‘substance ontology’, from the four-dimensional perspective of this framework, the identity of an organism is grounded not in certain collections of privileged properties, or features which it could not fail to possess, but in the succession of diachronic relations by which it persists, or ‘perdures’ as one (...) entity over time. In this paper, I offer a novel defence of substance ontology by arguing that the coherence and plausibility of the radical reconceptualisation of organisms proffered by process ontology ultimately depend upon its making use of the ‘substantial’ principles it purports to replace. (shrink)
In this paper, I shall consider the challenge that Quine posed in 1947 to the advocates of quantified modal logic: to provide an explanation, or interpretation, of modal notions that is intuitively clear, allows “quantifying in”, and does not presuppose mysterious intensional entities. The modal concepts that Quine and his contemporaries, e.g. Carnap and Ruth Barcan Marcus, were primarily concerned with in the 1940s were the notions of (broadly) logical, or analytical, necessity and possibility, rather than the metaphysical modalities (...) that have since become popular, largely due to the influence of Kripke. In the 1950s modal logicians responded to Quine’s challenge by providing quantified modal logic with model-theoretic semantics of various types. In doing so, they also, explicitly or implicitly, addressed Quine’s interpretation problem. Here I shall consider the approaches developed by Carnap in the late 1940s, and by Kanger, Hintikka, Montague, and Kripke in the 1950s, and discuss to what extent these approaches were successful in meeting Quine’s doubts about the intelligibility of quantified modal logic. (shrink)
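For orientation, the model-theoretic turn mentioned above is standardly summarised by the possible-worlds truth clause for necessity; the formulation below is a generic textbook rendering, not a reconstruction of any one of the specific proposals surveyed in the paper.

\[
\mathcal{M}, w, g \models \Box\varphi
\quad\Longleftrightarrow\quad
\text{for every } w' \in W \text{ with } wRw',\ \ \mathcal{M}, w', g \models \varphi ,
\]

where $\mathcal{M} = \langle W, R, D, I \rangle$ is a model with a set of worlds $W$, an accessibility relation $R$, a domain (or domain function) $D$, and an interpretation $I$, and $g$ is an assignment to the variables. “Quantifying in” is exemplified by formulas such as $\exists x\, \Box Fx$, whose evaluation requires the assignment $g$ to keep track of one and the same individual across the worlds accessible from $w$; the interpretive question Quine pressed is what such formal machinery actually explains.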
In this paper I want to show that the idea supporters of traditional creationism (TC) defend, that the success of a fictional character across different works has to be accounted for in terms of the persistence of (numerically) one and the same fictional entity, is incorrect. For the supposedly commonsensical data on which those supporters claim their ideas rely are rather controversial. Once they are properly interpreted, they can rather be accommodated by moderate creationism (MC), according to which fictional characters arise (...) out of a reflexive stance on a certain make-believe process. For MC, the success of a fictional character across different works amounts to the fact that, first, different work-bound ficta are related to each other by means of a relation weaker than numerical identity, transfictional sameness, and second, that all those ficta are related by transfictional inclusion to a fictum that in some sense gathers them all, the so-called general character. Since a general character is an abstract constructed entity, moreover, the more particular ficta are generated, the more general fictional characters including all of them arise. (shrink)
A collective understanding that traces the debate between ‘what is science?’ and ‘what is a science about?’ leads back to the notion of scientific knowledge. The debate concerns a pursuit of science that hardly countenances the dogma of pseudo-science. Scientific conjectures invoke science as an intellectual activity informed by experience and by the repetition of objects that look independent of any idealist view (belief in a consensus of mind-dependent reality). Its realistic machinery is employed in an empiricist exposition of the objective (...) phenomenon by synchronizing a general method to make observational predictions that cover all the phenomena of the particular entity without any exception. The formation of science encloses several epistemological purviews and a succession of conjectures and refutations that a newer theorem could reinstate. My attempt is to advocate a holistic plea for scientific conjectures that outruns the restricted regulation of experience or testable hypotheses in rendering the validity of a chain of logical reasoning (deductive or inductive) over basic scientific statements. The milieu of scientific intensification integrates speculation that lends efficiency to a new experimental dimension where reality is not in itself objective or observer-relative; in fact, the observed phenomenon unfolds in the constructive progression of the preferred methods of falsifiability and uncertainty. (shrink)
The questions concerning “who we are”, “where we go to”, and “where we come from” have preoccupied humanity from time immemorial. During the last few decades, with the accelerated improvement of investigation methods and of advanced interventions capable of saving lives, attempts have been reported to correlate psychic phenomena with body status through the recovery, analysis and explanation of symptoms recorded during near-death experiences. Such special situations, in which the heart and (...) the brain, the support of mental activities, cease their activity, have become a fundamental tool for investigating the phenomena associated with consciousness during the arrest of the fundamental processes of life. The fundamental question is whether consciousness really continues to exist even when the body has ceased to function, the heartbeat and brain support activities having stopped [1,2]. An answer to such a question leads to further questions: can there be “life” beyond death [3]? This exciting question includes several aspects: what is consciousness and what is its nature [4]? Could consciousness exist as a disembodied entity? To answer these fundamental questions of existence, the collaboration of several disciplines is necessary, such as neurology, psychology, medicine, biology, pharmacology and also physics, to name only the most important of them, with geriatrics finding, within such multidisciplinary research, suitable responses to its various questions related to the prolongation of life and the improvement of quality of life. Within such a context, an informational model of consciousness was recently developed, which can offer responses to the above questions, based on the latest discoveries of quantum physics and cosmology [4,5]. (shrink)
Although successive generations of digital technology have become increasingly powerful in the past 20 years, digital democracy has yet to realize its potential for deliberative transformation. The undemocratic exploitation of massive social media systems continued this trend, but it only worsened an existing problem of modern democracies, which were already struggling to develop deliberative infrastructure independent of digital technologies. There have been many creative conceptions of civic tech, but implementation has lagged behind innovation. This article argues for implementing one (...) such vision of digital democracy through the establishment of a public corporation. Modeled on the Corporation for Public Broadcasting in the United States, this entity would foster the creation of new digital technology by providing a stable source of funding to nonprofit technologists, interest groups, civic organizations, government, researchers, private companies, and the public. Funded entities would produce and maintain software infrastructure for public benefit. The concluding sections identify what circumstances might create and sustain such an entity. (shrink)
Standard theories of rational decision making and rational preference embrace the idea that there is something special about the present. Standard decision theory, for example, demands that agents privilege the perspective of the present (i.e., the time of decision) in evaluating what to do. Most philosophers believe that a similar focus on the present is justified when forming preferences, at least in the sense that rationality requires or permits future experiences to be given more weight than past ones. In this (...) dissertation, I examine such theories in light of the expected success of the agents who follow them. In Chapters 2 and 3, I show that this bias toward the present is a liability: it tends to make agents less successful than they might otherwise be. I also show how these problems can be avoided: In the case of rational decision making, we must privilege the beginning rather than the present (what I call “inceptive maximization”). In the case of rational preferences, we must be completely temporally neutral. -/- In Chapters 4 and 5, I introduce a larger framework in which to interpret these results. My core thesis is that practical rationality is a form of conditional reliability. Practically rational decisions, preferences, intentions, or other relevant factors reliably produce whatever we take to be of value, conditional on an agent’s beliefs. This focus on value-conduciveness is thus the analog of the focus on truth-conduciveness in reliability theories of epistemic norms. Like reliabilism in epistemology, I show that practical reliabilism is supported by a methodologically naturalistic approach to normativity. In this way and others, I argue that epistemic and practical reliabilism interconnect to create an overarching theory of normativity. (shrink)
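As a purely generic illustration of the kind of weighting schemes at issue in the abstract above (the function and values below are hypothetical and are not the dissertation's own models or results), one can contrast a temporally neutral evaluation of a stream of experiences with one that discounts the past relative to the future.

    # Generic illustration (not the dissertation's model): evaluating a stream of
    # experiences under two weighting schemes. A temporally neutral agent weights
    # past and future experiences equally; a future-biased agent discounts the past.
    def evaluate(experiences, now, past_weight=1.0, future_weight=1.0):
        """experiences: list of (time, value) pairs; 'now' splits past from future."""
        return sum((past_weight if t < now else future_weight) * v
                   for t, v in experiences)

    stream = [(0, 5.0), (1, 2.0), (2, 4.0), (3, 1.0)]   # toy values at times 0..3

    neutral = evaluate(stream, now=2)                         # weights everything equally
    future_biased = evaluate(stream, now=2, past_weight=0.0)  # ignores past experiences
    print(neutral, future_biased)                             # 12.0 vs 5.0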
It is one of the premises of eliminative materialism that commonsense psychology constitutes a theory. There is agreement that mental states can be construed as posited entities for the explanation and prediction of behavior. Disputes arise when it comes to the range of the commonsense theory of mental states. In chapter one, I review major arguments concerning the span and nature of folk psychology. In chapter two, relying on arguments by Quine and Sellars, I argue that the precise scope (...) of commonsense psychology cannot be determined because there are no resources to distinguish claims that are commonsense from all others. I use this conclusion to evaluate Churchland’s proposal that folk psychology should be eliminated in favor of a scientific theory. I argue that, although such an elimination is possible, it is unnecessary because commonsense psychology is in part informed by scientific theories. The properties that are usually attributed to mental states, on my view, are not common sense and would re-emerge even if we replaced our current theory with a scientific one. In chapter three, I examine how this affects eliminativist arguments, such as Churchland’s proposals for how to account for the emergence of the phenomenal character of sensations. I argue that it might be the case that some phenomenal properties are the result of endorsing a particular theory, but phenomenal character as such is a permanent feature of any theory about internal states. Addressing the problem of the incorrigibility of mental states, in chapter four, I challenge Rorty’s idea that such a property is the mark of the mental and can be attributed to our mental states based on our everyday usage of mental terms. The position asserted in the dissertation is compatible with the view that any theory can be revised, but doubts are expressed concerning the likelihood of a complete replacement of the current folk-psychological theory. Taking inspiration from Sellars, in chapter five, I argue that the establishment of a conceptual framework entails a holistic jump from no concepts to a rudimentary framework. With this leap some properties are solidified and stand in the way of elimination. (shrink)
Three influential forms of realism are distinguished and interrelated: realism about the external world, construed as a metaphysical doctrine; scientific realism about non-observable entities postulated in science; and semantic realism as defined by Dummett. Metaphysical realism about everyday physical objects is contrasted with idealism and phenomenalism, and several potent arguments against these latter views are reviewed. -/- Three forms of scientific realism are then distinguished: (i) scientific theories and their existence postulates should be taken literally; (ii) the existence of (...) unobservable entities posited by our most successful scientific theories is justified scientifically; and (iii) our best current scientific theories are at least approximately true. It is argued that only some form of scientific realism can make proper sense of certain episodes in the history of science. -/- Finally, Dummett’s influential formulation of semantic issues about realism is considered. Dummett argued that in some cases, the fundamental issue is not about the existence of entities, but rather about whether statements of some specified class (such as mathematics) have an objective truth value, independently of our means of knowing it. Dummett famously argued against such semantic realism and in favor of anti-realism. The relation of semantic realism to the metaphysical construal of realism is examined, along with Dummett’s main argument against semantic realism. (shrink)
Important decisions that impact human lives, livelihoods, and the natural environment are increasingly being automated. Delegating tasks to so-called automated decision-making systems (ADMS) can improve efficiency and enable new solutions. However, these benefits are coupled with ethical challenges. For example, ADMS may produce discriminatory outcomes, violate individual privacy, and undermine human self-determination. New governance mechanisms are thus needed that help organisations design and deploy ADMS in ways that are ethical, while enabling society to reap the full economic and social benefits of (...) automation. In this article, we consider the feasibility and efficacy of ethics-based auditing (EBA) as a governance mechanism that allows organisations to validate claims made about their ADMS. Building on previous work, we define EBA as a structured process whereby an entity’s present or past behaviour is assessed for consistency with relevant principles or norms. We then offer three contributions to the existing literature. First, we provide a theoretical explanation of how EBA can contribute to good governance by promoting procedural regularity and transparency. Second, we propose seven criteria for how to design and implement EBA procedures successfully. Third, we identify and discuss the conceptual, technical, social, economic, organisational, and institutional constraints associated with EBA. We conclude that EBA should be considered an integral component of multifaceted approaches to managing the ethical risks posed by ADMS. (shrink)
Despite the recent advances in information and communication technology that have increased our ability to store and circulate information, the task of ensuring that the right sorts of information get to the right sorts of people remains. We argue that the many efforts underway to develop efficient means for sharing information across healthcare systems and organizations would benefit from a careful analysis of human action in healthcare organizations. This in turn requires that the management of information and knowledge within healthcare (...) organizations be combined with models of resources and processes of patient care that are based on a general ontology of social interaction. Health Level 7 (HL7) is one of several ANSI-accredited Standards Developing Organizations operating in the healthcare arena. HL7 has advanced a widely used messaging standard that enables healthcare applications to exchange clinical and administrative data in digital form. HL7 focuses on the interface requirements of the entire healthcare system and not exclusively on the requirements of one area of healthcare such as pharmacy, medical devices, imaging or insurance transactions. This has inspired the development of a powerful abstract model of patient care called the Reference Information Model (RIM). The present paper begins with an overview of the core classes of the HL7 (Version 3) RIM and a brief discussion of its “act-centered” view of healthcare. Central to this account is what is called the life cycle of events. A clinical action may progress from defined, through planned and ordered, to executed. These modalities of an action are represented as the mood of the act. We then outline the basis of an ontology of organizations, starting from the theory of speech acts, and apply this ontology to the HL7 RIM. Special attention is given to the sorts of preconditions that must be satisfied for the successful performance of a speech act and to the sorts of entities to which speech acts give rise (e.g. obligations, claims, commitments, etc.). Finally, we draw conclusions for the efficient communication and management of medical information and knowledge within and between healthcare organizations, paying special attention to the role that medical documents play in such organizations. (shrink)
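A minimal sketch of the "defined, planned, ordered, executed" life cycle just described, assuming the standard HL7 v3 moodCode values DEF (definition), INT (intent), RQO (request/order) and EVN (event); the class and method names below are illustrative only and are not part of any HL7 API.

    # Minimal sketch of the act life cycle described above. DEF, INT, RQO and EVN
    # are standard HL7 v3 RIM moodCode values; the Act class and progress() method
    # here are illustrative only, not part of any HL7 specification or API.
    from dataclasses import dataclass

    MOOD_ORDER = ["DEF", "INT", "RQO", "EVN"]   # defined -> planned -> ordered -> executed

    @dataclass
    class Act:
        code: str          # what kind of clinical action, e.g. "blood-glucose measurement"
        mood: str = "DEF"  # current modality of the act

        def progress(self) -> "Act":
            """Advance the act one step along its life cycle, if possible."""
            i = MOOD_ORDER.index(self.mood)
            if i + 1 < len(MOOD_ORDER):
                self.mood = MOOD_ORDER[i + 1]
            return self

    act = Act(code="blood-glucose measurement")
    act.progress().progress().progress()
    print(act.mood)   # EVN: the same act, now in "executed" mood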
The purpose of this study is to develop a system of corporate ethics based on an understanding and interpretation of the ethical demand of human beings who are in relation with each other, according to Emmanuel Levinas’ teachings, and of the responsibility the human being has to and for herself and the others whom she encounters, based on Paul Ricoeur’s teachings on human action, text and hermeneutics. While the philosophies to which we will be referring may not overtly present a normative ethic, (...) we shall convey them in a way that is reasonably germane to the development of our system of corporate ethics, one that would, indeed, demonstrate why (and, perhaps, how in some instances) the human being must act in response to the demand of the other whom she encounters. -/- We must not abandon the discussion as it relates to the economy but rather include it in a broader and more comprehensive dialogue about working to promote, support and protect the human dignity of all people, using the advancements of technology to improve the human condition. The broader discussion must include, but not be limited to, developing a working definition of corporate ethics. To engage in discussions about the economy as an indicator of human success, we must speak openly and honestly about all stakeholders that participate in and are affected by the economy. Without forsaking other stakeholders, we find the corporation - the entity whose actions have the farthest-reaching and, in some cases, longest-lasting impact on the human condition - to be an appropriate point of departure for this endeavor. (shrink)
I explore the process of changes in the observability of entities and objects in science and how such changes impact two key issues in the scientific realism debate: the claim that predictively successful elements of past science are retained in current scientific theories, and the inductive defense of a specific version of inference to the best explanation with respect to unobservables. I provide a case-study of the discovery of radium by Marie Curie in order to show that the observability (...) of some entities can change and that such changes are relevant for arguments seeking to establish the reliability of success-to-truth inferences with respect to unobservables. (shrink)
The goal of the OBO (Open Biomedical Ontologies) Foundry initiative is to create and maintain an evolving collection of non-overlapping interoperable ontologies that will offer unambiguous representations of the types of entities in biological and biomedical reality. These ontologies are designed to serve non-redundant annotation of data and scientific text. To achieve these ends, the Foundry imposes strict requirements upon the ontologies eligible for inclusion. While these requirements are not met by most existing biomedical terminologies, the latter may nonetheless (...) support the Foundry’s goal of consistent and non-redundant annotation if appropriate mappings of data annotated with their aid can be achieved. To construct such mappings in reliable fashion, however, it is necessary to analyze terminological resources from an ontologically realistic perspective in such a way as to identify the exact import of the ‘concepts’ and associated terms which they contain. We propose a framework for such analysis that is designed to maximize the degree to which legacy terminologies and the data coded with their aid can be successfully used for information-driven clinical and translational research. (shrink)
This article examines some aspects of the natural philosophy of Juan Gallego de la Serna, royal physician to the Spanish kings Philip III and Philip IV. In his account of animal generation, Gallego criticizes widely accepted views: (1) the view that animal seeds are animated, and (2) the alternative view that animal seeds, even if not animated, possess active potencies sufficient for the development of animal souls. According to his view, animal seeds are purely material beings. This, of course, raises (...) the question of how living beings can arise from inanimate matter. Gallego is aware that two other thinkers who understood animal seeds as purely material beings, Duns Scotus and the Louvain-based physician Thomas Feyens, did not solve this problem. Gallego’s solution makes use of the notion of incomplete entities developed by the Spanish Jesuit Francisco Suarez. While Suarez applies this notion to soul and body in order to explain why souls have a natural tendency towards organic bodies and organic bodies have a natural tendency towards souls, Gallego applies this notion to the natural tendency of animal seeds towards each other and towards further substances in their respective environment. In his view, this natural tendency of animal seeds to incorporate further substances explains the origin of material structures complex enough to constitute an animal soul. (shrink)
In Plato's Euthydemus, Socrates claims that the possession of epistēmē suffices for practical success. Several recent treatments suggest that we may make sense of this claim and render it plausible by drawing a distinction between so-called “outcome-success” and “internal-success” and supposing that epistēmē only guarantees internal-success. In this paper, I raise several objections to such treatments and suggest that the relevant cognitive state should be construed along less than purely intellectual lines: as a cognitive state constituted at least in part (...) by ability. I argue that we may better explain Socrates' claims that epistēmē suffices for successful action by attending to the nature of abilities, what it is that we attempt to do when acting, and what successful action amounts to in the relevant contexts. These considerations suggest that, contrary to several recent treatments, the success in question is not always internal-success. (shrink)
This chapter discusses speakers’ conceptions of reported entities as evident in reporting practices. Pragmatic analyses will be offered to explain the diversity of permissible reporting practices. Several candidate theses on speakers’ conceptions of reported entities will be introduced. The possibility that there can be a unified analysis of direct and indirect reporting practices will be considered. Barriers to this unification will be discussed with an emphasis on the cognitive abilities speakers use in discerning the entities referred (...) to in reporting contexts. (shrink)
From the studies conducted, it may be seen that, as of 2018, the driving force behind the sharing economy in Bosnia and Herzegovina is not the small entities that come together to use their spare capacity and gain some economic benefit from others. In the past several years, a set of legal reforms has been established for aspects of labour, taxes, and consumer protection in a collaborative economy. Recognising the potential, the Council of Ministers in Bosnia and Herzegovina also wants to (...) introduce sustainable production processes for converting the biomass of harvested plants into useful wood and paper products. One of the biggest challenges is to develop a successful and reliable circular economy model. (shrink)
The Copernican revolution displaced us from the center of the universe. The Darwinian revolution displaced us from the center of the biological kingdom. And the Freudian revolution displaced us from the center of our mental lives. Today, Computer Science and digital ICTs are causing a fourth revolution, radically changing once again our conception of who we are and our “exceptional centrality.” We are not at the center of the infosphere. We are not standalone entities, but rather interconnected informational agents, (...) sharing with other biological agents and smart artifacts a global environment ultimately made of information. Having changed our views about ourselves and our world, are ICTs going to enable and empower us, or constrain us? This paper argues that the answer lies in an ecological and ethical approach to natural and artificial realities. It posits that we must put the “e” in an environmentalism that can deal successfully with the new issues caused by the fourth revolution. (shrink)
According to Hartry Field, the mathematical Platonist is hostage to a dilemma. Faced with the request to explain the mathematicians’ reliability, one option could be to maintain that the mathematicians are reliably responsive to a realm populated with mathematical entities; alternatively, one might try to contend that the mathematical realm conceptually depends on, and for this reason is reliably reflected by, the mathematicians’ (best) opinions; however, both alternatives are actually unavailable to the Platonist: the first one because it is (...) in tension with the idea that mathematical entities are causally ineffective, the second one because it is in tension with the suggestion that mathematical entities are mind-independent. John Divers and Alexander Miller have tried to reject the conclusion of this argument—according to which Platonism is inconsistent with a satisfactory epistemology for arithmetic—by redescribing the second horn of the dilemma in light of Crispin Wright’s notion of judgment-dependent truth; in particular, they have contended that once arithmetical truth is conceived in this way, the Platonist can have a substantial epistemology which does not conflict with the idea that the mathematical entities exist mind-independently. In this paper I analyze Wright’s notion of judgment-dependent truth, and reject Divers and Miller’s argument for the conclusion that arithmetical truth can be so characterized. In the final part, I address the worry that my argument generalizes very quickly to the conclusion that no area of discourse could be characterized as judgment-dependent. As against this conclusion, I indicate under what conditions—notably not satisfied in Divers and Miller’s case, but possibly satisfied in others—a discourse’s judgment-dependency can be successfully vindicated. (shrink)
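For orientation, Wright-style judgment-dependence is standardly captured by a "provisional equation" of roughly the following schematic shape (a generic rendering of the notion, not the paper's own formulation), in which the specification of the ideal conditions C is required to be substantial and knowable a priori rather than a trivial "whatever-it-takes" clause:

\[
C \;\rightarrow\; \bigl(\, x \text{ is } \Phi \;\leftrightarrow\; S \text{ judges that } x \text{ is } \Phi \,\bigr)
\]

The dispute then concerns whether arithmetical truth can satisfy such constraints while the relevant facts remain, as the Platonist insists, mind-independent.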
This essay provides a condensed introductory ‘snapshot’ of just a few of the many and profound correlations existing between early (pre-Abhidhamma) Pāḷi Buddhism and Transcendental Phenomenology, by focusing on what is arguably the most central and essential ‘philosophical problem’ in both traditions: the true nature and significance of the ‘I’ of subjective intentional consciousness. It argues that the Buddhist axiom of ‘not-self’ (anattā) is by no means incompatible with the fundamental phenomenological irreducibility, and necessity, of transcendental subjectivity – or, as (...) Husserl also puts it, of the ‘pure’ or ‘transcendental ‘I’’ – a structure evidently essential to intentional consciousness as ‘consciousness-of’. On the one hand, Husserl recognises (and struggles with) the peculiar ‘emptiness’ of the ‘pure ‘I’’. On the other hand, a fundamental distinction must clearly be drawn between genuine intentional subjectivity – which even Buddhas and Arahants must of necessity possess – and the erroneous bases upon which the concept of ‘self’ (attā) that Buddhism rejects is constituted: the feeling of ‘I am’ (‘asmī’ti), the sense of ‘I am this’ (‘ayam-aham-asmī’ti), and the concept/conceit of ‘I am’ (asmi-māna) – all of which Buddhas and Arahants by definition do not possess. Hence, it is argued that, while the ‘pure I’ does not refer to some permanent ‘entity’ called ‘self’, nor is it merely an empty, non-referring, conventional linguistic marker: it has not merely a ‘use’, but a genuine meaning, which derives from the intrinsic, irreducible, and ‘pre-linguistic’ experiential structure of ‘consciousness-of’ itself. What is more, this meaning is not only recognised and admitted, but actively utilised, within the doctrine and methodology of early Buddhism, without any sense of contradicting the axiom of anattā. (shrink)
Two radically different views about time are possible. According to the first, the universe is three dimensional. It has a past and a future, but that does not mean it is spread out in time as it is spread out in the three dimensions of space. This view requires that there is an unambiguous, absolute, cosmic-wide "now" at each instant. According to the second view about time, the universe is four dimensional. It is spread out in both space and time (...) - in space-time in short. Special and general relativity rule out the first view. There is, according to relativity theory, no such thing as an unambiguous, absolute cosmic-wide "now" at each instant. However, we have every reason to hold that both special and general relativity are false. Not only does the historical record tell us that physics advances from one false theory to another. Furthermore, elsewhere I have shown that we must interpret physics as having established physicalism - in so far as physics can ever establish anything theoretical. Physicalism, here, is to be interpreted as the thesis that the universe is such that some unified "theory of everything" is true. Granted physicalism, it follows immediately that any physical theory that is about a restricted range of phenomena only, cannot be true, whatever its empirical success may be. It follows that both special and general relativity are false. This does not mean of course that the implication of these two theories that there is no unambiguous cosmic-wide "now" at each instant is false. It still may be the case that the first view of time, indicated at the outset, is false. Are there grounds for holding that an unambiguous cosmic-wide "now" does exist, despite special and general relativity, both of which imply that it does not exist? There are such grounds. Elsewhere I have argued that, in order to solve the quantum wave/particle problem and make sense of the quantum domain we need to interpret quantum theory as a fundamentally probabilistic theory, a theory which specifies how quantum entities - electrons, photons, atoms - interact with one another probabilistically. It is conceivable that this is correct, and the ultimate laws of the universe are probabilistic in character. If so, probabilistic transitions could define unambiguous, absolute cosmic-wide "nows" at each instant. It is entirely unsurprising that special and general relativity have nothing to say about the matter. Both theories are pre-quantum mechanical, classical theories, and general relativity in particular is deterministic. The universe may indeed be three dimensional, with a past and a future, but not spread out in four dimensional space-time, despite the fact that relativity theories appear to rule this out. These considerations, finally, have implications for views about the arrow of time and free will. (shrink)