If the world itself is metaphysically indeterminate in a specified respect, what follows? In this paper, we develop a theory of metaphysical indeterminacy answering this question.
The topic of a priori knowledge is approached through the theory of evidence. A shortcoming in traditional formulations of moderate rationalism and moderate empiricism is that they fail to explain why rational intuition and phenomenal experience count as basic sources of evidence. This explanatory gap is filled by modal reliabilism -- the theory that there is a qualified modal tie between basic sources of evidence and the truth. This tie to the truth is then explained by the theory of concept possession: this tie is a consequence of what, by definition, it is to possess (i.e., to understand) one’s concepts. A corollary of the overall account is that the a priori disciplines (logic, mathematics, philosophy) can be largely autonomous from the empirical sciences.
We develop a theory of necessity operators within a version of higher-order logic that is neutral about how fine-grained reality is. The theory is axiomatized in terms of the primitive of *being a necessity*, and we show how the central notions in the philosophy of modality can be recovered from it. Various questions are formulated and settled within the framework, including questions about the ordering of necessities under strength, the existence of broadest necessities satisfying various logical conditions, and questions about their logical behaviour. We also wield the framework to probe the conditions under which a logicist account of necessities is possible, in which the theory is completely reducible to logic.
A series of papers on different aspects of practical knowledge by Roderick Chisholm, Rudolf Haller, J. C. Nyiri, Eva Picardi, Joachim Schulte, Roger Scruton, Barry Smith and Johan Wrede.
Ultimately this book provides a theory of intergenerational justice that is both intellectually robust and practical, with wide applicability to law and policy.
The paper is a contribution to formal ontology. It seeks to use topological means in order to derive ontological laws pertaining to the boundaries and interiors of wholes, to relations of contact and connectedness, to the concepts of surface, point, neighbourhood, and so on. The basis of the theory is mereology, the formal theory of part and whole, a theory which is shown to have a number of advantages, for ontological purposes, over standard treatments of topology in set-theoretic terms. One central goal of the paper is to provide a rigorous formulation of Brentano's thesis to the effect that a boundary can exist as a matter of necessity only as part of a whole of higher dimension which it is the boundary of. It concludes with a brief survey of current applications of mereotopology in areas such as natural-language analysis, geographic information systems, machine vision, naive physics, and database and knowledge engineering.
We aim here to outline a theory of evidence for use. More specifically we lay foundations for a guide for the use of evidence in predicting policy effectiveness in situ, a more comprehensive guide than current standard offerings, such as the Maryland rules in criminology, the weight of evidence scheme of the International Agency for Research on Cancer (IARC), or the US ‘What Works Clearinghouse’. The guide itself is meant to be well-grounded but at the same time to give practicable advice, that is, advice that can be used by policy-makers not expert in the natural and social sciences, assuming they are well-intentioned and have a reasonable but limited amount of time and resources available for searching out evidence and deliberating.
In this paper, I first present an overview of Asay’s _A Theory of Truthmaking_, highlighting what I take to be some of its most attractive features, especially his re-invigoration of the ontological understanding of truthmaking and his defence of ontology-first truthmaking over explanation-first truthmaking. Then, I articulate what I take to be a puzzling potential inconsistency: (a) he appeals to considerations to do with aboutness in criticising how well ontological views account for truth while (b) ruling out aboutness from the right account of truthmaking. He argues, instead, that necessitation is both necessary and sufficient for truthmaking (§3.3). I suggest that adding aboutness to one’s account in the right way is not just compatible with but important for ontology-first truthmaking. I do all this to invite Asay to clarify his position on these matters. Overall, Asay’s worldview displays the fruitfulness of an ontologically serious approach to metaphysics that puts truthmaking centre-stage.
This essay is divided into two parts. In the first part (§2), I introduce the idea of practical meaning by looking at a certain kind of procedural system — the motor system — that plays a central role in computational explanations of motor behavior. I argue that in order to give a satisfactory account of the content of the representations computed by motor systems (motor commands), we need to appeal to a distinctively practical kind of meaning. Defending the explanatory relevance of semantic properties in a computationalist explanation of motor behavior, my argument concludes that practical meanings play a central role in an adequate psychological theory of motor skill. In the second part of this essay (§3), I generalize and clarify the notion of practical meaning, and I defend the intelligibility of practical meanings against an important objection.
We have a variety of different ways of dividing up, classifying, mapping, sorting and listing the objects in reality. The theory of granular partitions presented here seeks to provide a general and unified basis for understanding such phenomena in formal terms that is more realistic than existing alternatives. Our theory has two orthogonal parts: the first is a theory of classification; it provides an account of partitions as cells and subcells; the second is a theory of reference or intentionality; it provides an account of how cells and subcells relate to objects in reality. We define a notion of well-formedness for partitions, and we give an account of what it means for a partition to project onto objects in reality. We continue by classifying partitions along three axes: (a) in terms of the degree of correspondence between partition cells and objects in reality; (b) in terms of the degree to which a partition represents the mereological structure of the domain it is projected onto; and (c) in terms of the degree of completeness with which a partition represents this domain.
A group is often construed as one agent with its own probabilistic beliefs (credences), which are obtained by aggregating those of the individuals, for instance through averaging. In their celebrated “Groupthink”, Russell et al. (2015) require group credences to undergo Bayesian revision whenever new information is learnt, i.e., whenever individual credences undergo Bayesian revision based on this information. To obtain a fully Bayesian group, one should often extend this requirement to non-public or even private information (learnt by not all or just one individual), or to non-representable information (not representable by any event in the domain where credences are held). I propose a taxonomy of six types of ‘group Bayesianism’. They differ in the information for which Bayesian revision of group credences is required: public representable information, private representable information, public non-representable information, etc. Six corresponding theorems establish how individual credences must (not) be aggregated to ensure group Bayesianism of any type, respectively. Aggregating through standard averaging is never permitted; instead, different forms of geometric averaging must be used. One theorem—that for public representable information—is essentially Russell et al.’s central result (with minor corrections). Another theorem—that for public non-representable information—fills a gap in the theory of externally Bayesian opinion pooling.
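The contrast between linear and geometric averaging can be illustrated numerically: for information representable as an event, geometric pooling commutes with Bayesian conditioning (the property known as external Bayesianity), while linear averaging in general does not. The following is a minimal sketch, not the paper's formal apparatus; the two credence functions, the equal weights, and the three-world domain are illustrative choices.

```python
def normalize(p):
    """Rescale a list of non-negative numbers to sum to 1."""
    s = sum(p)
    return [x / s for x in p]

def linear_pool(p, q, a=0.5):
    """Weighted arithmetic average of two credence functions."""
    return [a * x + (1 - a) * y for x, y in zip(p, q)]

def geometric_pool(p, q, a=0.5):
    """Normalized weighted geometric average of two credence functions."""
    return normalize([x**a * y**(1 - a) for x, y in zip(p, q)])

def condition(p, event):
    """Bayesian revision on a representable event (indicator likelihood)."""
    return normalize([x if i in event else 0.0 for i, x in enumerate(p)])

# Two individuals' credences over three possible worlds (illustrative numbers).
p1 = [0.5, 0.3, 0.2]
p2 = [0.2, 0.3, 0.5]
E = {0, 1}  # the learnt event: world 2 is ruled out

# Geometric pooling: pool-then-revise agrees with revise-then-pool.
geo_then_update = condition(geometric_pool(p1, p2), E)
update_then_geo = geometric_pool(condition(p1, E), condition(p2, E))

# Linear averaging: the two orders of operations come apart.
lin_then_update = condition(linear_pool(p1, p2), E)
update_then_lin = linear_pool(condition(p1, E), condition(p2, E))
```

With these numbers the two geometric results coincide (both give roughly [0.513, 0.487, 0]), whereas the two linear results differ (roughly [0.538, 0.462, 0] versus [0.513, 0.488, 0]), so a group whose credences are formed by arithmetic averaging cannot revise by conditioning without depending on when the averaging is done.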
The paper addresses the formation of striking patterns within originally near-homogenous tissue, the process prototypical for embryology, and represented in particularly purist form by cut sections of hydra regenerating a complete animal, with head and foot, by internal reorganisation of the pre-existing tissue. The essential requirements are autocatalytic, self-enhancing activation, combined with inhibitory or depletion effects of wider range – “lateral inhibition”. Not only de novo pattern formation, but also well known, striking features of developmental regulation such as induction, inhibition, and proportion regulation can be explained on this basis. The theory provides a mathematical recipe for the construction of molecular models with criteria for the necessary non-linear interactions. It has since been widely applied to different developmental processes.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
This paper offers a general model of substantive moral principles as hedged principles that can (but don't have to) tolerate exceptions. I argue that the kind of principles I defend provide an account of what would make an exception to them permissible. I also argue that these principles are nonetheless robustly explanatory with respect to a variety of moral facts; that they make sense of error, uncertainty, and disagreement concerning moral principles and their implications; and that one can grasp these principles without having to grasp any particular list of their permissibly exceptional instances. I conclude by pointing out various advantages that this model of principles has over several of its rivals. The bottom line is that we should find nothing peculiarly odd or problematic about the idea of exception-tolerating and yet robustly explanatory moral principles.
In this article, a new, idealizing-hermeneutic methodological approach to developing a theory of philosophical arguments is presented and carried out. The basis for this is a theory of ideal philosophical theory types developed from the analysis of historical examples. According to this theory, the following ideal types of theory exist in philosophy: 1. descriptive-nomological, 2. idealizing-hermeneutic, 3. technical-constructive, 4. ontic-practical. These types of theories are characterized in particular by what their basic types of theses are. The main task of this article is then to determine the types of arguments that are suitable for justifying these types of theses. Surprisingly, practical arguments play a key role here.
The paper begins with an argument against eliminativism with respect to the propositional attitudes. There follows an argument that concepts are sui generis ante rem entities. A nonreductionist view of concepts and propositions is then sketched. This provides the background for a theory of concept possession, which forms the bulk of the paper. The central idea is that concept possession is to be analyzed in terms of a certain kind of pattern of reliability in one’s intuitions regarding the behavior of the concept. The challenge is to find an analysis that is at once noncircular and fully general. Environmentalism, anti-individualism, holism, analyticity, etc. provide additional hurdles. The paper closes with a discussion of the theory’s implications for the Wittgenstein-Kripke puzzle about rule-following and the Benacerraf problem concerning mathematical knowledge.
This article is a comparative study between predictive processing (PP, or predictive coding) and cognitive dissonance (CD) theory. The theory of CD, one of the most influential and extensively studied theories in social psychology, is shown to be highly compatible with recent developments in PP. This is particularly evident in the notion that both theories deal with strategies to reduce perceived error signals. However, reasons exist to update the theory of CD to one of “predictive dissonance.” First, the hierarchical PP framework can be helpful in understanding varying nested levels of CD. If dissonance arises from a cascade of downstream and lateral predictions and consequent prediction errors, dissonance can exist at a multitude of scales, all the way up from sensory perception to higher order cognitions. This helps us understand the previously problematic dichotomy between “dissonant cognitive relations” and “dissonant psychological states,” which are part of the same perception-action process while still hierarchically distinct. Second, since PP is action-oriented, it can be read to support recent action-based models of CD. Third, PP can potentially help us understand the recently speculated evolutionary origins of CD. Here, the argument is that responses to CD can instill meta-learning which serves to prevent the overfitting of generative models to ephemeral local conditions. This can increase action-oriented ecological rationality and enhanced capabilities to interact with a rich landscape of affordances. The downside is that in today’s world where social institutions such as science a priori separate noise from signal, some reactions to predictive dissonance might propagate ecologically unsound (underfitted, confirmation-biased) mental models such as climate denialism.
While structural equations modeling is increasingly used in philosophical theorizing about causation, it remains unclear what it takes for a particular structural equations model to be correct. To the extent that this issue has been addressed, the consensus appears to be that it takes a certain family of causal counterfactuals being true. I argue that this account faces difficulties in securing the independent manipulability of the structural determination relations represented in a correct structural equations model. I then offer an alternate understanding of structural determination, and I demonstrate that this theory guarantees that structural determination relations are independently manipulable. The account provides a straightforward way of understanding hypothetical interventions, as well as a criterion for distinguishing hypothetical changes in the values of variables which constitute interventions from those which do not. It additionally affords a semantics for causal counterfactual conditionals which is able to yield a clean solution to a problem case for the standard ‘closest possible world’ semantics.
Though there is a wide and varied literature on ethical supererogation, there has been almost nothing written about its epistemic counterpart, despite an intuitive analogy between the two fields. This paper seeks to change this state of affairs. I will begin by showing that there are examples which intuitively feature epistemically supererogatory doxastic states. Next, I will present a positive theory of epistemic supererogation that can vindicate our intuitions in these examples, in an explanation that parallels a popular theory of ethical supererogation. Roughly, I will argue that a specific type of epistemic virtue—the ability to creatively think up plausible hypotheses given a body of evidence—is not required of epistemic agents. Thus, certain exercises of this virtue can result in supererogatory doxastic states. In presenting this theory, I will also show how thinking about epistemic supererogation can provide us a new way forward in the debate about the uniqueness thesis for epistemic rationality.
According to one tradition, uttering an indicative conditional involves performing a special sort of speech act: a conditional assertion. We introduce a formal framework that models this speech act. Using this framework, we show that any theory of conditional assertion validates several inferences in the logic of conditionals, including the False Antecedent inference. Next, we determine the space of truth-conditional semantics for conditionals consistent with conditional assertion. The truth value of any such conditional is settled whenever the antecedent is false, and whenever the antecedent is true and the consequent is false. Then, we consider the space of dynamic meanings consistent with the theory of conditional assertion. We develop a new family of dynamic conditional-assertion operators that combine a traditional test operator with an update operation.
This paper explores whether it is possible to reformulate or re-interpret Lewis’s theory of fundamental laws of nature—his “best system analysis”—in such a way that it becomes a useful theory for special science laws. One major step in this enterprise is to make plausible how law candidates within best system competitions can tolerate exceptions—this is crucial because we expect special science laws to be so-called “ceteris paribus laws”. I attempt to show how this is possible and also how we can thereby make the first step towards a solution for the infamous difficulties surrounding the troublesome ceteris paribus clause. The paper outlines the general ideas of the theory but also points out some of its difficulties and background assumptions.
A formalism is introduced to represent the connective organization of an evolving neuronal network and the effects of environment on this organization by stabilization or degeneration of labile synapses associated with functioning. Learning, or the acquisition of an associative property, is related to a characteristic variability of the connective organization: the interaction of the environment with the genetic program is printed as a particular pattern of such organization through neuronal functioning. An application of the theory to the development of the neuromuscular junction is proposed and the basic selective aspect of learning emphasized.
Enter Object-Oriented Ontology: A New Theory of Everything. Eschewing the verbose and often obscurantist tendencies of other philosopher-authors, Harman tackles what might otherwise be a complicated, controversial and counter-intuitive philosophical stance with accessible and easy-to-follow prose. OOO has never been so clear nor so convincingly presented as it is here. Covered in seven chapters, the book gives a genealogical account of OOO, chronicling the reason for its emergence, comparing it to both the past and current philosophical traditions and arguing for its potency over the competing ontologies, almost all of which are post-Kantian.
What do we see when we look at someone's expression of fear? I argue that one of the things that we see is fear itself. I support this view by developing a theory of affect perception. The theory involves two claims. One is that expressions are patterns of facial changes that carry information about affects. The other is that the visual system extracts and processes such information. In particular, I argue that the visual system functions to detect the affects of others when they are expressed in the face. I develop my theory by drawing on empirical data from psychology and brain science. Finally, I outline a theory of the semantics of affect perception.
Though researchers working on congenital aphantasia (henceforth “aphantasia”) agree that this condition involves an impairment in the ability to voluntarily generate visual imagery, disagreement looms large as to which other impairments are exhibited by aphantasic subjects. This article offers the first extensive review of studies on aphantasia, and proposes that aphantasic subjects exhibit a cluster of impairments. It puts forward a novel cognitive theory of aphantasia, building on the constructive episodic simulation hypothesis of memory and imagination. It argues that aphantasia is best explained as a malfunction of processes in the episodic system, and is therefore an episodic system condition.
Winner of the NASSP Book Award 2022 for best social philosophy book of 2021. Together we can often achieve things that are impossible to do on our own. We can prevent something bad from happening or we can produce something good, even if none of us could do it by herself. But when are we morally required to do something of moral importance together with others? This book develops an original theory of collective moral obligations. These are obligations that individual moral agents hold jointly, but not as unified collective agents. To think of some of our obligations as joint or collective is the best way of making sense of our intuitions regarding collective moral action problems. Where we have reason to believe that our efforts are most efficient as part of a collective endeavor we may incur collective obligations together with others who are similarly placed as long as we are able to establish compossible individual contributory strategies towards that goal. The book concludes with a discussion of “massively shared obligations” to large-scale moral problems such as global poverty.
The present essay seeks, by way of the Austrian example, to make a contribution to what might be called the philosophy of the supranational state. More specifically, we shall attempt to use certain ideas on the philosophy of Gestalten as a basis for understanding some aspects of that political and cultural phenomenon which was variously called the Austrian Empire, the Habsburg Empire, the Danube Monarchy or Kakanien.
This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what generally counts as semantic information. After a brief introduction, section two outlines the semantic paradox implied by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of semantic information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
The feeling of being offended, as a moral emotion, plays a key role in issues such as slurs, the offense principle, ethics of humor, etc. However, no adequate theory of offense has been developed in the literature, and it remains unclear what questions such a theory should answer. This paper attempts to fill the gap by performing two tasks. The first task is to clarify and summarize the questions of offense into two kinds, the descriptive questions (e.g., what features differentiate offense from similar moral states like anger?) and the normative questions (e.g., what are the conditions for taking offense to be apt?). The second task is to answer these questions by developing what I call ‘the violated norm theory of offense’. According to this theory, feeling offended entails that the norm one endorses is judged to be violated by the offender. Appealing to the violated norm enables this theory to answer the descriptive questions (e.g., taking offense differs from anger because of features like not requiring victims and the difficulty of animal offense) and the normative questions of offense (e.g., taking offense is apt only if the violated norm is universalizable).
The idea that God creates out of Himself seems quite attractive. Many find great appeal in holding that a temporally finite universe must have a cause, but I think there’s also great appeal in holding that there’s pre-existent stuff out of which that universe is created—and what could that stuff be but part of God? Though attractive, the idea of creation ex deo hasn’t been taken seriously by theistic philosophers. Perhaps this is because it seems too vague—‘could anything enlightening be said about what those parts are?’—or objectionable—‘wouldn’t creating out of those parts lessen or destroy God?’ Drawing from Stephen Kosslyn and Michael Tye’s work on the ontology of mental images, I respond to the above questions by developing a theory on which God creates the universe out of His mental imagery.
The aim of this paper is to make a contribution to the ongoing search for an adequate concept of the a priori element in scientific knowledge. The point of departure is C.I. Lewis’s account of a pragmatic a priori put forward in his "Mind and the World Order" (1929). Recently, Hasok Chang in "Contingent Transcendental Arguments for Metaphysical Principles" (2008) reconsidered Lewis’s pragmatic a priori and proposed to conceive it as the basic ingredient of the dynamics of an embodied scientific reason. The present paper intends to further elaborate Chang’s account by relating it with some conceptual tools from cognitive semantics and certain ideas that first emerged in the context of the category-theoretical foundations of mathematics.
This study presents and develops in detail (a new version of) the argumental conception of meaning. The two basic principles of the argumental conception of meaning are: i) To know (implicitly) the sense of a word is to know (implicitly) all the argumentation rules concerning that word; ii) To know the sense of a sentence is to know the syntactic structure of that sentence and to know the senses of the words occurring in it. The sense of a sentence is called the immediate argumental role of that sentence. According to the argumental conception of meaning a theory of meaning for a particular language yields a systematic specification of the understanding of every sentence of the language which consists in a specification of the immediate argumental role of the sentence. The immediate argumental role is a particular aspect of the use of a sentence in arguments. But it is not the whole use in arguments, nor is the whole use in arguments reducible to the immediate argumental role. That is why, by accepting the argumental conception of meaning, we can have epistemological holism without linguistic holism. The argumental conception distinguishes between the understanding and the correctness of a language. Such a distinction makes it possible to account for our understanding of paradoxical languages. Redundancy theory of truth, realistic conceptions of truth or epistemic conceptions of truth are all compatible with an argumental conception of sense. But here it is argued that an epistemic conception of truth is preferable. Acceptance of the argumental conception of meaning and of an epistemic conception of truth leads to a rejection of the idea of analytic truth. The argumental conception is pluralistic with respect to the understandability of different logics, and neutral with respect to their correctness.
A survey of theories of part, whole and dependence from Aristotle to the Gestalt psychologists, with special attention to Husserl’s Third Logical Investigation “On the Theory of Parts and Wholes”.
I present a new kind of A-theory. On this proposal, time’s passing is a metaphysically fundamental aspect of reality. I take this to mean that there are fundamental facts like: four hours passed from 8am today until noon. This A-theory also posits fundamental facts about the state of the universe at a given time, and about cross-temporal relationships. The proposed metaphysical package attractively articulates our pre-relativistic conception of time. I defend the proposal from a number of orthodox objections: fundamental facts need not be aspects of current reality (§2); our package can and should posit fundamental cross-temporal relationships (§3); it resolves the difficulty of choosing between ‘presentist’ and ‘eternalist’ A-theories (§4); it evades the so-called ‘problem of temporary intrinsics’ (§5).
This paper argues that the theory of structured propositions is not undermined by the Russell-Myhill paradox. I develop a theory of structured propositions in which the Russell-Myhill paradox doesn't arise: the theory does not involve ramification or compromises to the underlying logic, but rather rejects common assumptions, encoded in the notation of the $\lambda$-calculus, about what properties and relations can be built. I argue that the structuralist had independent reasons to reject these underlying assumptions. The theory is given both a diagrammatic representation, and a logical representation in a novel language. In the latter half of the paper I turn to some technical questions concerning the treatment of quantification, and demonstrate various equivalences between the diagrammatic and logical representations, and a fragment of the $\lambda$-calculus.
Human behavior is an underlying cause for many of the ecological crises faced in the 21st century, and there is no escaping from the fact that widespread behavior change is necessary for socio-ecological systems to take a sustainable turn. Whilst making people and communities behave sustainably is a fundamental objective for environmental policy, behavior change interventions and policies are often implemented from a very limited non-systemic perspective. Environmental policy-makers and psychologists alike often reduce cognition ‘to the brain,’ focusing only to a minor extent on how everyday environments systemically afford pro-environmental behavior. Symptomatic of this are the widely prevalent attitude–action, value–action or knowledge–action gaps, understood in this paper as the gulfs lying between sustainable thinking and behavior due to lack of affordances. I suggest that by adopting a theory of affordances as a guiding heuristic, environmental policy-makers are better equipped to promote policies that translate sustainable thinking into sustainable behavior, often self-reinforcingly, and have better conceptual tools to nudge our socio–ecological system toward a sustainable turn. Affordance theory, which studies the relations between abilities to perceive and act and environmental features, is shown to provide a systemic framework for analyzing environmental policies and the ecology of human behavior. This facilitates the location and activation of leverage points for systemic policy interventions, which can help socio–ecological systems to learn to adapt to more sustainable habits. Affordance theory is presented to be applicable and pertinent to technically all nested levels of socio–ecological systems from the studies of sustainable objects and households to sustainable urban environments, making it an immensely versatile conceptual policy tool. Finally, affordance theory is also discussed from a participatory perspective. Increasing the fit between local thinking and external behavior possibilities entails a deep understanding of tacit and explicit attitudes, values, knowledge as well as physical and social environments, best gained via inclusive and polycentric policy approaches.
The notion of implicit commitment has played a prominent role in recent work in logic and the philosophy of mathematics. Although implicit commitment is often associated with highly technical studies, it remains an elusive notion. In particular, it is often claimed that acceptance of a mathematical theory implicitly commits one to acceptance of a Uniform Reflection Principle for it. However, philosophers agree that a satisfactory analysis of the transition from a theory to its reflection principle is still lacking. We provide an axiomatization of the minimal commitments implicit in the acceptance of a mathematical theory. The theory entails that the Uniform Reflection Principle is part of one's implicit commitments, and it sheds light on why this is so. We argue that the theory has interesting epistemological consequences: it explains how justified belief in the axioms of a theory can be preserved in the transition to the corresponding reflection principle. The theory also improves on recent proposals for the analysis of implicit commitment based on truth or epistemic notions.
Several variants of Lewis's Best System Account of Lawhood have been proposed that avoid its commitment to perfectly natural properties. There has been little discussion of the relative merits of these proposals, and little discussion of how one might extend this strategy to provide natural-property-free variants of Lewis's other accounts, such as his accounts of duplication, intrinsicality, causation, counterfactuals, and reference. We undertake these projects in this paper. We begin by providing a framework for classifying and assessing the variants of the Best System Account. We then evaluate these proposals and identify the most promising candidates. We go on to develop a proposal for systematically modifying Lewis's other accounts so that they, too, avoid commitment to perfectly natural properties. We conclude by briefly considering a different route one might take to developing natural-property-free versions of Lewis's other accounts, drawing on recent work by Williams.
I argue that the theory of chance proposed by David Lewis has three problems: (i) it is time-asymmetric in a manner incompatible with some of the chance theories of physics, (ii) it is incompatible with statistical-mechanical chances, and (iii) the content of Lewis's Principal Principle depends on how admissibility is cashed out, but there is no agreement as to what should count as admissible evidence. I propose two modifications of Lewis's theory which resolve these difficulties. I conclude by tentatively proposing a third modification of Lewis's theory, one which explains many of the common features shared by the chance theories of physics.
In this work, the author tries to give an ontological foundation and framework for relationalism by interpreting the meaning of being in terms of the particular (individual) in its relationality. The work provides many insights into how we can look not only at metaphysics but at epistemology and ethics as well from a relationalist point of view.
It is widely thought that functionalism and the qualia theory are better positioned to accommodate the 'affective' aspect of pain phenomenology than representationalism. In this paper, we attempt to overturn this opinion by raising problems for both functionalism and the qualia theory on this score. With regard to functionalism, we argue that it gets the order of explanation wrong: pain experience gives rise to the effects it does because it hurts, and not the other way around. With regard to the qualia theory, we argue that it fails to capture the sense in which pain's affective phenomenology rationalises various bodily-directed beliefs, desires, and behaviours. Representationalism, in contrast, escapes both of these problems: it gets the order of explanation right, and it explains how pain's affective phenomenology can rationalise bodily-directed beliefs, desires, and behaviours. For this reason, we argue that representationalism has a significant advantage in the debates about pain's affective phenomenology. We end the paper by examining objections, including the question of what representationalists should say about so-called 'dissociation cases', such as pain asymbolia.
In this work I propose a theory of evolution as a process of unfolding. The theory rests on four logically concatenated principles. The principle of evolutionary order establishes that the more complex cannot be generated from the simpler. The principle of origin establishes that there must be a maximum complexity that originates the others by logical deduction. Finally, the principle of unfolding and the principle of actualization guarantee the development of the evolutionary process from the simplest to the most complex. These logical principles determine the existence of a virtual ideological matrix that contains the sequence of the preformed and folded morphogenetic fields. In this manner, the evolutionary process consists of the sequential unfolding and actualization of these fields, driven by a process of teleologization carried out by the opening consciousness of the forms included in the fields of the ideological matrix. This theory leads to a radical change of perspective regarding the materialist worldview, and places life at the center of the evolutionary process as an activity carried out by a consciousness that seeks to fulfill a purpose by actualizing its own potentialities.
This essay uses a mental files theory of singular thought—a theory on which singular thought about, and reference to, a particular object requires possession of a mental store of information taken to be about that object—to explain how we could have such thoughts about abstract mathematical objects. After showing why we should want an explanation of this, I argue that none of the three main contemporary mental files theories of singular thought—acquaintance theory, semantic instrumentalism, and semantic cognitivism—can give it. I then argue for two claims intended to advance our understanding of singular thought about mathematical abstracta. First, the conditions for possessing a file for an abstract mathematical object are the same as the conditions for possessing a file for an object perceived in the past—namely, that the agent retains information about the object. Thus, insofar as we are able to have memory-based files for objects perceived in the past, we ought to be able to have files for abstract mathematical objects too. Second, at least one recently articulated condition on a file's being a device for singular thought—that it be capable of surviving a certain kind of change in the information it contains—can be satisfied by files for abstract mathematical objects.
The Gettier Problem is the problem of revising the view that knowledge is justified true belief in a way that is immune to Gettier counterexamples. The "Gettier Problem problem", according to Lycan, is the problem of saying what is misguided about trying to solve the Gettier Problem. In this paper I take up the Gettier Problem problem. I distinguish giving conditions that are necessary and sufficient for knowledge from giving conditions that explain why one knows when one does know. I argue that the problem with the Gettier Problem is that it requires us to articulate conditions that suffice for knowledge even if those conditions are non-explanatory. After defending this view, I take up two related methodological issues: one about the evidence that can be given in favor of an account of knowledge, and one about the role that investigating justification might play in investigating knowledge.
Review of: Husain Sarkar, A Theory of Method. xvii + 229 pp., bibl., indexes. Berkeley/Los Angeles/London: University of California Press, 1983. $29.95. The subject of this book is best stated in the author's words: "A theory is about the world; a method is about theories; and, a theory of method is about methods" (p. 1). A theory of method seeks to offer a general framework within which to choose among methods. Through critical examination of the positions of Karl Popper, Imre Lakatos, and Larry Laudan, Sarkar develops his own framework for evaluating methods, which includes two bold proposals: that the history of science cannot be used as an arbitrator among methods, and that methods should be chosen by testing the heuristic advice they give about which lines of scientific research to pursue further.