In this article, we present a new conception of internal relations between quantity tropes falling under determinates and determinables. We begin by providing a novel characterization of the necessary relations between these tropes as basic internal relations. The core ideas are that the existence of the relata is sufficient for their being internally related, and that their being related does not require the existence of any specific entities distinct from the relata. We argue that quantity tropes are, as determinate particular natures, internally related by certain relations of proportion and order. Because they are determined by the natures of the tropes, the relations of proportion and order remain invariant under any conventional choice of unit for a quantity and give rise to natural divisions among tropes. As a consequence, tropes fall under distinct determinables and determinates. Our conception provides an accurate account of quantitative distances between tropes but avoids commitment to determinable universals. In this important respect, it compares favorably with the standard conception, which takes exact similarity and quantitative distance as primitive internal relations. Moreover, we argue for the superiority of our approach in comparison with two other recent accounts of the similarity of quantity tropes.
Quantity is the first category that Aristotle lists after substance. It has extraordinary epistemological clarity: "2+2=4" is the model of a self-evident and universally known truth. Continuous quantities such as the ratio of circumference to diameter of a circle are as clearly known as discrete ones. The theory that mathematics was "the science of quantity" was once the leading philosophy of mathematics. The article looks at puzzles in the classification and epistemology of quantity.
In this paper we consider conditional random quantities (c.r.q.'s) in the setting of coherence. Based on the betting scheme, a c.r.q. X|H is not looked at as a restriction but, in a more extended way, as XH + P(X|H)H^c; in particular, (the indicator of) a conditional event E|H is looked at as EH + P(E|H)H^c. This extended notion of c.r.q. allows algebraic developments among c.r.q.'s even when the conditioning events are different; then, for instance, we can give a meaning to the sum X|H + Y|K and we can define the iterated c.r.q. (X|H)|K. We analyze the conjunction of two conditional events, introduced by the authors in a recent work, in the setting of coherence. We show that the conjoined conditional is a conditional random quantity, which may be a conditional event when there are logical dependencies. Moreover, we introduce the negation of the conjunction and, by applying De Morgan's Law, we obtain the disjoined conditional. Finally, we give the lower and upper bounds for the conjunction and disjunction of two conditional events, showing that the usual probabilistic properties continue to hold.
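The coherence constraint behind this representation can be checked numerically. The sketch below (the helper function and the toy joint probabilities are my own illustration, not the authors' formalism) verifies that the extended indicator EH + P(E|H)H^c has prevision equal to P(E|H) itself, which is what makes the extension coherent.

```python
# Sketch: the indicator of a conditional event E|H, extended as
# 1*EH + 0*E^cH + x*H^c with x = P(E|H), has prevision equal to x.
# The joint probabilities below are an arbitrary illustrative choice.

def prevision_of_conditional(p_EH, p_EcH, p_Hc):
    """Expected value of the random quantity 1*EH + 0*E^cH + x*H^c,
    where x = P(E|H) = P(EH) / P(H)."""
    p_H = p_EH + p_EcH
    x = p_EH / p_H                       # P(E|H)
    prevision = 1 * p_EH + 0 * p_EcH + x * p_Hc
    return prevision, x

value, x = prevision_of_conditional(p_EH=0.2, p_EcH=0.3, p_Hc=0.5)
assert abs(value - x) < 1e-12   # prevision coincides with P(E|H)
print(round(x, 4))
```

The same algebra goes through for any joint distribution, which is why the extended object behaves well under sums with differently-conditioned quantities.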
The paper interprets the concept of an "operator in the separable complex Hilbert space" (particularly a "Hermitian operator", which is how "quantity" is defined in "classical" quantum mechanics) by means of "quantum information". Since the wave function is the characteristic function of the probability (density) distribution over all possible values of a certain quantity to be measured, the definition of quantity in quantum mechanics covers any unitary change of the probability (density) distribution. It can be represented as a particular case of "unitary" qubits. The converse interpretation of any qubits as referring to a certain physical quantity implies its generalization to non-Hermitian operators, which are thus neither unitary nor energy-conserving. Their physical sense, loosely speaking, consists in exchanging temporal moments, and they are therefore implemented outside the space-time "screen". "Dark matter" and "dark energy" can be explained by the same generalization of "quantity" to non-Hermitian operators, only secondarily projected onto the pseudo-Riemannian space-time "screen" of general relativity according to Einstein's "Mach's principle" and his field equation.
Resemblances obtain not only between objects but between properties. Resemblances of the latter sort - in particular resemblances between quantitative properties - prove to be the downfall of a well-known theory of universals, namely the one presented by David Armstrong. This paper examines Armstrong's efforts to account for such resemblances within the framework of his theory and also explores several extensions of that theory. All of them fail.
I analyze the meaning of mass in Newtonian mechanics. First, I explain the notion of primitive ontology, which was originally introduced in the philosophy of quantum mechanics. Then I examine the two common interpretations of mass: mass as a measure of the quantity of matter and mass as a dynamical property. I claim that the former is ill-defined, and the latter is only plausible with respect to a metaphysical interpretation of laws of nature. I explore the following options for the status of laws: Humeanism, primitivism about laws, dispositionalism, and ontic structural realism.
In this essay I propose a new measure of social welfare. It captures the intuitive idea that quantity, quality, and equality of individual welfare all matter for social welfare. More precisely, it satisfies six conditions: Equivalence, Dominance, Quality, Strict Monotonicity, Equality, and Asymmetry. These state that (i) populations equivalent in individual welfare are equal in social welfare; (ii) a population that dominates another in individual welfare is better; (iii) a population that has a higher average welfare than another population is better, other things being equal; (iv) the addition of a well-faring individual makes a population better, whereas the addition of an ill-faring individual makes a population worse; (v) a population that has a higher degree of equality than another population is better, other things being equal; and (vi) individual illfare matters more for social welfare than individual welfare. By satisfying the six conditions, the measure improves on previously proposed measures, such as the utilitarian Total and Average measures, as well as different kinds of Prioritarian measures.
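The claim that the Average measure is improved upon can be made concrete with a standard counterexample to condition (iv): adding a well-faring but below-average individual lowers the average. The welfare numbers below are invented purely for illustration; the abstract does not give the author's own measure, so only the failure of Average is shown.

```python
# Sketch: the Average measure violates Strict Monotonicity (condition iv).
# Adding a well-faring individual (positive welfare) makes the population
# "worse" by this measure. Welfare numbers are illustrative.

def average_welfare(population):
    return sum(population) / len(population)

base = [10, 10, 10]       # average welfare 10.0
extended = base + [4]     # add a well-faring individual (welfare 4 > 0)

# Condition (iv) demands the extended population be better, yet:
assert average_welfare(extended) < average_welfare(base)
print(average_welfare(base), average_welfare(extended))
```

The Total measure passes this particular test but fails others (e.g. it is insensitive to equality, condition v), which is the kind of trade-off the proposed measure is meant to resolve.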
This essay explores various problematical aspects of Descartes' conservation principle for the quantity of motion (size times speed), particularly its largely neglected "dual role" as a measure of both durational motion and instantaneous "tendencies towards motion". Overall, an underlying non-local, or "holistic", element of quantity of motion (largely derived from his statics) will be revealed as central to a full understanding of the conservation principle's conceptual development and intended operation; and this insight can be of use in responding to some of the recent and traditional criticisms of Descartes' physics.
Quantities like mass and temperature are properties that come in degrees, and those degrees (e.g., 5 kg) are properties called the magnitudes of the quantities. Some philosophers (e.g., Byrne 2003; Byrne & Hilbert 2003; Schroer 2010) talk about magnitudes of phenomenal qualities as if some of our phenomenal qualities are quantities. The goal of this essay is to explore the anti-physicalist implications of this apparently innocent way of conceptualizing phenomenal quantities. I will first argue for a metaphysical thesis about the nature of magnitudes based on Yablo's proportionality requirement on causation. Then I will show that, if some phenomenal qualities are indeed quantities, there can be no demonstrative concepts of some of our phenomenal feelings. That places a significant restriction on the ways physicalists can account for the epistemic gap between the phenomenal and the physical. I illustrate the restriction by showing how it rules out a popular physicalist response to the Knowledge Argument.
A simple interpretation of quantity calculus is given. Quantities are described as two-place functions from objects, states or processes (or some combination of them) into numbers that satisfy the mutual measurability property. Quantity calculus is based on a notational simplification of the concept of quantity. A key element of the simplification is that we consider units to be intentionally unspecified numbers that are measures of exactly specified objects, states or processes. This interpretation of quantity calculus combines all the advantages of calculating with numerical values (since the values of quantities are numbers, we can do with them everything we do with numbers) and all the advantages of calculating with standardly conceived quantities (the calculus is invariant under the choice of units and has built-in dimensional analysis). This also shows that the standard metaphysics and mathematics of quantities and their magnitudes is not needed for quantity calculus. At the end of the article, arguments are given that the concept of quantity as defined here is a pivotal concept in understanding the quantitative approach to nature. As an application of this interpretation of quantity calculus, an easy proof of the dimensional homogeneity of physical laws is given.
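The invariance claim can be sketched by taking the article's proposal literally: treat a unit as an unspecified positive number and check that a dimensionless conclusion does not depend on which number is chosen. The functions and figures below are my own illustration, not the author's formalism.

```python
# Sketch: a unit is an unspecified (here: arbitrary) positive number.
# A law such as v = d / t then yields the same dimensionless ratio of two
# speeds no matter which numbers play the roles of "metre" and "second".

def speed(distance, time):
    return distance / time

def ratio_of_speeds(metre, second):
    # The measured distances and times are fixed multiples of the units.
    v1 = speed(100 * metre, 10 * second)
    v2 = speed(30 * metre, 6 * second)
    return v1 / v2   # dimensionless: the unit-numbers cancel

# Two wildly different choices of unit-numbers give the same ratio.
assert abs(ratio_of_speeds(1.0, 1.0) - ratio_of_speeds(0.3048, 12345.6)) < 1e-9
print(ratio_of_speeds(1.0, 1.0))
```

Only dimensionless combinations are invariant in this way, which is exactly the hook for the built-in dimensional analysis the abstract mentions.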
The quantum information introduced by quantum mechanics is equivalent to the generalization of classical information from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The qubit can be interpreted as the corresponding generalization of the bit: a choice among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of results in time after measurement. The quantity of quantum information is the ordinal corresponding to the infinite series in question. Number and being (by the mediation of time), the natural and the artificial, turn out to be no more than different hypostases of a single common essence. This implies a kind of neo-Pythagorean ontology that relates mathematics, physics, and technics immediately, by an explicit mathematical structure.
Kant's obscure essay entitled An Attempt to Introduce the Concept of Negative Quantities into Philosophy has received virtually no attention in the Kant literature. The essay has been available in English translation for over twenty years, though not widely so. In his original 1983 translation, Gordon Treash argues that the Negative Quantities essay should be understood as part of an ongoing response to the philosophy of Christian Wolff. Like Hoffmann and Crusius before him, the Kant of 1763 is at odds with the Leibnizian-Wolffian tradition of deductive metaphysics. He joins his predecessors in rejecting the assumption that the law of contradiction alone can provide proof of the principle of sufficient reason: "In his rejection of the possibility of deducing all philosophic truth from the law of contradiction, however, and in the clear recognition that this impossibility has immediate consequences for defense of the law of sufficient reason, Kant's work most definitely and positively constitutes a line of succession from Hoffmann and Crusius" (Treash, 1983, p. 25). The recognition that Kant's Negative Quantities essay is part of a response to the tradition of deductive metaphysics is, without a doubt, an important contribution to the Kant literature. However, there is still more to be said about this neglected essay. The full significance of the paper becomes known through its ties to a second, empiricist line of succession. Clues to this second line can be found in Kant's prefatory remarks concerning Euler's 1748 Reflections on Space and Time and Crusius' 1749 Guidance in the Orderly and Careful Consideration of Natural Events. As I will show, these prefatory remarks suggest a reading of Kant's Negative Quantities paper that reaches beyond German deductive metaphysics to engage a debate regarding the application of mathematics in philosophy initiated by George Berkeley.
Newton published his deduction of universal gravity in Principia (first ed., 1687). To establish the universality (the particle-to-particle nature) of gravity, Newton must establish the additivity of mass. I call 'additivity' the property a body's quantity of matter has just in case, if gravitational force is proportional to that quantity, the force can be taken to be the sum of forces proportional to each particle's quantity of matter. Newton's argument for additivity is obscure. I analyze and assess manuscript versions of Newton's initial argument within his initial deduction, dating from early 1685. Newton's strategy depends on distinguishing two quantities of matter, which I call 'active' and 'passive', by how they are measured. These measurement procedures frame conditions on the additivity of each quantity so measured. While Newton has direct evidence for the additivity of passive quantity of matter, he does not for that of the active quantity. Instead, he tries to infer the latter from the former via conceptual analyses of the third law of motion grounded largely on analogies to magnetic attractions. The conditions needed to establish passive additivity frustrate Newton's attempted inference to active additivity.
This article aims to interpret Leibniz's dynamics project through a theory of the causation of corporeal motion. It presents an interpretation of the dynamics that characterizes physical causation as the structural organization of phenomena. The measure of living force by mv² must then be understood as an organizational property of motion, conceptually distinct from the geometrical or otherwise quantitative magnitudes exchanged in mechanical phenomena. To defend this view, we examine one of the most important theoretical discrepancies between Leibniz's dynamics and classical mechanics: the measure of vis viva as mv² rather than ½mv². This "error", resulting from the limits of Leibniz's methodology, reveals the systematic role of the quantity mv² in the dynamics. In examining the evolution of the quantity mv² in the refinement of the force concept from potentia to actio, I argue that Leibniz's systematic limitations help clarify dynamical causality as neither strictly metaphysical nor mechanical but a distinct level of reality to which Leibniz dedicates the "dynamica" as a "nova scientia".
The article evaluates the Domain Postulate of the Classical Model of Science and the related Aristotelian prohibition on kind-crossing as interpretative tools in the history of the development of mathematics into a general science of quantities. Special reference is made to Proclus' commentary on the first book of Euclid's Elements, to the sixteenth-century translations of Euclid's work into Latin, and to the works of Stevin, Wallis, Viète and Descartes. The prohibition on kind-crossing formulated by Aristotle in the Posterior Analytics is used to distinguish between conceptions that share the same name but are substantively different: for example, the search for a broader genus including all mathematical objects; the search for a common character of different species of mathematical objects; and the effort to treat magnitudes as numbers.
Kant's special metaphysics is intended to provide the a priori foundation for Newtonian science, which is to be achieved by exhibiting the a priori content of Newtonian concepts and laws. Kant envisions a two-step mathematical construction of the dynamical concept of matter involving a geometrical construction of matter's bulk and a symbolic construction of matter's density. Since Newton himself defines quantity of matter in terms of bulk and density, there is no reason why we shouldn't interpret Kant's Dynamics as a defence of a Newtonian concept of matter. When Kant's reasoning is understood in relation to his criteria for mathematical construction, it is possible to maintain that matter theory is central to the Metaphysical Foundations, but that this does not undermine Kant's stated aim of giving an a priori foundation for Newtonian science.
In this paper, I define and study an abstract algebraic structure, the dimensive algebra, which embodies the most general features of the algebra of dimensional physical quantities. I prove some elementary results about dimensive algebras and suggest some directions for future work.
This rich book differs from much contemporary philosophy of mathematics in the author's witty, down-to-earth style and his extensive experience as a working mathematician. It accords with the field in focusing on whether mathematical entities are real. Franklin holds that recent discussion of this question has oscillated between various forms of Platonism and various forms of nominalism. He denies nominalism by holding that universals exist, and denies Platonism by holding that they are concrete, not abstract, looking to Aristotle for inspiration.
A new computational methodology for executing calculations with infinite and infinitesimal quantities is described in this paper. It is based on the principle 'The part is less than the whole', introduced by the Ancient Greeks and applied here to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). It is shown that it becomes possible to write down finite, infinite, and infinitesimal numbers with a finite number of symbols, as particular cases of a unique framework. The new methodology has allowed us to introduce the Infinity Computer, which works with such numbers (its simulator has already been realized). Examples dealing with divergent series, infinite sets, and limits are given.
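The idea of writing finite, infinite, and infinitesimal numbers with finitely many symbols can be sketched by representing each number as a finite sum of powers of a single infinite base, written `G` below. This class is my own minimal illustration of the general idea, not the Infinity Computer's actual numeral system.

```python
# Sketch: numbers as finite dicts {exponent_of_G: coefficient}, where G
# stands for an infinite unit. G**1 is infinite, G**0 is finite, and
# G**-1 is infinitesimal; every number is a finite list of such terms.

class GNumber:
    def __init__(self, terms):
        # drop zero coefficients so the representation stays finite/canonical
        self.terms = {e: c for e, c in terms.items() if c != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for e, c in other.terms.items():
            out[e] = out.get(e, 0) + c
        return GNumber(out)

    def __mul__(self, other):
        out = {}
        for e1, c1 in self.terms.items():
            for e2, c2 in other.terms.items():
                out[e1 + e2] = out.get(e1 + e2, 0) + c1 * c2
        return GNumber(out)

infinite = GNumber({1: 1})         # G
infinitesimal = GNumber({-1: 1})   # G**-1
one = infinite * infinitesimal     # G * G**-1 = 1, i.e. {0: 1}
print(one.terms)
```

Arithmetic on these finite representations mirrors polynomial arithmetic, which is what makes "the part is less than the whole" computable: G - 1 and G remain distinct numbers rather than collapsing into a single "infinity".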
This study is about quality. Here I deal with a quality that differs significantly from the common understanding of quality (as determined quality) arising from the laws of dialectics. This new quality is the quality of the quantity (the quality of quantitative changes), noticed in philosophy by Plato as the "quality of numbers" and later developed by Hegel as "qualitative quantity". The difference between the familiar determined quality and qualitative quantity is evident in the exhibit form of these two qualities. The exhibit form of the familiar determined quality from the laws of dialectics (and its transformation) is related to discreteness and abrupt changes. The exhibit form of the qualitative quantity (and its transformation) is related to continuity and gradual transition from one condition to a different condition, without any abrupt changes. In my paper "Quality of the quantity", I have argued that one of the most ancient implementations of the quality of number can be found in the dimensional mathematical model of point - line - surface - figure introduced by Plato. The fullest presentation of the idea of the quality of number in Plato is embedded in his teaching about the "eidetic number". The quality of the quantity emerges as a criterion for recognizing the difference between the eidetic numbers and natural arithmetical numbers. The thesis concerning Plato is based on the Unwritten Doctrine of Plato and on one of the most original works in the history of philosophy written in the 20th century, "Arete bei Platon und Aristoteles" ("Arete in Plato and Aristotle", Heidelberg 1959) by Hans Joachim Krämer. The new quality as the quality of the quantity (the quality of quantitative changes), first noticed in philosophy by Plato as the "quality of numbers", was developed by Hegel as "qualitative quantity".
Hegel treats qualitative quantity, or Measure, in both of his Logics: The Science of Logic (the Greater Logic) and The Lesser Logic (Part One of the Encyclopedia of the Philosophical Sciences). In my paper I offer arguments that the concept of the quality of the quantity should be enhanced by the methodological approach of analogy, with an implementation in the field of topology (Analysis Situs), developed by Jules Henri Poincaré. In topology we can see homeomorphism as the exhibit form of the quality of the quantity. The explicit form of the quality-of-the-quantity transformation is continuous deformation, typically known in topology as homeomorphism. The concept of qualitative quantity is linked with the concept of "structural stability" and with non-equilibrium phase transitions. The concept of structural stability is related to topological homeomorphism. In his book "Synergetics: Introduction and Advanced Topics" (Springer, ISBN 3-540-40824), in Chapter 1.13, "Qualitative Changes: General Approach", pp. 434-435, Hermann Haken explores and illustrates structural stability with an example (figure 1.13, p. 434) given by D'Arcy W. Thompson, the Scottish biologist, mathematician, classics scholar and pioneering mathematical biologist, author of the book On Growth and Form (1917). The quality of the quantity can be seen in Haken's citation of D'Arcy W. Thompson. My thesis is that topological homeomorphism is the explicit form of the quality-of-the-quantity transformation. The qualitative quantity change which becomes a phenomenon, in Émile Boutroux's sense, is the subject of study in a cultural phenomenology of qualitative quantity. Our approach to this subject is the Poincaré model of the subconscious mind in mathematics, which is the most suitable tool to unfold the archetype of qualitative quantity.
If the import of a book can be assessed by the problem it takes on, how that problem unfolds, and the extent of the problem's fruitfulness for further exploration and experimentation, then Duffy has produced a text worthy of much close attention. Duffy constructs an encounter between Deleuze's creation of a concept of difference in Difference and Repetition (DR) and Deleuze's reading of Spinoza in Expressionism in Philosophy: Spinoza (EP). It is surprising that such an encounter has not already been explored, at least not to this extent and in this much detail. Since the two works were written simultaneously, as Deleuze's primary and secondary dissertations, it is to be expected that there is much to learn from their interaction. Duffy proceeds by explicating, in terms of the differential calculus, a logic of what Deleuze in DR calls different/ciation, and then maps this onto Deleuze's account of modal expression in EP.
This study has demonstrated that entropy is not a physical quantity, that is, that the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is taken to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such definitions would be absurd. The fundamental error of entropy is that in any reversible process the polytropic process function Q is not a single-valued function of T, and the key step from Σ[(ΔQ)/T] to ∫dQ/T does not hold; the P-V diagram should be a P-V-T diagram in thermodynamics. Similarly, ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 do not hold either. Since the absolute entropy of Boltzmann is used to explain Clausius entropy and the unit (J/K) of the former is derived from the latter, the non-existence of Clausius entropy simultaneously denies Boltzmann entropy.
Two grams mass, three coulombs charge, five inches long – these are examples of quantitative properties. Quantitative properties have certain structural features that other sorts of properties lack. What are the metaphysical underpinnings of quantitative structure? This paper considers several accounts of quantity and assesses the merits of each.
The paper discusses some changes in Bolzano's definition of mathematics attested in several quotations from the Beyträge, Wissenschaftslehre and Grössenlehre: is mathematics a theory of forms or a theory of quantities? Several positions maintained throughout Bolzano's works are distinguished from others that were accepted in the Beyträge and abandoned in the Grössenlehre. The changes are interpreted as a consequence of the new logical theory of truth introduced in the Wissenschaftslehre, but also as a consequence of the overcoming of Kant's terminology and of the radicalization of Bolzano's anti-Kantianism. Bolzano's evolution is understood as a coherent move, once the criticism of the notion of quantity expressed in the Beyträge is compared with the different and larger notion of quantity that Bolzano had already developed in 1816. This discussion is enriched by the discovery that two unknown texts mentioned by Bolzano in the Beyträge can be identified with works by von Spaun and Vieth respectively. Bolzano's evolution is interpreted as a radicalization of his criticism of the Kantian definition of mathematics and as an effect of his unaltered interest in the Leibnizian notion of mathesis universalis. In conclusion, the author claims that Bolzano never abandoned his original idea of considering mathematics a scientia universalis, i.e. the science of quantities in general, and suggests that the question of ideal elements in mathematics, apart from being a main reason for the development of a new logical theory, can also be considered a main reason for developing a different definition of quantity.
Revelation, or the view that the essence of phenomenal properties is presented to us, is as intuitively attractive as it is controversial. It is notably at the core of defences of anti-physicalism. I propose in this paper a new argument against Revelation. It is usually accepted that low-level sensory phenomenal properties, like phenomenal red, loudness or brightness, stand in relations of similarity and quantity. Furthermore, these similarity and quantitative relations are taken to be internal, that is, to be fixed by what their relata are. I argue that, under some plausible additional premises, no account of what grounds these relations in the essence of their relata is consistent with Revelation, at least if we take common phenomenological descriptions for granted. As a result, the plausibility of Revelation is undermined. One might, however, resist this conclusion by weakening the epistemic relation postulated between subjects and their phenomenal properties.
Trope bundle theories of objects are capable of analyzing monadic inherence (objects having tropes), which is one of their main advantages. However, the best current trope-theoretical account of relational tropes, namely the relata-specific view, leaves relational inherence (a relational trope relating two or more entities) primitive. This article presents the first trope-theoretical analysis of relational inherence, generalizing the trope-theoretical analysis of inherence to relational tropes. The analysis reduces the holding of relational inherence to the obtaining of certain other facts about entities of the trope-theoretical category system. Moreover, I show that the analysis can deal with asymmetric and non-symmetric relations by assuming that all relation-like tropes are quantities. Finally, I provide an account of the spatial location of tropes in the difficult case in which tropes contribute to determining the location of other entities.
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, verifying De Morgan's Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on the coherence of prevision assessments for some families of compounded conditionals; in particular, we examine the Fréchet-Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction C_{n+1} of n+1 conditional events to the family {C_n, E_{n+1}|H_{n+1}}. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related to the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non-p-valid inference rules; then we illustrate by an example two methods which allow one to suitably modify non-p-valid inference rules in order to obtain inferences which are p-valid.
Starting from a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then analyze it in the setting of coherence. We give a representation of the conjoined conditional and we show that this new object is a conditional random quantity whose set of possible values normally contains the probabilities assessed for the two conditional events. We examine some cases of logical dependencies where the conjunction is a conditional event; moreover, we give the lower and upper bounds on the conjunction. We also examine an apparent paradox concerning stochastic independence which can actually be explained in terms of uncorrelation. We briefly introduce the notions of disjunction and iterated conditioning and we show that the usual probabilistic properties still hold.
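The lower and upper bounds mentioned here mirror the classical Fréchet-Hoeffding bounds, max(0, x + y - 1) ≤ P(conjunction) ≤ min(x, y). The brute-force check below is my own illustration over unconditional events with marginals x and y; it only demonstrates the inequality, not the authors' coherence-based derivation for conditional events.

```python
# Sketch: Fréchet-Hoeffding bounds max(0, x+y-1) <= P(A & B) <= min(x, y),
# verified by enumerating joint probabilities compatible with the marginals.

def frechet_bounds(x, y):
    return max(0.0, x + y - 1.0), min(x, y)

def compatible_conjunctions(x, y, steps=1000):
    """All values p = P(A & B) for which the four cell probabilities
    p, x - p, y - p, 1 - x - y + p are non-negative."""
    vals = []
    for i in range(steps + 1):
        p = i / steps
        if min(p, x - p, y - p, 1 - x - y + p) >= -1e-12:
            vals.append(p)
    return vals

x, y = 0.7, 0.6
lo, hi = frechet_bounds(x, y)
vals = compatible_conjunctions(x, y)
assert abs(min(vals) - lo) < 1e-3 and abs(max(vals) - hi) < 1e-3
print(round(lo, 4), round(hi, 4))
```

Both endpoints are attainable by some joint distribution, which is why the bounds are tight: no narrower interval can be asserted from the marginals alone.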
In Science Without Numbers (1980), Hartry Field defends a theory of quantity that, he claims, is able to provide both (i) an intrinsic explanation of the structure of space, spacetime, and other quantitative properties, and (ii) an intrinsic explanation of why certain numerical representations of quantities (distances, lengths, masses, temperatures, etc.) are appropriate or acceptable while others are not. But several philosophers have argued otherwise. In this paper I focus on arguments from Ellis and Milne to the effect that one cannot provide an account of quantity in "purely intrinsic" terms. I show, first, that these arguments are confused. Second, I show that Field's treatment of quantity can provide an intrinsic explanation of the structure of quantitative properties; what it cannot do is provide an intrinsic explanation of why certain numerical representations are more appropriate than others. Third, I show that one could provide an intrinsic explanation of this sort if one modified Field's account in certain ways.
The topic of this Handbook entry is the relationship between similarity and dimensional analysis, and some of the philosophical issues involved in understanding and making use of that relationship. It discusses the basics of the relationship between units, dimensions, and quantities. It explains the significance of dimensionless parameters, and explains that the similarity of physical systems is established by showing equality of a certain set of dimensionless parameters that characterizes the system behavior. Similarity is always relative to some system behavior. Other topics discussed: generalization of the notion of similarity; the difference between relative similarity and partial similarity; and how the notion of similarity in science differs from similarity as it has been discussed in recent philosophy. Philosophers' views discussed: R. Giere, N. Goodman, P. Bridgman, and B. Ellis.
In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as the Gelfand representation, guarantees that autonomously specified algebras of quantities can be interpreted as algebras of observables on a suitable state space. Such an interpretation is then used to support (i) a realist conception of quantities as objective characteristics of natural systems, and (ii) a realist conception of measurement results (evaluations of quantities) as determined by and descriptive of the states of a target natural system. As a way of motivating the analytic approach to measurement, we begin with a discussion of some serious philosophical and theoretical problems facing the well-known representational theory of measurement. We then explain why we consider the analytic approach, which avoids all these problems, to be far more attractive on both philosophical and theoretical grounds.
There is a wide range of realist but non-Platonist philosophies of mathematics—naturalist or Aristotelian realisms. Held by Aristotle and Mill, they played little part in twentieth century philosophy of mathematics but have been revived recently. They assimilate mathematics to the rest of science. They hold that mathematics is the science of X, where X is some observable feature of the (physical or other non-abstract) world. Choices for X include quantity, structure, pattern, complexity, relations. The article lays out and compares these options, including their accounts of what X is, the examples supporting each theory, and the reasons for identifying the science of X with (most or all of) mathematics. Some comparison of the options is undertaken, but the main aim is to display the spectrum of viable alternatives to Platonism and nominalism. It is explained how these views answer Frege’s widely accepted argument that arithmetic cannot be about real features of the physical world, and arguments that such mathematical objects as large infinities and perfect geometrical figures cannot be physically realized.
This paper introduces the reader to Meinong's work on the metaphysics of magnitudes and measurement in his Über die Bedeutung des Weber'schen Gesetzes. According to Russell himself, who wrote a review of Meinong's work on Weber's law for Mind, Meinong's theory of magnitudes deeply influenced Russell's theory of quantities in the Principles of Mathematics. The first and longest part of the paper discusses Meinong's analysis of magnitudes. According to Meinong, we must distinguish between divisible and indivisible magnitudes. He argues that relations of distance, or dissimilarity, are indivisible magnitudes that coincide with divisible magnitudes called "stretches". The second part of the paper is concerned with Meinong's account of measurement as a comparison of parts. According to Meinong, since measuring consists in comparing parts, only divisible magnitudes are directly measurable. Indivisible magnitudes can only be measured indirectly, by measuring the divisible stretches that coincide with them.
We deepen the study of conjoined and disjoined conditional events in the setting of coherence. These objects, differently from other approaches, are defined in the framework of conditional random quantities. We show that some well known properties, valid in the case of unconditional events, still hold in our approach to logical operations among conditional events. In particular we prove a decomposition formula and a related additive property. Then, we introduce the set of conditional constituents generated by $n$ conditional events and we show that they satisfy the basic properties valid in the case of unconditional events. We obtain a generalized inclusion-exclusion formula, which can be interpreted by introducing a suitable distributive property. Moreover, under logical independence of basic unconditional events, we give two necessary and sufficient coherence conditions. The first condition gives a geometrical characterization for the coherence of prevision assessments on a family $F$ constituted by $n$ conditional events and all possible conjunctions among them. The second condition characterizes the coherence of prevision assessments defined on $F\cup K$, where $K$ is the set of conditional constituents associated with the conditional events in $F$. Then, we give some further theoretical results and we examine some examples and counterexamples. Finally, we make a comparison with other approaches and we illustrate some theoretical aspects and applications.
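The betting-scheme definition of a conditional random quantity on which this framework rests (stated earlier on this page) can be put compactly:

```latex
\[
X|H \;=\; XH + \mathbb{P}(X|H)\,H^{c},
\]
```

so that, in particular, the indicator of a conditional event $E|H$ is $EH + P(E|H)H^{c}$, taking the value $1$ if $EH$ is true, $0$ if $E^{c}H$ is true, and $P(E|H)$ if $H$ is false. It is this three-valued representation that permits algebraic operations, such as sums and conjunctions, among conditional random quantities with different conditioning events.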
Quantum mechanics involves a generalized form of information, that of quantum information. It is the transfinite generalization of information, representable by transfinite ordinals. The physical world, being in the current of time, shares the quality of “choice”. Thus quantum information can be seen as the universal substance of the world, serving to describe uniformly future, past, and thus the present as the frontier of time. The future is represented as a coherent whole, the present as a choice among infinitely many alternatives, and the past as a well-ordering obtained as a result of a series of choices. The concept of quantum information describes the frontier of time, that “now” which transforms future into past. Quantum information generalizes information from finite to infinite series or collections. It allows any physical entity to be interpreted as some nonzero quantity of quantum information. The fundamental unit of quantum information is the “quantum bit”, or “qubit”. A qubit is a choice among an infinite set of alternatives. It generalizes the unit of classical information, the bit, which refers to a finite set of alternatives. The qubit is also isomorphic to a ball in Euclidean space in which two points are chosen.
Historically, laws and policies to criminalize drug use or possession were rooted in explicit racism, and they continue to wreak havoc on certain racialized communities. We are a group of bioethicists, drug experts, legal scholars, criminal justice researchers, sociologists, psychologists, and other allied professionals who have come together in support of a policy proposal that is evidence-based and ethically recommended. We call for the immediate decriminalization of all so-called recreational drugs and, ultimately, for their timely and appropriate legal regulation. We also call for criminal convictions for nonviolent offenses pertaining to the use or possession of small quantities of such drugs to be expunged, and for those currently serving time for these offenses to be released. In effect, we call for an end to the “war on drugs.”
This report reviews what quantum physics and information theory have to tell us about the age-old question, How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit.
The philosophy of measurement studies the conceptual, ontological, epistemic, and technological conditions that make measurement possible and reliable. A new wave of philosophical scholarship has emerged in the last decade that emphasizes the material and historical dimensions of measurement and the relationships between measurement and theoretical modeling. This essay surveys these developments and contrasts them with earlier work on the semantics of quantity terms and the representational character of measurement. The conclusions highlight four characteristics of the emerging research program in philosophy of measurement: it is epistemological, coherentist, practice oriented, and model based.
It is a familiar point that many ordinary dispositions are multi-track, that is, not fully and adequately characterisable by a single conditional. In this paper, I argue that both the extent and the implications of this point have been severely underestimated. First, I provide new arguments to show that every disposition whose stimulus condition is a determinable quantity must be infinitely multi-track. Secondly, I argue that this result should incline us to move away from the standard assumption that dispositions are in some way importantly linked to conditionals, as presupposed by the debate about various versions of the ‘conditional analysis’ of dispositions. I introduce an alternative conception of dispositionality, which is motivated by linguistic observations about dispositional adjectives and links dispositions to possibility instead of conditionals. I argue that, because of the multi-track nature of dispositions, the possibility-based conception of dispositions is to be preferred.
The shape of the Earth's surface, its topography, is a fundamental dimension of the environment, shaping or mediating many other environmental flows or functions. But there is a major divergence in the way that topography is conceptualized in different domains. Topographic cartographers, information scientists, geomorphologists and environmental modelers typically conceptualize topographic variability as a continuous field of elevations or as some discrete approximation to such a field. Pilots, explorers, anthropologists, ecologists, hikers, and archeologists, on the other hand, typically conceptualize this same variability in terms of hills and valleys, mountains and plains, barrows and trenches, that is, as (special sorts of) objects, with locations, shapes, and often names of their own. In this chapter, we sketch an approach to bridging this fundamental gap in geographic information infrastructure.
Matthew Adler's Measuring Social Welfare is an introduction to the social welfare function (SWF) methodology. This essay questions some ideas at the core of the SWF methodology having to do with the relation between the SWF and the measure of well-being. The facts about individual well-being do not single out a particular scale on which well-being must be measured. As with physical quantities, there are multiple scales that can be used to represent the same information about well-being; no one scale is special. Like physical laws, the SWF and its ranking of distributions cannot depend on exactly which of these scales we use. Adler and other theorists in the SWF tradition have used this idea to derive highly restrictive constraints on the shape of the SWF. These constraints rule out seemingly plausible views about distributive justice and population ethics. I argue, however, that these constraints stem from a simple but instructive mistake. The SWF should not be applied to vectors of numbers such as 1 and 2, but rather to vectors of dimensioned quantities such as 1 util and 2 utils. This seemingly pedantic suggestion turns out to have far-reaching consequences. Unlike the orthodox SWF approach, treating welfare levels as dimensioned quantities lets us distinguish between real changes in well-being and mere changes in the unit of measurement. It does this without making the SWF depend on the scale on which welfare is measured, and in a way that avoids the restrictive constraints on the shape of the SWF.
According to lexical views in population axiology, there are good lives x and y such that some number of lives equally good as x is not worse than any number of lives equally good as y. Such views can avoid the Repugnant Conclusion without violating Transitivity or Separability, but they imply a dilemma: either some good life is better than any number of slightly worse lives, or else the ‘at least as good as’ relation on populations is radically incomplete, in a sense to be explained. One might judge that the Repugnant Conclusion is preferable to each of these horns and hence embrace an Archimedean view. This is, roughly, the claim that quantity can always substitute for quality: each population is worse than a population of enough good lives. However, Archimedean views face an analogous dilemma: either some good life is better than any number of slightly worse lives, or else the ‘at least as good as’ relation on populations is radically and symmetrically incomplete, in a sense to be explained. Therefore, the lexical dilemma gives us little reason to prefer Archimedean views. Even if we give up on lexicality, problems of the same kind remain.
Most of our best scientific descriptions of the world employ rates of change of some continuous quantity with respect to some other continuous quantity. For instance, in classical physics we arrive at a particle’s velocity by taking the time-derivative of its position, and we arrive at a particle’s acceleration by taking the time-derivative of its velocity. Because rates of change are defined in terms of other continuous quantities, most think that facts about some rate of change obtain in virtue of facts about those other continuous quantities. For example, on this view facts about a particle’s velocity at a time obtain in virtue of facts about how that particle’s position is changing at that time. In this paper we raise a puzzle for this orthodox reductionist account of rate of change quantities and evaluate some possible replies. We don’t decisively come down in favour of one reply over the others, though we say some things to support taking our puzzle to cast doubt on the standard view that spacetime is continuous.
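The reductionist picture the abstract describes rests on the standard definitions from classical kinematics:

```latex
\[
v(t) \;=\; \frac{dx}{dt}, \qquad a(t) \;=\; \frac{dv}{dt} \;=\; \frac{d^{2}x}{dt^{2}},
\]
```

so on the orthodox view the velocity facts at a time $t$ obtain in virtue of facts about the particle's position in arbitrarily small neighborhoods of $t$, and likewise for acceleration one level up.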
What is it to know more? By what metric should the quantity of one's knowledge be measured? I start by examining and arguing against a very natural approach to the measure of knowledge, one on which how much is a matter of how many. I then turn to the quasi-spatial notion of counterfactual distance and show how a model that appeals to distance avoids the problems that plague appeals to cardinality. But such a model faces fatal problems of its own. Reflection on what the distance model gets right and where it goes wrong motivates a third approach, which appeals not to cardinality, nor to counterfactual distance, but to similarity. I close the paper by advocating this model and briefly discussing some of its significance for epistemic normativity. In particular, I argue that the 'trivial truths' objection to the view that truth is the goal of inquiry rests on an unstated, but false, assumption about the measure of knowledge, and suggest that a similarity model preserves truth as the aim of belief in an intuitively satisfying way.
Non-Humean accounts of the metaphysics of nature posit either laws or powers in order to account for natural necessity and world-order. We argue that such monistic views face fundamental problems. On the one hand, neo-Aristotelians cannot give unproblematic power-based accounts of the functional laws among quantities offered by physical theories, as well as of the place of conservation laws and symmetries in a lawless ontology; in order to capture these characteristics, commitment to governing laws is indispensable. On the other hand, ontologies that entirely exclude some kind of power ascription to worldly entities face what we call the Governing Problem: such ontologies do not have the resources to give an adequate account of how laws play their governing role. We propose a novel dualist model, which, we argue, has the resources to solve the difficulties encountered by its two dominant competitors, without inheriting the problems of either view. According to the dualist model, both laws and powers are equally fundamental and irreducible to each other, and both are needed in order to give a satisfactory account of the nomological structure of the world. The dualist model thus constitutes a promising alternative to current monistic views in the metaphysics of science.
We make the case that the Prisoner’s Dilemma, notwithstanding its fame and the quantity of intellectual resources devoted to it, has largely failed to explain any phenomena of social scientific or biological interest. At the heart of the paper we examine in detail a famous purported example of Prisoner’s Dilemma empirical success, namely Axelrod’s analysis of WWI trench warfare, and argue that this success is greatly overstated. Further, we explain why this negative verdict is likely true generally and not just in our case study. We also address some possible defenses of the Prisoner’s Dilemma.
According to Critical-Level Views in population axiology, an extra life improves a population only if that life’s welfare exceeds some fixed ‘critical level.’ An extra life at the critical level leaves the new population equally good as the original. According to Critical-Range Views, an extra life improves a population only if that life’s welfare exceeds some fixed ‘critical range.’ An extra life within the critical range leaves the new population incommensurable with the original.

In this paper, I sharpen some old objections to these views and offer some new ones. Critical-Level Views cannot avoid certain Repugnant and Sadistic Conclusions. Critical-Range Views imply that lives featuring no good or bad components whatsoever can nevertheless swallow up and neutralise goodness or badness. Both classes of view entail that certain small changes in welfare correspond to worryingly large differences in contributive value.

I then offer a view that retains much of the appeal of Critical-Level and Critical-Range Views while avoiding the above pitfalls. On the Imprecise Exchange Rates View, the quantity of some good required to outweigh a given unit of some bad is imprecise. This imprecision is the source of incommensurability between lives and populations.
Cancer research is experiencing ‘paradigm instability’, since there are two rival theories of carcinogenesis which confront each other, namely the somatic mutation theory and the tissue organization field theory. Despite this theoretical uncertainty, a huge quantity of data is available thanks to the improvement of genome sequencing techniques. Some authors think that the development of new statistical tools will be able to overcome the lack of a shared theoretical perspective on cancer by amalgamating as much data as possible. We think instead that a deeper understanding of cancer can be achieved by means of more theoretical work, rather than by merely accumulating more data. To support our thesis, we introduce the analytic view of theory development, which rests on the concept of plausibility, and make clear in what sense plausibility and probability are distinct concepts. Then, the concept of plausibility is used to point out the ineliminable role played by the epistemic subject in the development of statistical tools and in the process of theory assessment. We then move to address a central issue in cancer research, namely the relevance of computational tools developed by bioinformaticists to detect driver mutations in the debate between the two main rival theories of carcinogenesis. Finally, we briefly extend our considerations on the role that plausibility plays in evidence amalgamation from cancer research to the more general issue of the divergences between frequentists and Bayesians in the philosophy of medicine and statistics. We argue that taking into account plausibility-based considerations can lead to clarify some epistemological shortcomings that afflict both these perspectives.
We introduce a ranking of multidimensional alternatives, including uncertain prospects as a particular case, when these objects can be given a matrix form. This ranking is separable in terms of rows and columns, and continuous and monotonic in the basic quantities. Owing to the theory of additive separability developed here, we derive very precise numerical representations over a large class of domains (i.e., typically not of the Cartesian product form). We apply these representations to (1) streams of commodity baskets through time, (2) uncertain social prospects, (3) uncertain individual prospects. Concerning (1), we propose a finite-horizon variant of Koopmans’s (1960) axiomatization of infinite discounted utility sums. The main results concern (2). We push the classic comparison between the ex ante and ex post social welfare criteria one step further by avoiding any expected utility assumptions, and as a consequence obtain what appears to be the strongest existing form of Harsanyi’s (1955) Aggregation Theorem. Concerning (3), we derive a subjective probability for Anscombe and Aumann’s (1963) finite case by merely assuming that there are two epistemically independent sources of uncertainty.