Results for 'Measure theory'

934 found
  1. Measurement in biology is methodized by theory. Maël Montévil - 2019 - Biology and Philosophy 34 (3):35.
    We characterize access to empirical objects in biology from a theoretical perspective. Unlike objects in current physical theories, biological objects are the result of a history and their variations continue to generate a history. This property is the starting point of our concept of measurement. We argue that biological measurement is relative to a natural history which is shared by the different objects subjected to the measurement and is more or less constrained by biologists. We call symmetrization the theoretical and (...)
    6 citations
  2. Quantum theory without measurement or state reduction problems. Alan Macdonald - manuscript
    There is a consistent and simple interpretation of the quantum theory of isolated systems. The interpretation suffers no measurement problem and provides a quantum explanation of state reduction, which is usually postulated. Quantum entanglement plays an essential role in the construction of the interpretation.
    1 citation
  3. (1 other version) Fundamental Measurements in Economics and in the Theory of Consciousness. S. I. Melnyk & I. G. Tuluzov - manuscript
    A new constructivist approach to modeling in economics and the theory of consciousness is proposed. The state of an elementary object is defined as a set of its measurable consumer properties. A proprietor's refusal of, or consent to, an offered transaction is treated as the result of an elementary economic measurement. An elementary (indivisible) technology, in which the object's consumer values are variable, can in this case be formalized as a generalized economic measurement. The algebra of such measurements has been constructed. It has been (...)
    1 citation
  4. Protective measurement and the de Broglie-Bohm theory. Shan Gao - manuscript
    We investigate the implications of protective measurement for the de Broglie-Bohm theory, focusing mainly on the interpretation of the wave function. It has been argued that the de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions. But this premise turns out to be wrong according to protective measurement; a charged (...)
  5. Measurement and Quantum Dynamics in the Minimal Modal Interpretation of Quantum Theory. Jacob A. Barandes & David Kagan - 2020 - Foundations of Physics 50 (10):1189-1218.
    Any realist interpretation of quantum theory must grapple with the measurement problem and the status of state-vector collapse. In a no-collapse approach, measurement is typically modeled as a dynamical process involving decoherence. We describe how the minimal modal interpretation closes a gap in this dynamical description, leading to a complete and consistent resolution to the measurement problem and an effective form of state collapse. Our interpretation also provides insight into the indivisible nature of measurement—the fact that you can't stop (...)
    1 citation
  6. Feyerabend on the Quantum Theory of Measurement: A Reassessment. Daniel Kuby & Patrick Fraser - 2022 - International Studies in the Philosophy of Science 35 (1):23-49.
    In 1957, Feyerabend delivered a paper titled ‘On the Quantum-Theory of Measurement’ at the Colston Research Symposium in Bristol to sketch a completion of von Neumann's measurement scheme without collapse, using only unitary quantum dynamics and well-motivated statistical assumptions about macroscopic quantum systems. Feyerabend's paper has been recognised as an early contribution to quantum measurement, anticipating certain aspects of decoherence. Our paper reassesses the physical and philosophical content of Feyerabend's contribution, detailing the technical steps as well as its overall (...)
    2 citations
  7. Measuring the non-existent: validity before measurement. Kino Zhao - 2023 - Philosophy of Science 90 (2):227–244.
    This paper examines the role existence plays in measurement validity. I argue that existing popular theories of measurement and of validity follow a correspondence framework, which starts by assuming that an entity exists in the real world with certain properties that allow it to be measurable. Drawing on literature from the sociology of measurement, I show that the correspondence framework faces several theoretical and practical challenges. I suggest the validity-first framework of measurement, which starts with a practice-based validation process as (...)
    2 citations
  8. Prisoners of Abstraction? The Theory and Measure of Genetic Variation, and the Very Concept of 'Race'. Jonathan Michael Kaplan & Rasmus Grønfeldt Winther - 2013 - Biological Theory 7 (1):401-412.
    It is illegitimate to read any ontology about "race" off of biological theory or data. Indeed, the technical meaning of "genetic variation" is fluid, and there is no single theoretical agreed-upon criterion for defining and distinguishing populations (or groups or clusters) given a particular set of genetic variation data. Thus, by analyzing three formal senses of "genetic variation"—diversity, differentiation, and heterozygosity—we argue that the use of biological theory for making epistemic claims about "race" can only seem plausible when (...)
    27 citations
  9. Towards a Logic of Epistemic Theory of Measurement. Daniele Porello & Claudio Masolo - 2019 - In Gabor Bella & Paolo Bouquet (eds.), Modeling and Using Context - 11th International and Interdisciplinary Conference, CONTEXT 2019, Trento, Italy, November 20-22, 2019, Proceedings. Lecture Notes in Computer Science 11939. pp. 175-188.
    We propose a logic to reason about data collected by a number of measurement systems. The semantics of this logic is grounded in the epistemic theory of measurement, which gives a central role to measurement devices and calibration. In this perspective, the lack of evidence (in the available data) for the truth or falsehood of a proposition requires the introduction of a third truth-value (the undetermined). Moreover, the data collected by a given source are here represented (...)
  10. The Analytic Versus Representational Theory of Measurement: A Philosophy of Science Perspective. Zoltan Domotor & Vadim Batitsky - 2008 - Measurement Science Review 8 (6):129-146.
    In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as (...)
    2 citations
  11. Extensive Measurement in Social Choice. Jacob M. Nebel - forthcoming - Theoretical Economics.
    Extensive measurement is the standard measurement-theoretic approach for constructing a ratio scale. It involves the comparison of objects that can be concatenated in an additively representable way. This paper studies the implications of extensively measurable welfare for social choice theory. We do this in two frameworks: an Arrovian framework with a fixed population and no interpersonal comparisons, and a generalized framework with variable populations and full interpersonal comparability. In each framework we use extensive measurement to introduce novel domain restrictions, (...)
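    A note for orientation (standard measurement theory, in generic notation not taken from Nebel's paper): extensive measurement represents a qualitative ordering \succsim and a concatenation operation \circ by a numerical assignment m satisfying
    \[ a \succsim b \iff m(a) \ge m(b), \qquad m(a \circ b) = m(a) + m(b), \]
    and any two such representations are related by m' = \alpha m with \alpha > 0, which is what makes m a ratio scale; this additive, ratio-scale structure is what the paper carries over into social choice.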
  12. Calibrating the theory of model mediated measurement: metrological extension, dimensional analysis, and high pressure physics. Mahmoud Jalloh - 2024 - European Journal for Philosophy of Science 14 (40):1-32.
    I argue that dimensional analysis provides an answer to a skeptical challenge to the theory of model mediated measurement. The problem arises when considering the task of calibrating a novel measurement procedure, with greater range, to the results of a prior measurement procedure. The skeptical worry is that the agreement of the novel and prior measurement procedures in their shared range may only be apparent due to the emergence of systematic error in the exclusive range of the novel measurement (...)
  13. Quantum mechanical measurement in monistic systems theory. Klaus Fröhlich - 2023 - Science and Philosophy 11 (2):76-83.
    The monistic worldview aims at a uniform description of nature based on scientific models. Quantum physical systems are mutually parts of other quantum physical systems. An aperture distributes the subsystems and the wave front in all possible ways. The system only takes one of the possible paths, as measurements show. The conclusion from Bell's theorem: before the quantum physical measurement, there is no point-like location in the universe where all the information that explains the measurement is available. Distributed information is (...)
  14. Have underground radiation measurements refuted the Orch OR theory? Kelvin J. McQueen - forthcoming - Physics of Life Reviews.
    In [1] it is claimed that, based on radiation emission measurements described in [2], a certain “variant” of the Orch OR theory has been refuted. I agree with this claim. However, the significance of this result for Orch OR per se is unclear. After all, the refuted “variant” was never advocated by anyone, and it contradicts the views of Hameroff and Penrose (hereafter: HP) who invented Orch OR [3]. My aim is to get clear on this situation. I argue (...)
  15. Paul of Venice’s Theory of Quantification and Measurement of Properties. Sylvain Roudaut - 2022 - Noctua 9 (2):104-158.
    This paper analyzes Paul of Venice’s theory of measurement of natural properties and changes. The main sections of the paper correspond to Paul’s analysis of the three types of accidental changes, for which the Augustinian philosopher sought to provide rules of measurement. It appears that Paul achieved an original synthesis borrowing from both Parisian and Oxfordian sources. It is also argued that, on top of this theoretical synthesis, Paul managed to elaborate a quite original theory of intensive properties (...)
  16. A New Look into Peter Townsend’s Holy Grail: The Theory and Measure of Poverty as Relative Deprivation. Samuel Maia - 2024 - Dissertation, Federal University of Minas Gerais
    The development of the science of poverty has largely been driven by the need to define more precisely what poverty is, as well as to provide theoretical and empirical criteria for identifying those who suffer from it. This thesis focuses on a notable response to these and related questions: the conception and measure of poverty by the British sociologist Peter Townsend. Townsend defines poverty as relative deprivation caused by lack of resources. This conception, along with his corresponding cut-off (...), constitutes some of the key components of his theory of poverty. The theory discusses what poverty is, its causes, and how it can be eradicated, and is detailed in Poverty in the United Kingdom: A Survey of Household Resources and Standards of Living (1979), his magnum opus. The primary aim of this thesis is to interpret the theory in light of Townsend’s adherence to the value-free ideal. He pursued this ideal in response to what he perceived as the pervasive and deleterious influence of moral and political values on measures, theories, and policies on poverty throughout its scientific and political history. I argue that, to be free from such influence, Townsend aimed to be as guided by epistemic values as possible, which resulted in a systematic theory of poverty.
    The thesis is structured as follows. In Chapter 1, I situate Townsend’s conception and measurement of poverty within the context of both the recent history of scientific poverty lines and the most important approaches to measuring the phenomenon. Chapter 2 discusses the systematic nature of the theory, including its role as a middle-range theory that bridges broad-range theories with empirical data. This connection is achieved through several elements: a conception of poverty; hypotheses and anthropological observations related to the elements of the conception; measures to test these hypotheses; an explanation of poverty; and reports on how the research was conducted. Furthermore, Townsend’s theory aligns with Bradburn et al.’s (2017) measurement theory, which asserts that a good measure must include a conception of the relevant phenomenon, a representation, and procedures to gather data; all three components must be supported by arguments showing that they fit together properly.
    In the remaining chapters, I present and discuss the main elements of the theory: his conception of poverty as relative deprivation caused by lack of resources, along with the related hypotheses and observations (Chapter 3); the representations of this conception and the procedures for data collection and processing, as well as for ensuring their reliability (Chapter 4); and the explanation of poverty (Chapter 5). I conclude by presenting what Townsend considered the “policy prescriptions” of his theory. Despite its potential shortcomings, I also argue that Townsend’s theory fulfills an important epistemic value: fruitfulness.
  17. Editorial: Inner Experiences: Theory, Measurement, Frequency, Content, and Functions. Alain Morin, Jason D. Runyan & Thomas M. Brinthaupt - 2015 - Frontiers in Psychology 6.
    2 citations
  18. The Ontic Probability Interpretation of Quantum Theory - Part III: Schrödinger’s Cat and the ‘Basis’ and ‘Measurement’ Pseudo-Problems (2nd edition). Felix Alba-Juez - manuscript
    Most of us are either philosophically naïve scientists or scientifically naïve philosophers, so we misjudged Schrödinger’s “very burlesque” portrait of Quantum Theory (QT) as a profound conundrum. The clear signs of a strawman argument were ignored. The Ontic Probability Interpretation (TOPI) is a metatheory: a theory about the meaning of QT. Ironically, equating Reality with Actuality cannot explain actual data, justifying the century-long philosophical struggle. The actual is real but not everything real is actual. The ontic character of (...)
  19. Modeling Measurement: Error and Uncertainty. Alessandro Giordani & Luca Mari - 2014 - In Marcel Boumans, Giora Hon & Arthur C. Petersen (eds.), Error and Uncertainty in Scientific Practice. Pickering & Chatto. pp. 79-96.
    In the last few decades the role played by models and modeling activities has become a central topic in the scientific enterprise. In particular, it has been highlighted both that the development of models constitutes a crucial step for understanding the world and that the developed models operate as mediators between theories and the world. Such a perspective is exploited here to cope with the issue of whether error-based and uncertainty-based modeling of measurement are incompatible, and thus alternative with one (...)
    8 citations
  20. Measurement scales and welfarist social choice. Michael Morreau & John A. Weymark - 2016 - Journal of Mathematical Psychology 75:127-136.
    The social welfare functional approach to social choice theory fails to distinguish a genuine change in individual well-beings from a merely representational change due to the use of different measurement scales. A generalization of the concept of a social welfare functional is introduced that explicitly takes account of the scales that are used to measure well-beings so as to distinguish between these two kinds of changes. This generalization of the standard theoretical framework results in a more satisfactory formulation (...)
    8 citations
  21. Perception and Coincidence in Helmholtz’s Theory of Measurement. Matthias Neuber - 2018 - Journal for the History of Analytical Philosophy 6 (3).
    The present paper is concerned with Helmholtz’s theory of measurement. It will be argued that an adequate understanding of this theory depends on how Helmholtz’s application of the concepts of perception and coincidence is interpreted. In contrast both to conventionalist and Kantian readings of Helmholtz’s theory, a more realistic interpretation will be suggested.
    4 citations
  22. Dataset on Islamic ethical work behavior among Bruneian Malay Muslim teachers with measures concerning religiosity and theory of planned behavior. Nur Amali Aminnuddin - 2020 - Data in Brief 29:105157.
    The data present an examination of Islamic ethical work behavior among Malay Muslim teachers in Brunei through religiosity and the theory of planned behavior. The total number of participants was 370 Bruneian Malay Muslim teachers. The participants were sampled from two types of school system, non-religious and religious schools, with five schools of each type. This data article documents the demographic characteristics of the participants, along with the reliability and correlations of the measures involved. Analyses of the (...)
    3 citations
  23. The Problem of Measure Sensitivity Redux. Peter Brössel - 2013 - Philosophy of Science 80 (3):378-397.
    Fitelson (1999) demonstrates that the validity of various arguments within Bayesian confirmation theory depends on which confirmation measure is adopted. The present paper adds to the results set out in Fitelson (1999), expanding on them in two principal respects. First, it considers more confirmation measures. Second, it shows that there are important arguments within Bayesian confirmation theory and that there is no confirmation measure that renders them all valid. Finally, the paper reviews the ramifications that this (...)
    40 citations
  24. (1 other version) Team reasoning and a measure of mutual advantage in games. Jurgis Karpus & Mantas Radzvilas - 2018 - Economics and Philosophy 34 (1):1-30.
    The game theoretic notion of best-response reasoning is sometimes criticized when its application produces multiple solutions of games, some of which seem less compelling than others. The recent development of the theory of team reasoning addresses this by suggesting that interacting players in games may sometimes reason as members of a team – a group of individuals who act together in the attainment of some common goal. A number of properties have been suggested for team-reasoning decision-makers’ goals to satisfy, (...)
    9 citations
  25. Prioritarianism and the Measure of Utility. Michael Otsuka - 2013 - Journal of Political Philosophy 23 (1):1-22.
    I argue that prioritarianism cannot be assessed in abstraction from an account of the measure of utility. Rather, the soundness of this view crucially depends on what counts as a greater, lesser, or equal increase in a person’s utility. In particular, prioritarianism cannot accommodate a normatively compelling measure of utility that is captured by the axioms of John von Neumann and Oskar Morgenstern’s expected utility theory. Nor can it accommodate a plausible and elegant generalization of this (...) that has been offered in response to challenges to von Neumann and Morgenstern. This is, I think, a theoretically interesting and unexpected source of difficulty for prioritarianism, which I explore in this article.
    19 citations
  26. The Confirmational Significance of Agreeing Measurements. Casey Helgeson - 2013 - Philosophy of Science 80 (5):721-732.
    Agreement between "independent" measurements of a theoretically posited quantity is intuitively compelling evidence that a theory is, loosely speaking, on the right track. But exactly what conclusion is warranted by such agreement? I propose a new account of the phenomenon's epistemic significance within the framework of Bayesian epistemology. I contrast my proposal with the standard Bayesian treatment, which lumps the phenomenon under the heading of "evidential diversity".
    4 citations
  27. Measuring Belief and Risk Attitude. Sven Neth - 2019 - Electronic Proceedings in Theoretical Computer Science 297:354–364.
    Ramsey (1926) sketches a proposal for measuring the subjective probabilities of an agent by their observable preferences, assuming that the agent is an expected utility maximizer. I show how to extend the spirit of Ramsey's method to a strictly wider class of agents: risk-weighted expected utility maximizers (Buchak 2013). In particular, I show how we can measure the risk attitudes of an agent by their observable preferences, assuming that the agent is a risk-weighted expected utility maximizer. Further, we can (...)
    3 citations
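    As a rough illustration of the contrast drawn in this abstract, in generic notation not taken from Neth's paper: for a gamble yielding a better outcome x with probability p and a worse outcome y otherwise, expected utility and Buchak-style risk-weighted expected utility are standardly given by
    \[ EU = p\,u(x) + (1-p)\,u(y), \qquad REU = u(y) + r(p)\,\bigl(u(x) - u(y)\bigr), \]
    where r is an increasing risk function with r(0) = 0 and r(1) = 1; expected utility is the special case r(p) = p, and recovering r from observable preferences is what measuring an agent's risk attitude amounts to on this approach.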
  28. Measuring Impartial Beneficence: A Kantian Perspective on the Oxford Utilitarianism Scale. Emilian Mihailov - 2022 - Review of Philosophy and Psychology 14 (3):989-1004.
    To capture genuine utilitarian tendencies, Kahane et al. (Psychological Review 125:131, 2018) developed the Oxford Utilitarianism Scale (OUS) based on two subscales, which measure the commitment to impartial beneficence and the willingness to cause harm for the greater good. In this article, I argue that the impartial beneficence subscale, which breaks ground with previous research on utilitarian moral psychology, does not distinctively measure utilitarian moral judgment. I argue that Kantian ethics captures the all-encompassing impartial concern for the well-being (...)
    1 citation
  29. Sensory Measurements: Coordination and Standardization. Ann-Sophie Barwich & Hasok Chang - 2015 - Biological Theory 10 (3):200-211.
    Do sensory measurements deserve the label of “measurement”? We argue that they do. They fit with an epistemological view of measurement held in current philosophy of science, and they face the same kinds of epistemological challenges as physical measurements do: the problem of coordination and the problem of standardization. These problems are addressed through the process of “epistemic iteration,” for all measurements. We also argue for distinguishing the problem of standardization from the problem of coordination. To exemplify our claims, we (...)
    14 citations
  30. Modeling in Economics and in the Theory of Consciousness on the Basis of Generalized Measurements. Sergiy Melnyk & Igor Tuluzov - 2014 - Neuroquantology 12 (2).
    1 citation
  31. Measuring the mental. Garris Rogonyan - 2016 - Epistemology and Philosophy of Science 50 (4):168-186.
    The article considers the pros and cons of a measurement-theoretic analogy proposed by some philosophers as an illustration of semantic indeterminacy. Within this analogy, the ascription of meanings to certain linguistic expressions is compared with the assignment of numbers according to a certain theory of measurement. Donald Davidson used this analogy in order to extend W.V.O. Quine's thesis of the indeterminacy of translation to the interpretation of all human behavior. In other words, not only linguistic meanings, but all mental states are considered (...)
  32. Symmetries and Measurements. Sebastián Murgueitio Ramírez - 2024 - Philosophy Compass 19 (6).
    According to the orthodox view, one can appeal to the symmetries of a theory in order to show that it is impossible to measure the properties that are not invariant under such symmetries. For example, it is widely believed that the fact that boosts are symmetries of Newtonian mechanics entails that it is impossible to measure states of absolute motion in a Newtonian world (these states vary under boosts). This paper offers an overview of the various ways (...)
  33. Measuring Ontological Simplicity. Noël B. Saenz - 2024 - Ergo: An Open Access Journal of Philosophy 11 (25):652-688.
    Standard approaches to ontological simplicity focus either on the number of things or types a theory posits or on the number of fundamental things or types a theory posits. In this paper, I suggest a ground-theoretic approach that focuses on the number of something else. After getting clear on what this approach amounts to, I motivate it, defend it, and complete it.
    1 citation
  34. Confirmation, Increase in Probability, and the Likelihood Ratio Measure: a Reply to Glass and McCartney. William Roche - 2017 - Acta Analytica 32 (4):491-513.
    Bayesian confirmation theory is rife with confirmation measures. Zalabardo focuses on the probability difference measure, the probability ratio measure, the likelihood difference measure, and the likelihood ratio measure. He argues that the likelihood ratio measure is adequate, but each of the other three measures is not. He argues for this by setting out three adequacy conditions on confirmation measures and arguing in effect that all of them are met by the likelihood ratio measure (...)
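    The four measures named in this abstract have standard definitions; in conventional shorthand (not notation from Roche's paper), for hypothesis H and evidence E:
    \[ d(H,E) = P(H \mid E) - P(H), \qquad r(H,E) = \frac{P(H \mid E)}{P(H)}, \]
    \[ ld(H,E) = P(E \mid H) - P(E \mid \neg H), \qquad l(H,E) = \frac{P(E \mid H)}{P(E \mid \neg H)}, \]
    for the probability difference, probability ratio, likelihood difference, and likelihood ratio measures respectively.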
  35. A conjecture concerning determinism, reduction, and measurement in quantum mechanics. Arthur Jabs - 2016 - Quantum Studies: Mathematics and Foundations 3 (4):279-292.
    Determinism is established in quantum mechanics by tracing the probabilities in the Born rules back to the absolute (overall) phase constants of the wave functions and recognizing these phase constants as pseudorandom numbers. The reduction process (collapse) is independent of measurement. It occurs when two wavepackets overlap in ordinary space and satisfy a certain criterion, which depends on the phase constants of both wavepackets. Reduction means contraction of the wavepackets to the place of overlap. The measurement apparatus fans out the (...)
    4 citations
  36. Addressing the Conflict Between Relativity and Quantum Theory: Models, Measurement and the Markov Property. Gareth Ernest Boardman - 2013 - Cosmos and History 9 (2):86-115.
    Twenty-first century science faces a dilemma. Two of its well-verified foundation stones - relativity and quantum theory - have proven mutually inconsistent. Resolution of the conflict has resisted improvements in experimental precision, leaving some to believe that some fundamental understanding in our world-view may need modification or even radical reform. Employment of the wave-front model of electrodynamics, as a propagation process with a Markov property, may offer just such a clarification.
  37. Algebra of Fundamental Measurements as a Basis of Dynamics of Economic Systems. Sergiy Melnyk - 2012 - arXiv.
    We propose an axiomatic approach to constructing the dynamics of systems in which one of the main elements is the consciousness of a subject. The main axiom is the statement that the state of consciousness is completely determined by the results of measurements performed on it. In the case of economic systems we propose to consider an offer of a transaction as a fundamental measurement. Transactions with delayed choice, discussed in this paper, represent a logical generalization of incomplete transactions and allow for (...)
    1 citation
  38. A Sense So Rare: Measuring Olfactory Experiences and Making a Case for a Process Perspective on Sensory Perception. Ann-Sophie Barwich - 2014 - Biological Theory 9 (3):258-268.
    Philosophical discussion about the reality of sensory perceptions has been hijacked by two tendencies. First, talk about perception has been largely centered on vision. Second, the realism question is traditionally approached by attaching objects or material structures to matching contents of sensory perceptions. These tendencies have resulted in an argumentative impasse between realists and anti-realists, discussing the reliability of means by which the supposed causal information transfer from object to perceiver takes place. Concerning the nature of sensory experiences and their (...)
    17 citations
  39. Quantum Measurement, Complexity and Discrete Physics. Martin Leckey - 2003 - arXiv.
    This paper presents a new modified quantum mechanics, Critical Complexity Quantum Mechanics, which includes a new account of wavefunction collapse. This modified quantum mechanics is shown to arise naturally from a fully discrete physics, where all physical quantities are discrete rather than continuous. I compare this theory with the spontaneous collapse theories of Ghirardi, Rimini, Weber and Pearle and discuss some implications of these theories and CCQM for a realist view of the quantum realm.
  40. Zeno Goes to Copenhagen: A Dilemma for Measurement-Collapse Interpretations of Quantum Mechanics. David J. Chalmers & Kelvin J. McQueen - 2023 - In M. C. Kafatos, D. Banerji & D. C. Struppa (eds.), Quantum and Consciousness Revisited. DK Publisher.
    A familiar interpretation of quantum mechanics (one of a number of views sometimes labeled the "Copenhagen interpretation") takes its empirical apparatus at face value, holding that the quantum wave function evolves by the Schrödinger equation except on certain occasions of measurement, when it collapses into a new state according to the Born rule. This interpretation is widely rejected, primarily because it faces the measurement problem: "measurement" is too imprecise for use in a fundamental physical theory. We argue that this (...)
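    The two dynamical rules contrasted in this abstract are the textbook ones (stated here generically, not as this paper's formalism): between measurements the state evolves unitarily,
    \[ i\hbar \frac{d}{dt}\,\lvert \psi(t) \rangle = \hat{H}\,\lvert \psi(t) \rangle, \]
    while a measurement with possible outcomes a yields outcome a with Born-rule probability \Pr(a) = \lvert \langle a \mid \psi \rangle \rvert^{2} and leaves the system in the corresponding state \lvert a \rangle. The imprecision of when the second rule applies is the measurement problem the abstract refers to.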
  41. Meinong on magnitudes and measurement. Ghislain Guigon - 2005 - Meinong Studies 1:255-296.
    This paper introduces the reader to Meinong's work on the metaphysics of magnitudes and measurement in his Über die Bedeutung des Weber'schen Gesetzes. According to Russell himself, who wrote a review of Meinong's work on Weber's law for Mind, Meinong's theory of magnitudes deeply influenced Russell's theory of quantities in the Principles of Mathematics. The first and longest part of the paper discusses Meinong's analysis of magnitudes. According to Meinong, we must distinguish between divisible and indivisible magnitudes. He (...)
    3 citations
  42. Understanding scientific types: holotypes, stratotypes, and measurement prototypes. Alisa Bokulich - 2020 - Biology and Philosophy 35 (5):1-28.
    At the intersection of taxonomy and nomenclature lies the scientific practice of typification. This practice occurs in biology with the use of holotypes (type specimens), in geology with the use of stratotypes, and in metrology with the use of measurement prototypes. In this paper I develop the first general definition of a scientific type and outline a new philosophical theory of types inspired by Pierre Duhem. I use this general framework to resolve the necessity-contingency debate about type specimens in (...)
    2 citations
  43. Quantum Theory, Objectification and Some Memories of Giovanni Morchio. Luca Sciortino - 2023 - In Alessandro Michelangeli & Andrea Cintio (eds.), Trails in Modern Theoretical and Mathematical Physics. Springer. pp. 301-310.
    In this contribution I will retrace the main stages of my research on the objectification problem in quantum mechanics by highlighting some personal memories of my supervisor, the theoretical physicist Giovanni Morchio. The central aim of my MSc thesis was to ask whether the hypothesis of objectification, which is currently added to the formalism, is not, at least in one case, deducible from it and in particular from the dynamics of the temporal evolution. The case study we were looking for (...)
  44. Semantic Information Measure with Two Types of Probability for Falsification and Confirmation. Lu Chenguang - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use a sampling distribution (conditional SP function) to test or confirm a fuzzy truth function (conditional LP function). The proposed Semantic Information Measure (SIM) is compatible with Shannon's information theory and Fisher's likelihood method. It can ensure that the smaller the LP of a predicate and the larger the truth value of the proposition, the more information there (...)
  45. Assessing Theories: The Coherentist Approach. Peter Brössel - 2014 - Erkenntnis 79 (3):593-623.
    In this paper we show that the coherence measures of Olsson (J Philos 94:246–272, 2002), Shogenji (Analysis 59:338–345, 1999), and Fitelson (Analysis 63:194–199, 2003) satisfy the two most important adequacy requirements for the purpose of assessing theories. Following Hempel (Synthese 12:439–469, 1960), Levi (Gambling with truth, New York, A. A. Knopf, 1967), and recently Huber (Synthese 161:89–118, 2008) we require, as minimal or necessary conditions, that adequate assessment functions favor true theories over false theories and true and informative (...)
    6 citations
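    For a pair of propositions A and B, two of the coherence measures cited in this abstract take simple standard forms (generic notation, not reproduced from Brössel's paper):
    \[ C_{O}(A,B) = \frac{P(A \wedge B)}{P(A \vee B)}, \qquad C_{S}(A,B) = \frac{P(A \wedge B)}{P(A)\,P(B)}, \]
    for Olsson's and Shogenji's measures respectively; Fitelson's measure is instead built, roughly, by averaging a confirmation-style support function over subsets of the set being assessed, and is omitted here.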
  46. Quantum Set Theory Extending the Standard Probabilistic Interpretation of Quantum Theory. Masanao Ozawa - 2016 - New Generation Computing 34 (1):125-152.
    The notion of equality between two observables will play many important roles in foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum (...) to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
    1 citation
  47. A Remark on Probabilistic Measures of Coherence. Sergi Oms - 2020 - Notre Dame Journal of Formal Logic 61 (1):129-140.
    In recent years, some authors have proposed quantitative measures of the coherence of sets of propositions. Such probabilistic measures of coherence (PMCs) are, in general terms, functions that take as their argument a set of propositions (along with some probability distribution) and yield as their value a number that is supposed to represent the degree of coherence of the set. In this paper, I introduce a minimal constraint on PMC theories, the weak stability principle, and show that any correct, coherent, (...)
  48. Performing agency theory and the neoliberalization of the state. Tim Christiaens - 2020 - Critical Sociology 46 (3):393-411.
    According to Streeck and Vogl, the neoliberalization of the state has been the result of political-economic developments that render the state dependent on financial markets. However, they do not explain the discursive shifts that would have been required for demoting the state to the role of an agent to bondholders. I propose to explain this shift via the performative effect of neoliberal agency theory. In 1976, Michael Jensen and William Meckling claimed that corporate managers are agents to shareholding principals, (...)
    2 citations
  49. Les deux théories marxiennes de la valeur-travail et le problème de la mesure immanente [The Two Marxian Theories of Labor Value and the Problem of Immanent Measure]. Philippe Mongin - 1989 - Archives de Philosophie 52 (2):247-266.
    From the comparison of the Grundrisse (1857-58) manuscripts with Marx's subsequent writings, it is clear that the so-called « deduction » of fundamental economic categories follows two distinctive patterns, one of which is close to ordinary logical analysis, the other being inspired by Hegel's dialectics of essence. This duality is reflected in the double meaning of the concept of « presupposition » (Voraussetzung) and, finally, in the simultaneous endorsement by the Grundrisse of two labour-value theories, one of which is Smithian-like, (...)
    1 citation
  50. A graphic measure for game-theoretic robustness. Randy Au, Patrick Grim, Robert Rosenberger, Nancy Louie, Evan Selinger, William Braynen & Robb E. Eason - 2008 - Synthese 163 (2):273-297.
    Robustness has long been recognized as an important parameter for evaluating game-theoretic results, but talk of ‘robustness’ generally remains vague. What we offer here is a graphic measure for a particular kind of robustness (‘matrix robustness’), using a three-dimensional display of the universe of 2 × 2 game theory. In such a measure specific games appear as specific volumes (Prisoner’s Dilemma, Stag Hunt, etc.), allowing a graphic image of the extent of particular game-theoretic effects in terms of (...)
    2 citations
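    For orientation, the "universe of 2 × 2 game theory" referred to above is the space of two-player, two-strategy games; the Prisoner's Dilemma mentioned in the abstract is the familiar payoff ordering T > R > P > S (a standard illustration, not a matrix taken from the paper):
    \[ \begin{array}{c|cc} & \text{Cooperate} & \text{Defect} \\ \hline \text{Cooperate} & (R,\,R) & (S,\,T) \\ \text{Defect} & (T,\,S) & (P,\,P) \end{array} \]
    with R the reward for mutual cooperation, P the punishment for mutual defection, T the temptation payoff, and S the sucker's payoff; the row player's payoff is listed first.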
Showing results 1–50 of 934.