Majority cycling and related social choice paradoxes are often thought to threaten the meaningfulness of democracy. But deliberation can prevent majority cycles – not by inducing unanimity, which is unrealistic, but by bringing preferences closer to single-peakedness. We present the first empirical test of this hypothesis, using data from Deliberative Polls. Comparing preferences before and after deliberation, we find increases in proximity to single-peakedness. The increases are greater for lower versus higher salience issues and for individuals who seem to have deliberated more versus less effectively. They are not merely a byproduct of increased substantive agreement. Our results both refine and support the idea that deliberation, by increasing proximity to single-peakedness, provides an escape from the problem of majority cycling.
Among the possible solutions to the paradoxes of collective preferences, single-peakedness is significant because it has been associated with a suggestive conceptual interpretation: a single-peaked preference profile entails that, although individuals may disagree on which option is the best, they conceptualize the choice along a shared unique dimension, i.e. they agree on the rationale of the collective decision. In this article, we discuss the relationship between the structural property of single-peakedness and its suggested interpretation as uni-dimensionality of a social choice. In particular, we offer a formalization of the relationship between single-peakedness and its conceptual counterpart, we discuss their logical relations, and we question whether single-peakedness provides a rationale for collective choices.
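Since several of the abstracts above turn on the formal property itself, a minimal illustration may help. The sketch below is not drawn from any of these papers and its function names are hypothetical; it uses the standard characterization that a ranking is single-peaked with respect to a left-right axis exactly when, for every k, its top-k alternatives occupy a contiguous interval of that axis, and then searches over axes to decide whether a whole profile admits such a shared dimension.

```python
from itertools import permutations

def ranking_single_peaked(ranking, axis):
    """A ranking (best to worst) is single-peaked w.r.t. an axis iff, for every k,
    its top-k alternatives occupy a contiguous interval of the axis."""
    pos = {alt: i for i, alt in enumerate(axis)}
    top = set()
    for alt in ranking:
        top.add(alt)
        positions = sorted(pos[a] for a in top)
        if positions[-1] - positions[0] != len(positions) - 1:  # a gap: not an interval
            return False
    return True

def profile_single_peaked(profile, axis):
    """True if every voter's ranking is single-peaked on the given axis."""
    return all(ranking_single_peaked(r, axis) for r in profile)

def find_axis(profile):
    """Brute-force search for a shared axis (fine for a handful of alternatives)."""
    for axis in permutations(profile[0]):
        if profile_single_peaked(profile, axis):
            return list(axis)
    return None

# A profile that admits a shared dimension ...
print(find_axis([["a", "b", "c"], ["b", "a", "c"], ["c", "b", "a"]]))  # ['a', 'b', 'c']
# ... and a Condorcet-cycle profile that admits none.
print(find_axis([["a", "b", "c"], ["b", "c", "a"], ["c", "a", "b"]]))  # None
```

Proximity to single-peakedness, as discussed in the first abstract, could then be approximated by, for example, the largest fraction of voters whose rankings fit some common axis.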
It has been claimed that deliberation is capable of overcoming social choice theory impossibility results, by bringing about single-peakedness. Our aim is to better understand the relationship between single-peakedness and collective justifications of preferences.
Public deliberation has been defended as a rational and noncoercive way to overcome paradoxical results from democratic voting, by promoting consensus on the available alternatives on the political agenda. Some critics have argued that full consensus is too demanding and inimical to pluralism and have pointed out that single-peakedness, a much less stringent condition, is sufficient to overcome voting paradoxes. According to these accounts, deliberation can induce single-peakedness through the creation of a ‘meta-agreement’, that is, agreement on the dimension according to which the issues at stake are ‘conceptualized’. We argue here that once all the conditions needed for deliberation to bring about single-peakedness through meta-agreement are unpacked and made explicit, meta-agreement turns out to be a highly demanding condition, and one that is very inhospitable to pluralism.
In normative political theory, it is widely accepted that democracy cannot be reduced to voting alone, but that it requires deliberation. In formal social choice theory, by contrast, the study of democracy has focused primarily on the aggregation of individual opinions into collective decisions, typically through voting. While the literature on deliberation has an optimistic flavour, the literature on social choice is more mixed. It is centred around several paradoxes and impossibility results identifying conflicts between different intuitively plausible desiderata. In recent years, there has been a growing dialogue between the two literatures. This paper discusses the connections between them. Important insights are that (i) deliberation can complement aggregation and open up an escape route from some of its negative results; and (ii) the formal models of social choice theory can shed light on some aspects of deliberation, such as the nature of deliberation-induced opinion change.
De Neys (2021) argues that the debate between single- and dual-process theorists of thought has become both empirically intractable and scientifically inconsequential. I argue that this is true only under the traditional framing of the debate—when single- and dual-process theories are understood as claims about whether thought processes share the same defining properties (e.g., making mathematical judgments) or have two different defining properties (e.g., making mathematical judgments autonomously versus via access to a central working memory capacity), respectively. But if single- and dual-process theories are understood in cognitive modeling terms as claims about whether thought processes function to implement one or two broad types of algorithms, respectively, then the debate becomes scientifically consequential and, presumably, empirically tractable. So, I argue, the correct response to the current state of the debate is not to abandon it, as De Neys suggests, but to reframe it as a debate about cognitive models.
It is tempting to think that multi premise closure creates a special class of paradoxes having to do with the accumulation of risks, and that these paradoxes could be escaped by rejecting the principle, while still retaining single premise closure. I argue that single premise deduction is also susceptible to risks. I show that what I take to be the strongest argument for rejecting multi premise closure is also an argument for rejecting single premise closure. Because of the symmetry between the principles, they come as a package: either both will have to be rejected or both will have to be revised.
It’s often thought that the phenomenon of risk aggregation poses a problem for multi-premise closure but not for single-premise closure. But recently, Lasonen-Aarnio and Schechter have challenged this thought. Lasonen-Aarnio argues that, insofar as risk aggregation poses a problem for multi-premise closure, it poses a similar problem for single-premise closure. For she thinks that, there being such a thing as deductive risk, risk may aggregate over a single premise and the deduction itself. Schechter argues that single-premise closure succumbs to risk aggregation outright. For he thinks that there could be a long sequence of competent single-premise deductions such that, even though we are justified in believing the initial premise of the sequence, intuitively, we are not justified in believing the final conclusion. This intuition, Schechter thinks, vitiates single-premise closure. In this paper, I defend single-premise closure against the arguments offered by Lasonen-Aarnio and Schechter.
Single Parents. Andrzej Klimczuk - 2014 - In Encyclopedia of Human Services and Diversity. Sage Publications. pp. 1191–1194.
Services for single parents constitute a category of child and family services. These services are carried out by public and non-governmental bodies for people who are single parents by unfortunate events or by their own choice. Individuals come to single parenthood mainly through divorce, separation, birth outside of marriage, child abuse/neglect, death of a partner/widowhood, and adoption.
Ethical objectivists hold that there is one and only one correct system of moral beliefs. From such a standpoint it follows that conflicting basic moral principles cannot both be true and that the only moral principles which are binding on rational human agents are those described by the single true morality. However sincerely they may be held, all other moral principles are incorrect. Objectivism is an influential tradition, covering most of the rationalist and naturalist standpoints which have dominated nineteenth and twentieth century moral philosophy: there is widespread agreement amongst relativists themselves that objectivism is firmly rooted in common sense.
It is argued that there are ways of individuating the objects of perception without using sortal concepts. The result is a moderate anti-sortalist position on which one can single out objects using demonstrative expressions without knowing exactly what sort of thing those objects are.
Recent work in cognitive modelling has found that most of the data that has been cited as evidence for the dual-process theory (DPT) of reasoning is best explained by non-linear, “monotonic” one-process models (Stephens et al., 2018, 2019). In this paper, I consider an important caveat of this research: it uses models that are committed to unrealistic assumptions about how effectively task conditions can isolate Type-1 and Type-2 reasoning. To avoid this caveat, I develop a coordinated theoretical, experimental, and modelling strategy to better test DPT. First, I propose that Type-1 and Type-2 reasoning are defined as reasoning that precedes and follows metacognitive control, respectively. Second, I argue that reasoning that precedes and follows metacognitive control can be effectively isolated using debiasing paradigms that manipulate metacognitive heuristics (e.g., processing fluency) to prevent or trigger metacognitive control, respectively. Third, I argue that monotonic modelling can allow us to decisively test DPT only when we use such models to analyse data from this particular kind of debiasing paradigm.
A popular view has it that the mental representations underlying human pretense are not beliefs, but are “belief-like” in important ways. This view typically posits a distinctive cognitive attitude (a “DCA”) called “imagination” that is taken toward the propositions entertained during pretense, along with correspondingly distinct elements of cognitive architecture. This paper argues that the characteristics of pretense motivating such views of imagination can be explained without positing a DCA, or other cognitive architectural features beyond those regulating normal belief and desire. On the present “Single Attitude” account of imagination, propositional imagining just is a form of believing. The Single Attitude account is also distinguished from “metarepresentational” accounts of pretense, which hold that both pretending and recognizing pretense in others require one to have concepts of mental states. It is argued, to the contrary, that pretending and recognizing pretense require neither a DCA nor possession of mental state concepts.
In a recent article on Reid’s theory of single and double vision, James Van Cleve considers an argument against direct realism presented by Hume. Hume argues for the mind-dependent nature of the objects of our perception from the phenomenon of double vision. Reid does not address this particular argument, but Van Cleve considers possible answers Reid might have given to Hume. He finds fault with all these answers. Against Van Cleve, I argue that both appearances in double vision could be considered visible figures of the object, and show how this solution might preserve Reid’s direct realism. However, this solution is not compatible with the single appearance of an object predicted by Reid’s theory of single and double vision. This consequence will appear evident once we consider the critique of Reid’s theory of single and double vision formulated by William Charles Wells (1757-1817). Wells argues that Reid’s theory is either incomplete or incompatible with other claims made by Reid. It is incomplete since it fails to specify the unique direction in which we see the object in single vision; if it is not incomplete and is compatible with the law of monocular direction given by Reid, then it is incompatible with Reid’s claim that we do not immediately perceive distance by sight.
The paper investigates how the mathematical languages used to describe and to observe automatic computations influence the accuracy of the obtained results. In particular, we focus our attention on Single and Multi-tape Turing machines which are described and observed through the lens of a new mathematical language which is strongly based on three methodological ideas borrowed from Physics and applied to Mathematics, namely: the distinction between the object (we speak here about a mathematical object) of an observation and the instrument used for this observation; interrelations holding between the object and the tool used for the observation; the accuracy of the observation determined by the tool. Results of the observation executed by the traditional and new languages are compared and discussed.
The philosophy of education, as an integrative and anthropological body of knowledge, has to perform a prognostic and axiological function, shaping the world-view genesis of the personality and providing the theoretical and methodological background for innovation processes in education. Forming a harmonious, intellectually developed, creative, conscientious, responsible, purposeful, and healthy human personality is among the main tasks of the educational system. There are many approaches to this strategic task. One of them, starting from the urgency of the problem of sexual indifference in modern school education, is the single-sex format of education, based on an individual approach to the education and upbringing of each pupil that takes into account the gender peculiarities of development. In this article we analyze the influence of a single-sex format of education on the formation of pupils' personalities, taking into account the age periodization of individual ontogenesis. We describe the cognitive, motivational, and psychological peculiarities of boys and girls during childhood and youth. The theoretical case for taking the gender characteristics of pupils into consideration within the educational process has been borne out in practice: schools implementing gender-oriented separate education demonstrate very positive results. We conclude that the system of gender-oriented separate education has a strong potential for enhancing the quality of the pedagogical process and helps to form the personalities of those who study. This can be achieved by taking into account the psychological, physiological, and pedagogical peculiarities of boys and girls; by following, in educational activities, the principles of egalitarianism, conformity to nature, self-actualization, creative initiative, democracy, and humanism; and by creating an environment free from the impact of gender stereotypes and prejudice.
A descriptive polytheist thinks there are at least two gods. John Hick and Richard Swinburne are descriptive polytheists. In this respect, they are like Thomas Aquinas and many other theists. What sets Swinburne and Hick apart from Aquinas, however, is that unlike him they are normative polytheists. That is, Swinburne and Hick think that it is right that we, or at least some of us, worship more than one god. However, the evidence available to me shows that only Swinburne, and not Hick, is a cultic polytheist: he actually worships more than one god. I conclude that only Swinburne is a polytheist par excellence.
Upon the abolition of the sultanate, the proclamation of the Republic, and the termination of the nominally surviving caliphate, the 1921 Constitution was replaced with the 1924 Constitution, which would remain in force until 1961. As a result of these consecutive developments, Law no. 677 on the Preclusion and Abolition of Lodges, Zawiyahs, Tomb Keepers and Some Titles – presented to the assembly in a bill prepared by Refik Koraltan, a member of parliament from Konya province, together with his colleagues, accepted on November 30, 1925, and published in the official gazette on December 13, 1925 – interrupted the production of Turkish religious music in general, and of the music of the Mevleviyeh and Bektashism orders (lodge/dervish music) in particular; these interruptions are directly connected with the problems that constitute the focus of our study. Our study, entitled "Music Policies in the Turkish Single-party Era: Religious Music Example," therefore concentrates on Law no. 677, which came into force on December 13, 1925, within the scope of the transitions that paved the way for the proclamation of the Republic and the legitimacy of the various revolutions implemented. It is important that the effects of the political mobility experienced in our country between 1923 and 1950, during the single-party era, on our society – and the extent of those effects on musical life, in particular on Turkish religious music education, culture, and art – can thereby be observed and evaluated.
After a number of decades of research into the dynamics of rational belief, the belief revision theory community remains split on the appropriate handling of sequences of changes in view, the issue of so-called iterated revision. It has long been suggested that the matter is at least partly settled by facts pertaining to the results of various single revisions of one’s initial state of belief. Recent work has pushed this thesis further, offering various strong principles that ultimately result in a wholesale reduction of iterated to one-shot revision. The present paper offers grounds to hold that these principles should be significantly weakened and that the reductionist thesis should ultimately be rejected. Furthermore, the considerations provided suggest a close connection between the logic of iterated belief change and the logic of evidential relevance.
In response to suspicions concerning the use of possible worlds in philosophy, this brief paper proposes an analysis of possibility that requires only a single world, using a combination of temporal logic and a potentiality operator.
This article defends Marjorie Suchocki’s position against two main objections raised by David E. Conner. Conner objects that God as a single actual entity must be temporal because there is succession in God’s experience of the world. The reply is that time involves at least two successive occasions separated by perishing, but in God nothing ever perishes. Conner also objects that Suchocki’s personalistic process theism is not experiential but is instead theoretical and not definitive. The reply is that his dismissal of Part V of PR is arbitrary, the interpretation of all experience is theoretical, and no metaphysical interpretations are absolutely definitive, including PR as a whole. Also, Conner ignores religious experience.
In this paper we argue that the use of survey data or intuitions about single person cases as a dialectically neutral data point for favouring telic egalitarianism over prioritarianism has dim prospects for success. We take as a case study Otsuka and Voorhoeve's (2009) now well-known paper and show that it is either argumentatively irrelevant or question-begging, depending on whether the survey data about people's judgements concerning single-person cases is interpreted as being prudential or moral in character. We suggest that this problem is likely to generalise to other ways of trying to use intuitions or survey data about single-person cases, where those data or intuitions are not just treated as further direct moral intuitions about prioritarianism and telic egalitarianism.
The number of independent messages a physical system can carry is limited by the number of its adjustable properties. In particular, systems that have only one adjustable property cannot carry more than a single message at a time. We demonstrate that this is the case for single photons in the double-slit experiment, and that it is the root of the fundamental limit on measuring the complementary aspects of the photons. Next, we analyze the other ‘quantal’ behavior of systems with a single adjustable property, such as noncommutativity and no-cloning. Finally, we formulate a mathematical theory to describe the dynamics of such systems and derive the standard Hilbert-space formalism of quantum mechanics. Our derivation demonstrates the physical foundation of the quantum theory.
Pessimism is, roughly, the view that life is not worth living. In chapter 46 of the second volume of The World as Will and Representation, Arthur Schopenhauer provides an oft-neglected argument for this view. The argument is that a life is worth living only if it does not contain any uncompensated evils; but since all our lives happen to contain such evils, none of them are worth living. The now standard interpretation of this argument (endorsed by Kuno Fischer and Christopher Janaway) proceeds from the claim that the value—or rather valuelessness—of life’s goods makes compensation impossible. But this interpretation is neither philosophically attractive nor faithful to the text. In this paper, I develop and defend an alternative interpretation (suggested by Wilhelm Windelband and Mark Migotti) according to which it is instead the actual temporal arrangement of life’s goods and evils that makes compensation impossible.
In a reflective and richly entertaining piece from 1979, Doug Hofstadter playfully imagined a conversation between ‘Achilles’ and an anthill (the eponymous ‘Aunt Hillary’), in which he famously explored many ideas and themes related to cognition and consciousness. For Hofstadter, the anthill is able to carry on a conversation because the ants that compose it play roughly the same role that neurons play in human languaging; unfortunately, Hofstadter’s work is notably short on detail suggesting how this magic might be achieved. Conversely in this paper - finally reifying Hofstadter’s imagination - we demonstrate how populations of simple ant-like creatures can be organised to solve complex problems; problems that involve the use of forward planning and strategy. Specifically we will demonstrate that populations of such creatures can be configured to play a strategically strong - though tactically weak - game of HeX (a complex strategic game). We subsequently demonstrate how tactical play can be improved by introducing a form of forward planning instantiated via multiple populations of agents; a technique that can be compared to the dynamics of interacting populations of social insects via the concept of meta-population. In this way although, pace Hofstadter, we do not establish that a meta-population of ants could actually hold a conversation with Achilles, we do successfully introduce Aunt Hillary to the complex, seductive charms of HeX.
An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester‐length subjects using argument mapping along with dedicated software can produce remarkable increases in students’ critical thinking abilities. Introducing such specialised subjects, however, is often practically and politically difficult. This study ascertains student perceptions of the use of argument mapping in two large, regular, semester‐length classes in a Business and Economics Faculty at the University of Melbourne. Unlike the semester‐length expert‐led trials in prior research, in our study only one expert‐led session was conducted at the beginning of the semester and followed by class practice. Results of a survey conducted at the end of the semester show that, with reservations, even this minimalist, ‘one‐shot inoculation’ of argument mapping is effective in terms of students’ perceptions of improvements in their critical thinking skills.
This article presents two related challenges to the idea that, to ensure policy evaluation is comprehensive, all costs and benefits should be aggregated into a single, equity-weighted wellbeing metric. The first is to point out how, even allowing for equity-weighting, the use of a single metric limits the extent to which we can take distributional concerns into account. The second challenge starts from the observation that in this and many other ways, aggregating diverse effects into a single metric of evaluation necessarily involves settling many moral questions that reasonable people disagree about. This raises serious questions as to what role such a method of policy evaluation can and should play in informing policy-making in liberal democracies. Ultimately, to ensure comprehensiveness of policy evaluation in a wider sense, namely, that all the diverse effects that reasonable people might think matter are kept score of, we need multiple metrics as inputs to public deliberation.
For centuries, science was considered as something radically different from religion. Yet, the foundations of true science are deeply religious in nature. This paper seeks to show how religion is the only foundation needed for the formulation of scientific theories, since it provides the core principles upon which the building of exact sciences is based. Our need to understand the cosmos and our faith in us being able to do so are the main prerequisites for conducting science; prerequisites that are derived from our belief in us being the sons of God and, thus, being able to read His mind. From its birth on 7 March 1277 up to today, science seems to be the only logical attitude of religious people towards the unknown cosmos.
The problem of multiple-computations discovered by Hilary Putnam presents a deep difficulty for functionalism (of all sorts, computational and causal). We describe in outline why Putnam’s result, and likewise the more restricted result we call the Multiple-Computations Theorem, are in fact theorems of statistical mechanics. We show why the mere interaction of a computing system with its environment cannot single out a computation as the preferred one amongst the many computations implemented by the system. We explain why nonreductive approaches to solving the multiple-computations problem, and in particular why computational externalism, are dualistic in the sense that they imply that nonphysical facts in the environment of a computing system single out the computation. We discuss certain attempts to dissolve Putnam’s unrestricted result by appealing to systems with certain kinds of input and output states as a special case of computational externalism, and show why this approach is not workable without collapsing to behaviorism. We conclude with some remarks about the nonphysical nature of mainstream approaches to both statistical mechanics and the quantum theory of measurement with respect to the singling out of partitions and observables.
Peter Baumann uses the Monty Hall game to demonstrate that probabilities cannot be meaningfully applied to individual games. Baumann draws from this first conclusion a second: in a single game, it is not necessarily rational to switch from the door that I have initially chosen to the door that Monty Hall did not open. After challenging Baumann's particular arguments for these conclusions, I argue that there is a deeper problem with his position: it rests on the false assumption that what justifies the switching strategy is its leading me to win a greater percentage of the time. In fact, what justifies the switching strategy is not any statistical result over the long run but rather the "causal structure" intrinsic to each individual game itself. Finally, I argue that an argument by Hilary Putnam will not help to save Baumann's second conclusion above.
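For readers unfamiliar with the statistics at issue, here is a minimal simulation (my own illustration, not Baumann's or the author's) of the long-run result that the switching strategy wins roughly two thirds of the time; the philosophical question above is whether such long-run frequencies can justify switching in a single game.

```python
import random

def monty_hall(switch, trials=100_000):
    """Play many Monty Hall games and return the frequency of winning the car."""
    wins = 0
    for _ in range(trials):
        car, pick = random.randrange(3), random.randrange(3)
        # Monty opens a door that is neither the contestant's pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")   # ~0.333
print(f"switch: {monty_hall(switch=True):.3f}")    # ~0.667
```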
The quantization error in a fixed-size Self-Organizing Map (SOM) with unsupervised winner-take-all learning has previously been used successfully to detect, in minimal computation time, highly meaningful changes across images in medical time series and in time series of satellite images. Here, the functional properties of the quantization error in SOM are explored further to show that the metric is capable of reliably discriminating between the finest differences in local contrast intensities and contrast signs. While this capability of the QE is akin to functional characteristics of a specific class of retinal ganglion cells (the so-called Y-cells) in the visual systems of the primate and the cat, the sensitivity of the QE surpasses the capacity limits of human visual detection. Here, the quantization error in the SOM is found to reliably signal changes in contrast or colour when contrast information is removed from or added to the image, but not when the amount and relative weight of contrast information is constant and only the local spatial position of contrast elements in the pattern changes. While the RGB Mean reflects coarser changes in colour or contrast well enough, the SOM-QE is shown to outperform the RGB Mean in the detection of single-pixel changes in images with up to five million pixels. This could have important implications in the context of unsupervised image learning and computational building block approaches to large sets of image data (big data).
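The quantization-error metric itself is simple to state: it is the mean distance from each input vector to its best-matching unit in the SOM codebook. The self-contained sketch below is an illustration under that definition, not the authors' pipeline; the codebook here is random rather than trained, and all names and numbers are hypothetical.

```python
import numpy as np

def quantization_error(codebook, data):
    """Mean Euclidean distance from each input vector to its best-matching unit.

    codebook: (n_units, n_features) array -- the flattened SOM weight vectors
    data:     (n_samples, n_features) array -- e.g. RGB pixel values of an image
    """
    # Distance of every sample to every unit; keep the nearest unit per sample.
    dists = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.min(axis=1).mean()

# Toy usage: the same (untrained, random) 16-unit codebook scores two "images".
rng = np.random.default_rng(0)
codebook = rng.random((16, 3))          # 16 units with RGB features
img_a = rng.random((1000, 3))
img_b = img_a.copy()
img_b[:10] += 0.5                       # a small local contrast change
print(quantization_error(codebook, img_a))
print(quantization_error(codebook, img_b))  # typically a slightly larger QE
```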
Recently, the term ‘aphantasia’ has become current in scientific and public discourse to denote the absence of mental imagery. However, new terms for aphantasia or its subgroups have recently been proposed, e.g. ‘dysikonesia’ or ‘anauralia’, which complicates the literature, research communication and understanding for the general public. Before further terms emerge, we advocate the consistent use of the term ‘aphantasia’ as it can be used flexibly and precisely, and is already widely known in the scientific community and among the general public.
Although social scientists have identified diverse behavioral patterns among children from dissimilarly structured families, marketing scholars have progressed little in relating family structure to consumption-related decisions. In particular, the roles played by members of single-mother families—which may include live-in grandparents, mother’s unmarried partner, and step-father with or without step-sibling(s)—may affect children’s influence on consumption-related decisions. For example, to offset a parental authority dynamic introduced by a new stepfather, the work-related constraints imposed on a breadwinning mother, or the imposition of adult-level household responsibilities on children, single-mother families may attend more to their children’s product preferences. Without a profile that includes socio-economic, behavioral, and psychological aspects, efficient and socially responsible marketing to single-mother households is compromised. Relative to dual-parent families, single-mother families tend to have fewer resources and less buying power, children who consume more materialistically and compulsively, and children who more strongly influence decision making for both own-use and family-use products. Timely research would ensure that these and other tendencies now differentiate single-mother from dual-parent families in ways that marketers should address. Hence, our threefold goal is (1) to consolidate and highlight gaps in existing theory applied to studying children’s influence on consumption-related decision making in single-mother families, (2) to propose a hybrid framework that merges two theories conducive to such research, and (3) to identify promising propositions for future research.
Temporal epistemic logics are known, from results of Halpern and Vardi, to have a wide range of complexities of the satisfiability problem: from PSPACE, through non-elementary, to highly undecidable. These complexities depend on the choice of some key parameters specifying, inter alia, possible interactions between time and knowledge, such as synchrony and agents' abilities for learning and recall. In this work we develop practically implementable tableau-based decision procedures for deciding satisfiability in single-agent synchronous temporal-epistemic logics with interactions between time and knowledge. We discuss some complications that occur, even in the single-agent case, when interactions between time and knowledge are assumed and show how the method of incremental tableaux can be adapted to work in EXPSPACE, respectively 2EXPTIME, for these logics, thereby also matching the upper bounds obtained for them by Halpern and Vardi.
Is it possible to know anything about life we have not yet encountered? We know of only one example of life: our own. Given this, many scientists are inclined to doubt that any principles of Earth’s biology will generalize to other worlds in which life might exist. Let’s call this the “N = 1 problem.” By comparison, we expect the principles of geometry, mechanics, and chemistry would generalize. Interestingly, each of these has predictable consequences when applied to biology. The surface-to-volume property of geometry, for example, limits the size of unassisted cells in a given medium. This effect is real, precise, universal, and predictive. Furthermore, there are basic problems all life must solve if it is to persist, such as resistance to radiation, faithful inheritance, and energy regulation. If these universal problems have a limited set of possible solutions, some common outcomes must consistently emerge. In this chapter, I discuss the N = 1 problem, its implications, and my response (Mariscal 2014). I hold that our current knowledge of biology can justify believing certain generalizations as holding for life anywhere. Life on Earth may be our only example of life, but this is only a reason to be cautious in our approach to life in the universe, not a reason to give up altogether. In my account, a candidate biological generalization is assessed by the assumptions it makes. A claim is accepted only if its justification includes principles of evolution, but no contingent facts of life on Earth.
When a technique purports to provide information that is not available to the unaided senses, it is natural to think that the only way to validate that technique is by appealing to a theory of the processes that lead from the object of study to the raw data. In fact, scientists have a variety of strategies for validating their techniques. Those strategies can yield multiple independent arguments that support the validity of the technique. Thus, it is possible to produce a robust body of data with a single technique. I illustrate and support these claims with a historical case study.
Simultaneous observation of the wave-like and particle-like aspects of the photon in the double-slit experiment is not allowed. The underlying reason behind this limitation is not understood. In this paper, we explain this unique behavior by considering the communicational properties of the photons. Photons have three independently adjustable properties (energy, direction, and spin) that can be used to communicate messages. The double-slit experiment setup fixes two of these properties and confines the single photon’s capacity for conveying messages to no more than one message. With such a low communication capacity, information theory dictates that measurements associated only with one proposition can obtain consistent results, and a second measurement associated with an independent proposition must necessarily lead to randomness. In the double-slit example, these are the wave or particle properties of the photon. The interpretation we offer is based on the formalism of information theory and does not make use of Heisenberg’s uncertainty relation in any form.
Physical systems can store information and their informational properties are governed by the laws of information. In particular, the amount of information that a physical system can convey is limited by the number of its degrees of freedom and their distinguishable states. Here we explore the properties of physical systems with exactly one degree of freedom. The central point in these systems is the tight limitation on their information capacity. Discussing the implications of this limitation we demonstrate that such systems exhibit a number of features, such as randomness, no-cloning, and non-commutativity, which are peculiarities attributed to quantum mechanics (QM). After demonstrating many astonishing parallels to quantum behavior, we postulate an interpretation of quantum physics as the physics of systems with a single degree of freedom. We then show how a number of other quantum conundrums can be understood by considering the informational properties of the systems and also resolve the EPR paradox. In the present work, we assume that the formalism of the QM is correct and well-supported by experimental verification and concentrate on the interpretational aspects of the theory.
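One way to make the counting behind that capacity limit explicit (an illustration using the standard Hartley measure, not a formula quoted from the paper): if a system has d independently adjustable degrees of freedom and the i-th admits N_i mutually distinguishable states, then it can convey at most

\[ C \;=\; \log_2 \prod_{i=1}^{d} N_i \;=\; \sum_{i=1}^{d} \log_2 N_i \ \text{bits}, \]

so a system with a single degree of freedom taking N distinguishable values carries at most \(\log_2 N\) bits – one message drawn from N alternatives and nothing more.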
Over the years, ethylene-diamine-tetra-acetate (EDTA) has been widely used for many purposes. However, there are inadequate phytoassessment studies conducted using EDTA in Vetiver grass. Hence, this study evaluates the phytoassessment (growth performance, accumulation trends, and proficiency of metal uptake) of Vetiver grass, Vetiveria zizanioides (Linn.) Nash in both single and mixed heavy metal (Cd, Pb, Cu, and Zn)—disodium EDTA-enhanced contaminated soil. The plant growth, metal accumulation, and overall efficiency of metal uptake by different plant parts (lower root, upper root, lower tiller, and upper tiller) were thoroughly examined. The relative growth performance, metal tolerance, and phytoassessment of heavy metal in roots and tillers of Vetiver grass were examined. Metals in plants were measured using flame atomic absorption spectrometry (F-AAS) after acid digestion. The root-tiller (R/T) ratio, biological concentration factor (BCF), biological accumulation coefficient (BAC), tolerance index (TI), translocation factor (TF), and metal uptake efficacy were used to estimate the potential of metal accumulation and translocation in Vetiver grass. Accumulation of all heavy metals was significantly higher (p < 0.05) in both lower and upper roots and tillers of Vetiver grass for Cd + Pb + Cu + Zn + EDTA treatments as compared with the control. The single Zn + EDTA treatment accumulated the highest overall total amount of Zn (8068 ± 407 mg/kg), while the highest accumulations of Cu (1977 ± 293 mg/kg) and Pb (1096 ± 75 mg/kg) were recorded in the mixed Cd + Pb + Cu + Zn + EDTA treatment. Generally, the overall heavy metal accumulation trends of Vetiver grass were in the order of Zn >>> Cu > Pb >> Cd for all treatments. Furthermore, both upper roots and tillers of Vetiver grass showed a strong tendency to accumulate appreciably greater amounts of all heavy metals, regardless of single and/or mixed metal treatments. Thus, Vetiver grass can be recommended as a potential phytoextractor for all types of heavy metals, whereby its tillers will act as the sink for heavy metal accumulation in the presence of EDTA for all treatments.
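For orientation, the indices named in this abstract are typically simple concentration and biomass ratios. The sketch below adopts definitions that are common in the phytoremediation literature and uses purely hypothetical numbers; the study itself may compute or normalize these indices differently.

```python
def phyto_indices(c_root, c_tiller, c_soil, biomass_treated, biomass_control):
    """Commonly used phytoassessment indices (literature-standard definitions,
    assumed here; concentrations in mg/kg dry weight, biomass in g dry weight)."""
    return {
        "BCF": c_root / c_soil,                     # biological concentration factor
        "BAC": c_tiller / c_soil,                   # biological accumulation coefficient
        "TF":  c_tiller / c_root,                   # translocation factor (root -> tiller)
        "R/T": c_root / c_tiller,                   # root-tiller ratio
        "TI":  biomass_treated / biomass_control,   # tolerance index
    }

# Hypothetical illustrative values only.
print(phyto_indices(c_root=1200.0, c_tiller=800.0, c_soil=400.0,
                    biomass_treated=9.5, biomass_control=11.0))
```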
We argue that human consciousness may be a property of a single electron in the brain. We suppose that each electron in the universe has at least primitive consciousness. Each electron subjectively “observes” its quantum dynamics (energy, momentum, “shape” of wave function) in the form of sensations and other mental phenomena. However, some electrons in neural cells have complex “human” consciousnesses due to complex quantum dynamics in a complex organic environment. We discuss neurophysiological and physical aspects of this hypothesis and show that: (1) a single chemically active electron has enough informational capacity to “contain” the richness of human subjective experience; (2) quantum states of some electrons might be directly influenced by human sensory data and have direct influence upon human behavior in a real brain; (3) the main physical and philosophical drawbacks of “conventional” “quantum theories of consciousness” may be solved by our hypothesis without much change in their conceptual basis. We do not suggest any “new physics”, and our neuroscientific assumptions are similar to those used by other proponents of “quantum consciousness”. However, our hypothesis suggests radical changes in our view of human and physical reality.
Closure for justification is the claim that thinkers are justified in believing the logical consequences of their justified beliefs, at least when those consequences are competently deduced. Many have found this principle to be very plausible. Even more attractive is the special case of Closure known as Single-Premise Closure. In this paper, I present a challenge to Single-Premise Closure. The challenge is based on the phenomenon of rational self-doubt – it can be rational to be less than fully confident in one's beliefs and patterns of reasoning. In rough outline, the argument is as follows: Consider a thinker who deduces a conclusion from a justified initial premise via an incredibly long sequence of small competent deductions. Surely, such a thinker should suspect that he has made a mistake somewhere. And surely, given this, he should not believe the conclusion of the deduction even though he has a justified belief in the initial premise.
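A back-of-the-envelope calculation (my numbers, not the author's) shows why such a long chain invites doubt: if each of n deduction steps carries an independent error probability \(\varepsilon\), then

\[ \Pr(\text{no error in any step}) \;=\; (1-\varepsilon)^{n} \;\approx\; e^{-n\varepsilon}, \]

so with \(\varepsilon = 10^{-3}\) and \(n = 10^{4}\) the whole deduction is error-free with probability about \(e^{-10} \approx 4.5\times 10^{-5}\) – the thinker should be nearly certain that a mistake has crept in somewhere, even though each individual step was almost certainly sound.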
The idea that knowledge can be extended by inference from what is known seems highly plausible. Yet, as shown by familiar preface-paradox and lottery-type cases, the possibility of aggregating uncertainty casts doubt on its tenability. We show that these considerations go much further than previously recognized and significantly restrict the kinds of closure ordinary theories of knowledge can endorse. Meeting the challenge of uncertainty aggregation requires either restricting knowledge-extending inferences to single premises or eliminating epistemic uncertainty in known premises. The first strategy, while effective, retains little of the original idea—conclusions even of modus ponens inferences from known premises are not always known. We then look at the second strategy, inspecting the most elaborate and promising attempt to secure the epistemic role of basic inferences, namely Timothy Williamson’s safety theory of knowledge. We argue that while it indeed has the merit of allowing basic inferences such as modus ponens to extend knowledge, Williamson’s theory faces formidable difficulties. These difficulties, moreover, arise from the very feature responsible for its virtue – the infallibilism of knowledge.
This paper defines the form of prior knowledge that is required for sound inferences by analogy and single-instance generalizations, in both logical and probabilistic reasoning. In the logical case, the first order determination rule defined in Davies (1985) is shown to solve both the justification and non-redundancy problems for analogical inference. The statistical analogue of determination that is put forward is termed 'uniformity'. Based on the semantics of determination and uniformity, a third notion of "relevance" is defined, both logically and probabilistically. The statistical relevance of one function in determining another is put forward as a way of defining the value of information: The statistical relevance of a function F to a function G is the absolute value of the change in one's information about the value of G afforded by specifying the value of F. This theory provides normative justifications for conclusions projected by analogy from one case to another, and for generalization from an instance to a rule. The soundness of such conclusions, in either the logical or the probabilistic case, can be identified with the extent to which the corresponding criteria (determination and uniformity) actually hold for the features being related.
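For readers who want the shape of the rule, one standard way the determination relation is written in this literature (a paraphrase for the functional case, not necessarily Davies's exact schema) is: a function F determines a function G over cases when

\[ F \succ G \;\equiv\; \forall x\,\forall y\,\bigl(F(x)=F(y)\rightarrow G(x)=G(y)\bigr), \]

so that, given \(F \succ G\) as prior knowledge together with a single observed case a, the analogical inference from \(F(b)=F(a)\) to \(G(b)=G(a)\) is sound for any new case b – the background premise justifies the projection without itself dictating the conclusion before the instance is inspected.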
We discuss the no-go theorem of Frauchiger and Renner based on an "extended Wigner's friend" thought experiment which is supposed to show that any single-world interpretation of quantum mechanics leads to inconsistent predictions if it is applicable on all scales. We show that no such inconsistency occurs if one considers a complete description of the physical situation. We then discuss implications of the thought experiment that have not been clearly addressed in the original paper, including a tension between relativity and nonlocal effects predicted by quantum mechanics. Our discussion applies in particular to Bohmian mechanics.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
Within the sedimentation diagram of infective RNA preparations isolated from Tobacco Mosaic Virus, undegraded molecules form a sharp peak with a molecular weight corresponding to the total RNA content of the virus particle. Degradation kinetics by ribonuclease is of the linear, single-target type, indicating that the RNA is single-stranded. The intact RNA of a virus particle thus forms one big single-stranded molecule. Quantitative evaluation of the effect of degradation by RNA-ase on the infectivity of the RNA shows that the integrity of the entire molecule is required for its biological activity.
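The step from "linear, single-target" kinetics to a single continuous strand can be made explicit with the standard target-theory relation (a textbook form, assumed here rather than quoted from the paper): if a single ribonuclease hit anywhere on the molecule abolishes infectivity, and hits are Poisson-distributed with mean m proportional to enzyme dose, the surviving infective fraction is the zero-hit probability

\[ S = e^{-m}, \qquad \ln S = -m \;\propto\; -(\text{RNA-ase dose}), \]

so the semi-logarithmic survival curve is a straight line with no shoulder, as expected if infectivity resides in one single-stranded molecule whose full integrity is required; a multi-target (e.g. multi-strand) structure would instead show an initial shoulder.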
The Necessity of Origins is the thesis that, necessarily, if a material object wholly originates from some particular material, then it could not have wholly originated from any significantly non-overlapping material. Several philosophers have argued for this thesis using as a premise a principle that we call ‘Single Origin Necessity’. However, we argue that Single Origin Necessity is false. So any arguments for The Necessity of Origins that rely on Single Origin Necessity are unsound. We also argue that the Necessity of Origins itself is false. Our arguments rely on a thesis in the ontology of art that we find plausible: Multi-Work Materialism. It is the thesis that works of art that have multiple concrete manifestations are co-located with those manifestations.