Instead of rehearsing the half-century-old foundational feud between set theory and category theory, this paper argues that they are theories about two different, complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have a higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
The purpose of this paper is to show that the dual notions of elements & distinctions are the basic analytical concepts needed to unpack and analyze morphisms, duality, and universal constructions in Sets, the category of sets and functions. The analysis extends directly to other concrete categories (groups, rings, vector spaces, etc.) where the objects are sets with a certain type of structure and the morphisms are functions that preserve that structure. Then the elements & distinctions-based definitions can be abstracted in a purely arrow-theoretic way for abstract category theory. In short, the language of elements & distinctions is the conceptual language in which the category of sets is written, and abstract category theory gives the abstract-arrows version of those definitions.
In finite probability theory, events are subsets S⊆U of the outcome set. Subsets can be represented by one-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or `measurements' of all these events can be determined exactly as in quantum mechanics (QM) using density matrices. Moreover, the transformation of the density matrices induced by the experiments or `measurements' is the Lüders mixture operation, as in QM. And finally, by moving the machinery into the n-dimensional vector space over ℤ₂, different basis sets become different outcome sets. That `non-commutative' extension of finite probability theory yields a pedagogical model of quantum mechanics over ℤ₂ that can model many characteristic non-classical results of QM.
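As an illustrative sketch (not code from the paper), the contrast between a classical event and its superposition counterpart, and the decohering effect of the Lüders mixture operation, can be set up in a few lines of NumPy; the outcome-set size and the event S below are arbitrary choices of mine:

```python
import numpy as np

n = 4                      # outcome set U = {0, 1, 2, 3}, equiprobable
S = [0, 1]                 # the event S = {0, 1}

# Classical event: diagonal density matrix with 1/|S| on the diagonal for S.
rho_classical = np.zeros((n, n))
for u in S:
    rho_classical[u, u] = 1 / len(S)

# Superposition event: rank-one |s><s| with amplitudes 1/sqrt(|S|) on S.
s = np.zeros(n)
s[S] = 1 / np.sqrt(len(S))
rho_super = np.outer(s, s)

def luders(rho):
    """Lüders mixture for a complete measurement in the outcome basis:
    rho' = sum_u P_u rho P_u, which zeroes the off-diagonal terms."""
    return np.diag(np.diag(rho))

# Measuring the superposition event yields exactly the classical event:
assert np.allclose(luders(rho_super), rho_classical)
# Both events give the same outcome probabilities Pr(u) = rho_uu:
print(np.diag(rho_super))   # [0.5 0.5 0.  0. ]
```

The off-diagonal entries of `rho_super` are what distinguish the superposition event; the measurement wipes them out, which is the sense in which the Lüders operation here mirrors decoherence in QM.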
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
In her recent book Private Government, Elizabeth Anderson makes a powerful but pragmatic case against the abuses experienced by employees in conventional corporations. The purpose of this review-essay is to contrast Anderson’s pragmatic critique of many abuses in the employment relation with a principled critique of the employment relationship itself. This principled critique is based on the theory of inalienable rights that descends from the Reformation doctrine of the inalienability of conscience down through the Enlightenment in the abolitionist, democratic, and feminist movements. That theory was the basis for the abolition of the voluntary slavery or self-sale contract, the voluntary non-democratic constitution (pactum subjectionis), and the voluntary coverture marriage contract in today’s democratic countries. When understood in modern terms, that same theory applies as well against the voluntary self-rental or employment contract that is the basis for our current economic system.
After Marx, dissenting economics almost always used 'the labour theory' as a theory of value. This paper develops a modern treatment of the alternative labour theory of property that is essentially the property-theoretic application of the juridical principle of responsibility: impute legal responsibility in accordance with who was in fact responsible. To understand descriptively how assets and liabilities are appropriated in normal production, a 'fundamental myth' needs to be cleared away, and then the market mechanism of appropriation can be understood. On the normative side, neoclassical theory represents marginal productivity theory as showing that the imputation principle is satisfied in competitive enterprises. Since that shows the moral commitment of neoclassical economics to the imputation principle, the labour theory of property is presented here as the actual non-metaphorical application of the imputation principle to property appropriation. The property-theoretic analysis at the firm level shows how the neoclassical analysis in terms of 'distributive shares' wholly misframed the basic questions. Finally, the paper shows how the imputation principle is systematically violated in the present wage labour system of renting persons. The paper can be seen as taking up the recent challenge posed by Donald Katzner for a dialogue between neoclassical and heterodox microeconomics.
Classical liberalism is skeptical about governmental organizations "doing good" for people. Instead governments should create the conditions so that people individually (Adam Smith) and in associations (Tocqueville) are empowered to do good for themselves. The market implications of classical liberalism are well-known, but the implications for organizations are controversial. We will take James Buchanan as our guide (with assists from Mill and Dewey). Unpacking the implications of classical liberalism for the "science of associations" (Tocqueville) requires a tour through the intellectual history of the voluntary slavery contract and the voluntary non-democratic constitution. The argument concludes that the classical liberal endorsement of sovereign individuals acting in the marketplace generalizes to the joint action of individuals as the principals in their own organizations and associations.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets, so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., u and u′ being in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions it makes, normalized by the total number of ordered pairs |U|² from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions."
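The distinction-counting definition is concrete enough to compute directly. The following is an illustrative Python rendering of the abstract's definitions (the example universe and partition are my own choices, not from the paper):

```python
from itertools import product

def dit(partition, U):
    """The dit set of a partition: ordered pairs (u, u') of elements
    of U lying in different blocks of the partition."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    return {(u, v) for u, v in product(U, U) if block_of[u] != block_of[v]}

def logical_entropy(partition, U):
    """h(pi) = |dit(pi)| / |U|^2: the probability that two independent
    equiprobable draws from U are distinguished by the partition."""
    return len(dit(partition, U)) / len(U) ** 2

U = [1, 2, 3, 4]
pi = [{1, 2}, {3, 4}]           # two blocks of size 2
print(logical_entropy(pi, U))   # 8 distinctions / 16 ordered pairs = 0.5
```

The blob partition {U} makes no distinctions (entropy 0), while the discrete partition of singletons distinguishes every pair of distinct elements (entropy 1 - 1/|U|), matching the formula h(π) = 1 - Σ (|B|/|U|)² over the blocks B.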
Nancy MacLean’s book, Democracy in Chains, raised questions about James M. Buchanan’s commitment to democracy. This paper investigates the relationship of classical liberalism in general, and of Buchanan in particular, to democratic theory. Contrary to the simplistic classical liberal juxtaposition of “coercion vs. consent,” there have been from Antiquity onwards voluntary contractarian defenses of non-democratic government and even slavery—all little noticed by classical liberal scholars who prefer to think of democracy as just “government by the consent of the governed” and slavery as being inherently coercive. Historically, democratic theory had to go beyond that simplistic notion of democracy to develop a critique of consent-based non-democratic government, e.g., the Hobbesian pactum subjectionis. That critique was based firstly on the distinction between contracts or constitutions of alienation (translatio) versus delegation (concessio). Then the contracts of alienation were ruled out based on the theory of inalienable rights that descends from the Reformation doctrine of inalienability of conscience down through the Enlightenment to modern times in the abolitionist and democratic movements. While he developed no theory of inalienability, the mature Buchanan explicitly allowed only a constitution of delegation, contrary to many modern classical liberals or libertarians who consider the choice between consent-based democratic or non-democratic governments (e.g., private cities or shareholder states) to be a pragmatic one. But Buchanan seems not even to have realized that his at-most-delegation dictum would also rule out the employer-employee or human rental contract, which is a contract of alienation “within the scope of the employment.”
In this chapter I seek to provide a theoretical defense of workplace democracy that is independent from and outside the lineage of Marxist and communist theory. Common to the council movements, anarcho-syndicalism and many other forms of libertarian socialism was the idea “that workers’ self-management was central.” Yet the idea of workers’ control has not been subject to the same theoretical development as Marx’s theory, not to mention capitalist economic theory. This chapter aims to contribute at a theoretical level by providing a justification and defense of self-managed workplaces that is independent of the particular historical tradition of the council movements. There is a clear and definitive case for workplace democracy based on first principles that descends to modern times through the Reformation and Enlightenment in the abolitionist, democratic and feminist movements. By the twentieth century, the arguments had been scattered and lost – like the bones of some ancient beast scattered in a desert – partly due to misconceptions, mental blocks and misinterpretations embodied in Marxism, liberalism and economic theory. When one has worked through some of these intellectual roadblocks, then one may be better able to reassemble the case for workplace democracy from well-known first principles developed in the abolitionist, democratic and feminist movements.
This paper shows how the universals of category theory in mathematics provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought. The paper also shows how the always-self-predicative universals of category theory provide the "opposite bookend" to the never-self-predicative universals of iterative set theory and thus that the paradoxes arose from having one theory (e.g., Frege's Paradise) where universals could be either self-predicative or non-self-predicative (instead of being always one or always the other).
There is some consensus among orthodox category theorists that the concept of adjoint functors is the most important concept contributed to mathematics by category theory. We give a heterodox treatment of adjoints using heteromorphisms that parses an adjunction into two separate parts. Then these separate parts can be recombined in a new way to define a cognate concept, the brain functor, to abstractly model the functions of perception and action of a brain. The treatment uses relatively simple category theory and is focused on the interpretation and application of the mathematical concepts.
Liberal-contractarian philosophies of justice see the unjust systems of slavery and autocracy in the past as being based on coercion—whereas the social order in modern democratic market societies is based on consent and contract. However, the ‘best’ cases for slavery and autocracy in the past were consent-based contractarian arguments. Hence, our first task is to recover those ‘forgotten’ apologia for slavery and autocracy. To counter those consent-based arguments, the historical anti-slavery and democratic movements developed a theory of inalienable rights. Our second task is to recover that theory and to consider several other applications of the theory. Finally, the liberal theories of justice expounded by John Rawls and by Robert Nozick are briefly examined from this perspective.
This article is a review of Erik Olin Wright’s 2010 book Envisioning Real Utopias. The review focuses on certain topics such as his understanding of ‘capitalism,’ his conception of worker cooperatives, and the general issues surrounding markets, the Left, and Marxism.
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
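Because the dit set of a join of partitions is the union of their dit sets, the compound entropies reduce to the counting measure applied to unions, differences, and intersections of dit sets. A small illustrative sketch (the example partitions are my own choices, not from the paper):

```python
from itertools import product

def dit(partition, U):
    """Ordered pairs (u, u') with u and u' in distinct blocks."""
    block_of = {u: i for i, b in enumerate(partition) for u in b}
    return {(u, v) for u, v in product(U, U) if block_of[u] != block_of[v]}

U = [1, 2, 3, 4]
pi = [{1, 2}, {3, 4}]
sigma = [{1, 3}, {2, 4}]
n2 = len(U) ** 2

h_pi = len(dit(pi, U)) / n2
h_sigma = len(dit(sigma, U)) / n2
# dit(pi v sigma) = dit(pi) | dit(sigma), so the compound entropies are
# just the normalized sizes of set operations on dit sets:
h_joint = len(dit(pi, U) | dit(sigma, U)) / n2    # joint entropy h(pi, sigma)
h_cond = len(dit(pi, U) - dit(sigma, U)) / n2     # conditional h(pi | sigma)
m_mutual = len(dit(pi, U) & dit(sigma, U)) / n2   # mutual information

# Inclusion-exclusion, exactly as for probabilities:
assert h_joint == h_pi + h_sigma - m_mutual
print(h_pi, h_sigma, h_joint, h_cond, m_mutual)   # 0.5 0.5 0.75 0.25 0.25
```

All four compound quantities come from one measure on U×U, which is the sense in which logical entropy behaves "just like" probability under inclusion-exclusion.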
The lattice operations of join and meet were defined for set partitions in the nineteenth century, but no new logical operations on partitions were defined and studied during the twentieth century. Yet there is a simple and natural graph-theoretic method presented here to define any n-ary Boolean operation on partitions. An equivalent closure-theoretic method is also defined. In closing, the paper addresses the question of why it took so long for all Boolean operations to be defined for partitions.
A theory of property needs to give an account of the whole life-cycle of a property right: how it is initiated, transferred, and terminated. Economics has focused on the transfers in the market and has almost completely neglected the question of the initiation and termination of property in normal production and consumption (not in some original state or in the transition from common to private property). The institutional mechanism for the normal initiation and termination of property is an invisible-hand function of the market, the market mechanism of appropriation. Does this mechanism satisfy an appropriate normative principle? The standard normative juridical principle is to assign or impute legal responsibility according to de facto responsibility. It is given a historical tag of being "Lockean" but the basis is contemporary jurisprudence, not historical exegesis. Then the fundamental theorem of the property mechanism is proven, which shows that if "Hume's conditions" (no transfers without consent and all contracts fulfilled) are satisfied, then the market automatically satisfies the Lockean responsibility principle, i.e., "Hume implies Locke." As a major application, the results in their contrapositive form, "Not Locke implies Not Hume," are applied to a market economy based on the employment contract. It is shown that production based on the employment contract violates the Lockean principle (all who work in an employment enterprise are de facto responsible for the positive and negative results) and thus that Hume's conditions must also be violated in the marketplace (de facto responsible human action cannot be transferred from one person to another—as is readily recognized when an employer and an employee together commit a crime).
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or toy model of quantum mechanics over sets (QM/sets). There have been several previous attempts to develop a quantum-like model with the base field of ℂ replaced by ℤ₂. Since there are no inner products on vector spaces over finite fields, the problem is to define the Dirac brackets and the probability calculus. The previous attempts all required the brackets to take values in ℤ₂. But the usual QM brackets <ψ|ϕ> give the "overlap" between states ψ and ϕ, so for subsets S,T⊆U, the natural definition is <S|T>=|S∩T| (taking values in the natural numbers). This allows QM/sets to be developed with a full probability calculus that turns out to be a non-commutative extension of classical Laplace-Boole finite probability theory. The pedagogical model is illustrated by giving simple treatments of the indeterminacy principle, the double-slit experiment, Bell's Theorem, and identical particles in QM/sets. A more technical appendix explains the mathematics behind carrying some vector space structures between QM over ℂ and QM/sets over ℤ₂.
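The bracket definition <S|T>=|S∩T| and the resulting probability calculus are simple enough to sketch directly. The following is an illustrative rendering, not the paper's own code; the example state S is my own choice:

```python
def bracket(S, T):
    """<S|T> = |S ∩ T|: the natural-number 'overlap' of two subset-states."""
    return len(set(S) & set(T))

def prob(T, S):
    """Pr(T|S) = |T ∩ S| / |S|: probability of the outcome-event T when
    the state S is measured in the U-basis (Born-rule analogue in QM/sets)."""
    return bracket(S, T) / bracket(S, S)

S = {1, 2, 3}
print(prob({1}, S))       # each single outcome in S is equiprobable: 1/3
print(prob({2, 3}, S))    # 2/3
print(prob({4}, S))       # 0: outcomes outside S never occur
```

With the U-basis as outcome set this reproduces Laplace-Boole conditional probability; the non-commutative extension appears when a different basis of the ℤ₂ vector space supplies a different outcome set.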
There is a fallacy that is often involved in the interpretation of quantum experiments involving a certain type of separation, such as the double-slit experiments, which-way interferometer experiments, polarization analyzer experiments, Stern-Gerlach experiments, and quantum eraser experiments. The fallacy leads not only to flawed textbook accounts of these experiments but also to flawed inferences about retrocausality in the context of delayed-choice versions of separation experiments.
There is a fault line running through classical liberalism as to whether or not democratic self-governance is a necessary part of a liberal social order. The democratic and non-democratic strains of classical liberalism are both present today—particularly in America. Many contemporary libertarians and neo-Austrian economists represent the non-democratic strain in their promotion of non-democratic sovereign city-states (startup cities or charter cities). We will take the late James M. Buchanan as a representative of the democratic strain of classical liberalism. Since the fundamental norm of classical liberalism is consent, we must start with the intellectual history of the voluntary slavery contract, the coverture marriage contract, and the voluntary non-democratic constitution (or pactum subjectionis). Next we recover the theory of inalienable rights that descends from the Reformation doctrine of the inalienability of conscience through the Enlightenment (e.g., Spinoza and Hutcheson) in the abolitionist and democratic movements. Consent-based governments divide into those based on the subjects' alienation of power to a sovereign and those based on the citizens' delegation of power to representatives. Inalienable rights theory rules out that alienation in favor of delegation, so the citizens remain the ultimate principals and the form of government is democratic. Thus the argument concludes in agreement with Buchanan that the classical liberal endorsement of sovereign individuals acting in the marketplace generalizes to the joint action of individuals as the principals in their own organizations.
Since its formal definition over sixty years ago, category theory has been increasingly recognized as having a foundational role in mathematics. It provides the conceptual lens to isolate and characterize the structures with importance and universality in mathematics. The notion of an adjunction (a pair of adjoint functors) has moved to center-stage as the principal lens. The central feature of an adjunction is what might be called “determination through universals” based on universal mapping properties. A recently developed “heteromorphic” theory about adjoints suggests a conceptual structure, albeit abstract and atemporal, for how new relatively autonomous behavior can emerge within a system obeying certain laws. The focus here is on applications in the life sciences (e.g., selectionist mechanisms) and human sciences (e.g., the generative grammar view of language).
In the 1990s, a debate raged across the whole postsocialist world as well as in Western development agencies such as the World Bank about the best approach to the transition from various forms of socialism or communism to a market economy and political democracy. One of the most hotly contested topics was the question of the workplace being organized based on workplace democracy (e.g., various forms of worker ownership) or based on the conventional employer-employee relationship. Well before 1989, many of the socialist countries had started experimenting with various forms of "self-management" operating in more of a market setting, Yugoslavia being the most developed example. Thus one "path to the market" would ...
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2.
Liberal thought is based on the juxtaposition of consent to coercion. Autocracy and slavery were seen as based on coercion whereas today's political democracy and economic 'employment system' are based on consent to voluntary contracts. This paper retrieves an almost forgotten dark side of contractarian thought that based autocracy and slavery on explicit or implicit voluntary contracts. To answer these 'best case' arguments for slavery and autocracy, the democratic and abolitionist movements forged arguments not simply in favour of consent, but arguments that voluntary contracts to legally alienate aspects of personhood were invalid 'even with consent' – which made the underlying rights inherently inalienable. Once understood, those arguments have the perhaps 'unintended consequence' of making the neo-abolitionist case for ruling out today's self-rental contract, the employer-employee contract. The paper also has to retrieve these inalienable rights arguments since they have been largely lost on the Left, not to mention in liberal thought.
This paper shows that implicit assumptions about the numeraire good in the Kaldor-Hicks efficiency-equity analysis involve a "same-yardstick" fallacy (a fallacy pointed out by Paul Samuelson in another context). These results have negative implications for cost-benefit analysis, the wealth-maximization approach to law and economics, and other parts of applied welfare economics--as well as for the whole vision of economics based on the "production and distribution of social wealth."
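The same-yardstick reversal can be illustrated with hypothetical numbers (the valuations and payment below are my own illustrative assumptions, not figures from the paper): a change that passes the Kaldor-Hicks test with money as the numeraire can fail it when the other good is taken as the yardstick.

```python
# A hypothetical policy takes one unit of a good G from person A and
# gives person B a money payment.
val_A = 10.0            # A's marginal valuation of G: 10 money units per unit
val_B = 15.0            # B's marginal valuation of G: 15 money units per unit
loss_A_in_G = 1.0       # A loses one unit of G
gain_B_in_money = 12.0  # B gains 12 money units

# Yardstick 1: money as numeraire.
net_in_money = gain_B_in_money - loss_A_in_G * val_A
print(net_in_money)     # +2.0 -> the change "passes" Kaldor-Hicks

# Yardstick 2: the good G as numeraire.
net_in_G = gain_B_in_money / val_B - loss_A_in_G
print(net_in_G)         # -0.2 -> the same change "fails"
```

Summing unweighted gains and losses implicitly treats the numeraire's marginal utility as the same for every person, so the efficiency verdict can flip with the choice of yardstick; that sign-reversal is the "same-yardstick" problem in miniature.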
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
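Using the density-matrix form of logical entropy, h(ρ) = 1 - tr(ρ²), the quantitative connection stated in the fundamental theorem can be checked numerically in a two-dimensional example (the state and measurement basis below are my own illustrative choices):

```python
import numpy as np

def logical_entropy(rho):
    """h(rho) = 1 - tr(rho^2): the 'two-draw' probability that two
    independent draws from the state are distinguished."""
    return 1 - np.trace(rho @ rho).real

# A pure state (rank-one density matrix) on a 2-dimensional space.
psi = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(logical_entropy(rho))        # ~0: a pure state makes no distinctions

# Projective measurement in the standard basis (Lüders mixture):
rho_post = np.diag(np.diag(rho))
print(logical_entropy(rho_post))   # 0.5

# The entropy increase equals the sum of squared absolute values of the
# off-diagonal terms zeroed by the measurement:
off_diag = rho - np.diag(np.diag(rho))
increase = np.sum(np.abs(off_diag) ** 2)
assert np.isclose(logical_entropy(rho_post) - logical_entropy(rho), increase)
```

Here the measurement distinguishes the two basis eigenstates, and the two zeroed off-diagonal entries (each of squared modulus 0.25) account exactly for the 0.5 increase in logical entropy.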
Recent developments in pure mathematics and in mathematical logic have uncovered a fundamental duality between "existence" and "information." In logic, the duality is between the Boolean logic of subsets and the logic of quotient sets, equivalence relations, or partitions. The analogue to an element of a subset is the notion of a distinction of a partition, and that leads to a whole stream of dualities or analogies--including the development of new logical foundations for information theory parallel to Boole's development of logical finite probability theory. After outlining these dual concepts in mathematical terms, we turn to a more metaphysical speculation about two dual notions of reality: a fully definite notion using Boolean logic, appropriate for classical physics, and an objectively indefinite notion using partition logic, which turns out to be appropriate for quantum mechanics. The existence-information duality is used to intuitively illustrate these two dual notions of reality. The elucidation of the objectively indefinite notion of reality leads to the "killer application" of the existence-information duality, namely the interpretation of quantum mechanics.
Evolutionary economics often focuses on the comparison between economic competition and the process of natural selection to select the fitter members of a given population. But that neglects the other "half" of an evolutionary process, the mechanism for the generation of new possibilities that is key to dynamic efficiency. My topic is the process of parallel experimentation which I take to be a process of multiple experiments running concurrently with some form of common goal, with some semi-isolation between the experiments, with benchmarking comparisons made between the experiments, and with the "migration" of discoveries between experiments wherever possible to ratchet up the performance of the group. The thesis is that parallel experimentation is a fundamental dynamic efficiency scheme to enhance and accelerate variation, innovation, and learning in contexts of genuine uncertainty or known ignorance. Within evolutionary biology, this type of parallel experimentation scheme was developed in Sewall Wright's shifting balance theory of evolution. It addressed the rather neglected topic of how a population on a low fitness peak might eventually be able to go "downhill" against selective pressures, traverse a valley of low fitness, and then ascend a higher fitness peak. The theme of parallel experimentation is used to recast and pull together dynamic and pluralistic theories in economics, political theory, philosophy of science, and social learning.
Saunders Mac Lane famously remarked that "Bourbaki just missed" formulating adjoints in a 1948 appendix (written no doubt by Pierre Samuel) to an early draft of Algebre--which then had to wait until Daniel Kan's 1958 paper on adjoint functors. But Mac Lane was using the orthodox treatment of adjoints that only contemplates the object-to-object morphisms within a category, i.e., homomorphisms. When Samuel's treatment is reconsidered in view of the treatment of adjoints using heteromorphisms or hets (object-to-object morphisms between objects in different categories), then he, in effect, isolated the concept of a left representation solving a universal mapping problem. When dualized to obtain the concept of a right representation, the two halves only need to be united to obtain an adjunction. Thus Samuel was only a now-simple dualization away from formulating adjoints in 1948. Apparently, Bodo Pareigis' 1970 text was the first and perhaps only text to give the heterodox "new characterization" (i.e., heteromorphic treatment) of adjoints. Orthodox category theory uses various relatively artificial devices to avoid formally recognizing hets--even though hets are routinely used by the working mathematician. Finally we consider a "philosophical" question as to whether the most important concept in category theory is the notion of an adjunction or the notion of a representation giving a universal mapping property (where adjunctions arise as the special case of a bi-representation of dual universal mapping problems).
Today it would be considered "bad Platonic metaphysics" to think that among all the concrete instances of a property there could be a universal instance so that all instances had the property by virtue of participating in that concrete universal. Yet there is a mathematical theory, category theory, dating from the mid-20th century that shows how to precisely model concrete universals within the "Platonic Heaven" of mathematics. This paper, written for the philosophical logician, develops this category-theoretic treatment of concrete universals along with a new concept to abstractly model the functions of a brain.
Double-entry bookkeeping (DEB) implicitly uses a specific mathematical construction, the group of differences using pairs of unsigned numbers ("T-accounts"). That construction was only formulated abstractly in mathematics in the 19th century—even though DEB had been used in the business world for over five centuries. Yet the connection between DEB and the group of differences (here called the "Pacioli group") is still largely unknown both in mathematics and accounting. The precise mathematical treatment of DEB allows clarity on certain conceptual questions and it immediately yields the generalization of the double-entry method to multi-dimensional vectors typically representing the different types of property involved in an enterprise or household.
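The group-of-differences construction behind DEB can be sketched in a few lines of code. This is a minimal illustration of my own, not code from the paper: a T-account is modeled as a (debits, credits) pair of unsigned numbers, pairs are added componentwise, and two T-accounts are identified exactly when their cross-sums agree—the standard group-of-differences equivalence.

```python
class TAccount:
    """A T-account as an ordered pair of unsigned numbers (debits, credits)."""

    def __init__(self, debits, credits):
        assert debits >= 0 and credits >= 0
        self.debits, self.credits = debits, credits

    def __add__(self, other):
        # Group operation: componentwise addition of pairs.
        return TAccount(self.debits + other.debits, self.credits + other.credits)

    def __eq__(self, other):
        # Group-of-differences identification:
        # (a, b) == (c, d) iff a + d == b + c.
        return self.debits + other.credits == self.credits + other.debits

    def balance(self):
        # Reduced form: net the smaller side against the larger.
        m = min(self.debits, self.credits)
        return TAccount(self.debits - m, self.credits - m)


# A debit of 100 followed by a credit of 30 nets to a debit balance of 70.
assert TAccount(100, 0) + TAccount(0, 30) == TAccount(70, 0)
# Any equal-sided account is a representative of the zero element.
assert TAccount(5, 5) == TAccount(0, 0)
```

This additive structure on unsigned pairs is what lets bookkeepers encode signed quantities (and, per the paper, multi-dimensional vectors of property types) without ever writing a negative number.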
John Tomasi's new book, Free Market Fairness, has been well-received as "one of the very best philosophical treatments of libertarian thought, ever" and as a "long and friendly conversation between Friedrich Hayek and John Rawls—a conversation which, astonishingly, reaches agreement". The book does present an authoritative state-of-the-debate across the spectrum from right-libertarianism on the one side to high liberalism on the other side. My point is not to question where Tomasi comes down with his own version of "market democracy" as a remix of Hayek and Rawls. My point is to use his sympathetic restatements of views across the liberal spectrum to show the basic misframings and common misunderstandings that cut across the liberal-libertarian viewpoints surveyed in the book. As usual, the heart of the debate is not in the answers to carefully framed questions, but in the framing itself.
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets, so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
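The classical notion of logical entropy described above admits a one-line computation. As a hedged sketch of my own (assuming equiprobable outcomes, so each block's probability is its size over the total): h(π) = 1 − Σᵢ pᵢ², which equals the probability that two independent draws fall in different blocks, i.e., form a distinction.

```python
from fractions import Fraction


def logical_entropy(partition):
    """Logical entropy h(pi) = 1 - sum of squared block probabilities,
    i.e., the normalized count of distinctions: the probability that
    two independent equiprobable draws land in different blocks."""
    n = sum(len(block) for block in partition)
    return 1 - sum(Fraction(len(block), n) ** 2 for block in partition)


# Partition of {1,2,3,4} into two blocks of two: 8 of the 16 ordered
# pairs are distinctions, so h = 1/2.
assert logical_entropy([{1, 2}, {3, 4}]) == Fraction(1, 2)
# The discrete partition on n elements maximizes h at 1 - 1/n.
assert logical_entropy([{1}, {2}, {3}]) == Fraction(2, 3)
# The indiscrete partition makes no distinctions: h = 0.
assert logical_entropy([{1, 2, 3}]) == 0
```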
Following the development of the selectionist theory of the immune system, there was an attempt to characterize many biological mechanisms as being "selectionist" as juxtaposed to "instructionist." But this broad definition would group Darwinian evolution, the immune system, embryonic development, and Chomsky's language-acquisition mechanism as all being "selectionist." Yet Chomsky's mechanism (like embryonic development) is significantly different from the selectionist mechanisms of biological evolution or the immune system. Surprisingly, there is a very abstract way using two dual mathematical logics to make the distinction between genuinely selectionist mechanisms and what are better called "generative" mechanisms. This note outlines that distinction.
Category theory has foundational importance because it provides conceptual lenses to characterize what is important and universal in mathematics—with adjunctions being the primary lens. If adjunctions are so important in mathematics, then perhaps they will isolate concepts of some importance in the empirical sciences. But the applications of adjunctions have been hampered by an overly restrictive formulation that avoids heteromorphisms or hets. By reformulating an adjunction using hets, it is split into two parts, a left and a right semiadjunction. Semiadjunctions (essentially a formulation of universal mapping properties using hets) can then be combined in a new way to define the notion of a brain functor that provides an abstract model of the intentionality of perception and action (as opposed to the passive reception of sense-data or the reflex generation of behavior).
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or "toy" model of quantum mechanics over sets (QM/sets). There are two parts. The notion of an "event" is reinterpreted from being an epistemological state of indefiniteness to being an objective state of indefiniteness. And the mathematical framework of finite probability theory is recast as the quantum probability calculus for QM/sets. The point is not to clarify finite probability theory but to elucidate quantum mechanics itself by seeing some of its quantum features in a classical setting.
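To give the recasting some concreteness, here is a minimal sketch of my own (an assumption-laden illustration, not the paper's formal development): with equiprobable outcomes, conditioning an "event" S ⊆ U on an outcome-event T is the set-level analogue of the Born rule, Pr(T | S) = |T ∩ S| / |S|.

```python
def prob(outcome_event, given_event):
    """Set-level analogue of the Born rule under equiprobable outcomes:
    Pr(T | S) = |T intersect S| / |S|."""
    overlap = outcome_event & given_event
    return len(overlap) / len(given_event)


S = {1, 2, 3}  # an "event" as a subset of the outcome set U
assert prob({1}, S) == 1 / 3       # each outcome in S is equally likely
assert prob({1, 2}, S) == 2 / 3
assert prob({4}, S) == 0.0         # outcomes outside S never occur
```

The quantum analogy is that S plays the role of a superposition state and conditioning on a finer event plays the role of a measurement that reduces the state.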
Classical physics and quantum physics suggest two meta-physical types of reality: the classical notion of an objectively definite reality with properties "all the way down," and the quantum notion of an objectively indefinite type of reality. The problem of interpreting quantum mechanics (QM) is essentially the problem of making sense out of an objectively indefinite reality. These two types of reality can be respectively associated with the two mathematical concepts of subsets and quotient sets (or partitions) which are category-theoretically dual to one another and which are developed in two mathematical logics, the usual Boolean logic of subsets and the more recent logic of partitions. Our sense-making strategy is "follow the math" by showing how the logic and mathematics of set partitions can be transported in a natural way to Hilbert spaces where it yields the mathematical machinery of QM--which shows that the mathematical framework of QM is a type of logical system over ℂ. And then we show how the machinery of QM can be transported the other way down to the set-like vector spaces over ℤ₂ showing how the classical logical finite probability calculus (in a "non-commutative" version) is a type of "quantum mechanics" over ℤ₂, i.e., over sets. In this way, we try to make sense out of objective indefiniteness and thus to interpret quantum mechanics.
Much of the recent discussion in progressive circles [e.g., Stiglitz; Galbraith; Piketty] has focused on the obscene mal-distribution of wealth and income as if that were "the" problem in our economic system. And the proposed redistributive reforms have all stuck to that framing of the question. To put the question in historical perspective, one might note that there was a similar, if not more extreme, mal-distribution of wealth, income, and political power in the Antebellum system of slavery. Yet, it should be obvious to modern eyes that redistributions in favor of the slaves, while leaving the institution of owning workers intact, would not address the root of the problem. The system of slavery was eventually abolished in favor of the system we have today which differs in two important respects: the workers are only rented, hired, or employed; and the rental relationship between employer and employee is voluntary. Today, the root of the problem is the whole institution for the voluntary renting of human beings, the employment system itself, not the terms of the contract or the accumulated consequences in the form of the mal-distribution of income and wealth.
When this book was first published in 1990, there were massive economic changes in the East and significant economic challenges to the West. This critical analysis of democratic theory discusses the principles and forces that push both socialist and capitalist economies toward a common ground of workplace democratization. The book presents a comprehensive approach to the theory and practice of the "democratic firm"—from philosophical first principles to legal theory and finally to some of the details of financial structure. The argument for economic democracy supports private property, free markets, and entrepreneurship, for instance, but fundamentally it replaces the employer/employee relationship with democratic membership in the firm. For students, teachers, policy makers and others interested in the application of democracy to the workplace, this book will serve as a manifesto and a standard reference on the topic.
Among our conscious states are conscious thoughts. The question at the center of the recent growing literature on cognitive phenomenology is this: In consciously thinking P, is there thereby any phenomenology—is there something it’s like? One way of clarifying the question is to say that it concerns whether there is any proprietary phenomenology associated with conscious thought. Is there any phenomenology due to thinking, as opposed to phenomenology that is due to some co-occurring sensation or mental image? In this paper we will present two arguments that a “yes” answer to this question of cognitive phenomenology can be obtained via appeal to the HOT theory of consciousness, especially the version articulated and defended by David Rosenthal.
In his book Intuitionism, David Kaspar is after the truth. That is to say, on his view, “philosophy is the search for the whole truth” (p. 7). Intuitionism, then, “reflects that standpoint” (p. 7). My comments are meant to reflect the same standpoint. More explicitly, my aim in these comments is to evaluate the arguments for intuitionism, as I understand them from reading Kaspar’s book. In what follows, I focus on three arguments in particular, which can be found in Chapters 1, 2, and 3 of Intuitionism: an inference to the best explanation, an argument from the analogy between mathematical knowledge and moral knowledge, and an argument from the epistemic preferability of the intuitive principles. I will discuss them in this order.
David Skrbina opens this timely and intriguing text with a suitably puzzling line from the Diamond Sutra: "Mind that abides nowhere must come forth," and he urges us to "de-emphasise the quest for the specifically human embodiment of mind" and follow Empedocles, progressing "with good will and unclouded attention" into the text which he has drawn together as editor. If we do, we are assured that it will "yield great things" (p. xi). This, I am pleased to say, is not an exercise in hyperbole.
A number of philosophers endorse, without argument, the view that there’s something it’s like consciously to think that p, which is distinct from what it’s like consciously to think that q. This thesis, if true, would have important consequences for philosophy of mind and cognitive science. In this paper I offer two arguments for it. The first argument claims it would be impossible introspectively to distinguish conscious thoughts with respect to their content if there weren’t something it’s like to think them. This argument is defended against several objections. The second argument uses what I call “minimal pair” experiences—sentences read without and with understanding—to induce in the reader an experience of the kind I claim exists. Further objections are considered and rebutted.
Recently several papers have reported relevance effects on the cognitive assessments of indicative conditionals, which pose an explanatory challenge to the Suppositional Theory of conditionals advanced by David Over, which is influential in the psychology of reasoning. Some of these results concern the “Equation” (P(if A, then C) = P(C|A)), others the de Finetti truth table, and yet others the uncertain and-to-if inference task. The purpose of this chapter is to take a bird’s-eye view on the debate and investigate some of the open theoretical issues posed by the empirical results. Central among these is whether to count these effects as belonging to pragmatics or semantics.
Although this volume is a bit dated, there are few current popular books dealing specifically with the psychology of murder, and it is a quick overview available for a few dollars, so still worth the trouble. It makes no attempt to be comprehensive and is somewhat superficial in places, the reader being expected to fill in the gaps from his many other books and the vast literature on violence. For an update see, e.g., Buss, The Handbook of Evolutionary Psychology 2nd ed. V1 (2016) pp. 265, 266, 270–282, 388–389, 545–546, 547, 566; Buss, Evolutionary Psychology 5th ed. (2015) pp. 26, 96–97, 223, 293–4, 300, 309–312, 410; and Shackelford and Hansen, The Evolution of Violence (2014). Buss has been among the top evolutionary psychologists for several decades and covers a broad range of behavior in his works, but here he concentrates almost entirely on the psychological mechanisms that lead individuals to murder, and on their possible evolutionary function in the EEA (Environment of Evolutionary Adaptation, i.e., the plains of Africa over the last million years or so).
Buss begins by noting that, as with other behaviors, "alternative" explanations such as psychopathology, jealousy, social environment, group pressure, drugs and alcohol, etc., do not really explain, since the question still remains why these produce homicidal impulses; that is, they are the proximate causes and not the ultimate evolutionary (genetic) causes. As always, it inevitably comes down to inclusive fitness (the genetic fitness of relatives) and thus to the struggle for access to mates and resources, which is the ultimate explanation for all behavior in all organisms. Sociological data (and common sense) make it clear that younger, poorer males are the most likely to kill.
He presents his own and others' murder data from industrial nations and tribal cultures, conspecific killing in animals, archaeology, FBI data, and his own research on the homicidal fantasies of normal people. Much archaeological evidence of murders in prehistoric times, including killings of whole groups or of groups minus young women, continues to accumulate.
After examining Buss's comments, I present a very brief summary of intentional psychology (the logical structure of rationality), which is treated at length in my many other articles and books.
Those with plenty of time who want a detailed history of homicidal violence from an evolutionary perspective may consult Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined (2012) and my review of it, easily found on the net and in two of my recent books. Briefly, Pinker notes that murder has declined steadily and dramatically, by a factor of about 30, since our days as foragers. Although guns now make it extremely easy for anyone to kill, killing is much rarer. Pinker thinks this is due to various social mechanisms that bring out our "better angels," but I think it is mainly due to the temporary abundance of resources from the merciless rape of our planet, coupled with an increased police presence and with communication, surveillance, and legal systems that make punishment far more likely. This becomes clear every time there is even a brief and local absence of the police.
Those who want a comprehensive, up-to-date framework for human behavior from the modern two-systems view may consult my book The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019).
Those interested in more of my writings may see Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019 3rd ed (2019) and Suicidal Utopian Delusions in the 21st Century 5th ed (2019), among others.