This paper presents a least-squares Monte Carlo approach for accurately calculating credit value adjustment (CVA). In contrast to previous studies, the model relies on the probability distribution of a default time/jump rather than the default time itself, as the default time is usually inaccessible. As such, the model can achieve a high order of accuracy with a relatively easy implementation. We find that the valuation of a defaultable derivative is normally determined via backward induction when its payoffs can be positive or negative. Moreover, the model can naturally capture wrong- or right-way risk.
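The regression step at the heart of a least-squares Monte Carlo valuation can be sketched as follows. This is a generic illustration with arbitrary parameters (GBM paths, a call-style terminal payoff, a polynomial basis), not the paper's CVA model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate geometric Brownian motion paths for the underlying
# (illustrative parameters only, not taken from the paper)
n_paths, n_steps, dt = 5000, 50, 0.02
s0, rate, sigma = 100.0, 0.03, 0.2
z = rng.standard_normal((n_paths, n_steps))
increments = (rate - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
paths = s0 * np.exp(np.hstack([np.zeros((n_paths, 1)),
                               np.cumsum(increments, axis=1)]))

def continuation_value(spot, discounted_next, degree=2):
    """The core LSMC step: regress discounted next-step values on a
    polynomial basis of the current spot to estimate continuation values."""
    X = np.vander(spot, degree + 1)  # columns: spot^2, spot, 1
    coef, *_ = np.linalg.lstsq(X, discounted_next, rcond=None)
    return X @ coef

# Backward induction from the terminal payoff (a call, for concreteness)
strike = 100.0
disc = np.exp(-rate * dt)
value = np.maximum(paths[:, -1] - strike, 0.0)
for t in range(n_steps - 1, 0, -1):
    value = continuation_value(paths[:, t], disc * value)

price_estimate = disc * float(value.mean())
```

The regression replaces nested simulation: instead of re-simulating from every state to estimate the conditional expectation, one cross-sectional least-squares fit per time step approximates it on all paths at once.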
Fodor argued that learning a concept by hypothesis testing would involve an impossible circularity. I show that Fodor's argument implicitly relies on the assumption that actually φ-ing entails an ability to φ. But this assumption is false in cases of φ-ing by luck, and just such luck is involved in testing hypotheses with the kinds of generative random sampling methods that many cognitive scientists take our minds to use. Concepts thus can be learned by hypothesis testing without circularity, and it is plausible that this is how humans in fact acquire at least some of their concepts.
Economic diversification has been a recurring aspiration of successive administrations in Nigeria, especially amidst the dwindling oil revenue of recent years resulting from fluctuations in world crude oil prices. This study investigates the impact of diversifying the economy on economic growth in Nigeria. Secondary data covering 1981 to 2016 were used: the GDP growth rate as a proxy for economic growth, non-oil GDP as a proxy for GDP diversification, non-oil exports as a proxy for export diversification, together with investment and the exchange rate. An Ordinary Least Squares (OLS) econometric approach was adopted to analyze the data. The results reveal that non-oil gross domestic product had a positive and significant impact on economic growth, while the exchange rate had a negative but significant relationship with economic growth over the period studied. Non-oil exports and investment had positive but insignificant effects on economic growth. The study recommends encouraging increased productivity in the real sector, as well as the adoption of stable and favourable exchange-rate policies by the government, in order to accelerate economic growth in Nigeria.
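The OLS estimation underlying such a study can be sketched in a few lines. The data below are synthetic stand-ins (random draws and invented coefficients), not the actual 1981–2016 Nigerian series:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for the study's regressors (NOT the actual data):
# intercept, non-oil GDP, non-oil exports, investment, exchange rate
n = 36  # 1981-2016 gives 36 annual observations
X = np.column_stack([
    np.ones(n),
    rng.normal(3.0, 1.0, n),
    rng.normal(1.0, 0.5, n),
    rng.normal(2.0, 0.8, n),
    rng.normal(150.0, 20.0, n),
])
beta_true = np.array([1.0, 0.8, 0.1, 0.05, -0.01])
y = X @ beta_true + rng.normal(0, 0.3, n)  # GDP growth rate

# OLS estimator: beta_hat = (X'X)^{-1} X'y, via a linear solve
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ beta_hat
```

With the intercept included, the OLS residuals are orthogonal to every regressor by construction, which is the algebraic content of the normal equations.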
Unification of natural science and social science is a centuries-old, unmitigated debate. Natural science has a chronological advantage over social science because the latter took time to include many social phenomena in its fold. The history of science witnessed quite a number of efforts by social scientists to fit this discipline into a rational, if not mathematical, framework. On the other hand, a tendency among some physicists has been observed, especially since the last century, to recast a number of social phenomena in the mould of events taking place in the physical world and governed by well-known systems and equations of physics. This necessitated the introduction of social physics as a new interdisciplinary subject. Obviously this attempt is aimed at explaining hitherto unsolved or highly debated issues of social science. Physicists are showing special interest in problems in economics, ranging from some topics of normative economics to the movement of prices of derivatives. Statistics has been widely used in these attempts, and at least two sub-disciplines of the subject, namely stochastic processes and time series analysis, deserve special mention. All these research activities gave birth to another interdisciplinary subject, named econophysics. Interestingly, the global financial crisis of 2007–08 revived the need to determine the prices of derivatives in a more accurate manner. This article adumbrates a sketch of the theoretical synthesis between physics and economics and the role played by statistics in this process.
A review of Carl Huffman's new edition of the fragments of Archytas of Tarentum. Praises the extensive commentary on four fragments, but argues that at least two dubious works not included in the edition ("On Law and Justice" and "On Wisdom") deserve further consideration and contain important information for the interpretation of Archytas. Provides a complete translation for the fragments of those works.
We re-examine the problem of existential import by using classical predicate logic. Our problem is: how should existential import be distributed among the quantified propositions in order for all the relations of the logical square to be valid? After defining existential import and scrutinizing the available solutions, we distinguish three possible cases: explicit import, implicit non-import, and explicit negative import, and formalize the propositions accordingly. Then we examine the 16 combinations between the 8 propositions having the first two kinds of import (the third one being trivial) and rule out the squares where at least one relation does not hold. This leads to the following results: (1) three squares are valid when the domain is non-empty; (2) one of them is valid even in the empty domain: the square can thus be saved in arbitrary domains; and (3) the aforementioned eight propositions give rise to a cube, which contains two more (non-classical) valid squares and several hexagons. A classical solution to the problem of existential import is thus possible, without resorting to deviant systems and merely relying upon the symbolism of first-order logic (FOL). Aristotle's system then appears as a fragment of a broader system which can be developed by using FOL.
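The validity of a square under a given distribution of existential import can be checked by brute force over small finite domains. The sketch below uses one illustrative distribution (A and I read with import, E and O without), which is a classical candidate and not necessarily the formalization adopted in the paper:

```python
from itertools import product

def square_holds(domain_size):
    """Check all relations of the square for every extension of S and P
    over a finite domain, reading A and I with existential import and
    E and O without (one illustrative distribution of import)."""
    domain = range(domain_size)
    for s_bits in product([0, 1], repeat=domain_size):
        for p_bits in product([0, 1], repeat=domain_size):
            S = {x for x in domain if s_bits[x]}
            P = {x for x in domain if p_bits[x]}
            A = bool(S) and S <= P             # All S are P (with import)
            E = not (S & P)                    # No S is P (without import)
            I = bool(S & P)                    # Some S is P (with import)
            O = not S or bool(S - P)           # Some S is not P (without import)
            if A and E:                        # contrariety fails
                return False
            if not (I or O):                   # subcontrariety fails
                return False
            if (A and not I) or (E and not O): # subalternation fails
                return False
            if A == O or E == I:               # contradiction fails
                return False
    return True
```

Under this reading the square survives even when the domain is empty, matching the flavour of result (2) in the abstract.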
Development has been the main strategy in addressing the problem of sustainability since at least the mid-1980s. The results of this strategy have been mixed, if not disappointing. In their objections to this approach, critics frequently invoke constraints imposed by physical reality, of which the most important is entropy production. They question the belief that technological innovations are capable of solving the problem of sustainability. Is development the right response to this problem, and is the current course capable of attaining sustainability? The article closely examines and critiques the principal theoretical objection to sustainable development that emphasizes physical constraints, and more specifically entropy production. It also offers a critique of the current approach to sustainable development. The article advocates a systems approach as a way to anchor a broad consensus in the ongoing sustainability debates.
In recent decades much literature has been produced on disagreement; the puzzling conclusion being that epistemic disagreement is, for the most part, either impossible (e.g. Aumann (Ann Stat 4(6):1236–1239, 1976)) or at least easily resolvable (e.g. Elga (Noûs 41(3):478–502, 2007)). In this paper I show that, under certain conditions, an equally puzzling result arises: that is, disagreement cannot be rationally resolved by belief updating. I suggest a solution to the puzzle which makes use of some of the principles of Hintikka's Socratic epistemology.
Albert Einstein once made the following remark about "the world of our sense experiences": "the fact that it is comprehensible is a miracle" (1936, p. 351). A few decades later, another physicist, Eugene Wigner, wondered about the unreasonable effectiveness of mathematics in the natural sciences, concluding his classic article thus: "the miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve" (1960, p. 14). At least three factors are involved in Einstein's and Wigner's miracles: the physical world, mathematics, and human cognition. One way to relate these factors is to ask how the universe could possibly be structured in such a way that mathematics would be applicable to it, and we would be able to understand that application. This is roughly Wigner's question. Alternatively, the way of the mathematical naturalist is to argue that we abstract certain properties from the world, perhaps using our bodies and physical tools, thereby articulating basic mathematical concepts, which we continue building into the complex formal structures of mathematics. John Stuart Mill, Penelope Maddy, and Rafael Núñez teach this strategy of cognitive abstraction, in very different manners. But what if the very concepts and basic principles of mathematics were built into our cognitive structure itself? Given such a cognitive a priori mathematical endowment, would the miracles of the link between world and cognition (Einstein) and mathematics and world (Wigner) not vanish, or at least significantly diminish? This is the stance of Stanislas Dehaene and Elizabeth Brannon's 2011 anthology, following a venerable rationalist tradition including Plato and Immanuel Kant.
In this paper, I will discuss a well-known oscillation in Frege's conception of sense. My point is only partially concerned with his two different criteria of sense identity, and touches upon a more specific point: what happens if we apply Frege's intuitive criterion for the difference of thoughts to logically equivalent sentences? I will try to make a schematic argument here that will preempt any endeavor to make Frege more coherent than he really is. In sections A and B, I will present two alternative Fregean ways to treat the sense of logically equivalent sentences. Frege really oscillated between two alternative conceptions of sense, and his inability to detect the contrast between the two alternative conceptions is partly due to his strong conception of rationality. To apply the criterion of difference of thoughts to logical matters, we may also use a weak notion of rationality, or at least a notion of rationality of human agents, with limited computational resources. The distinctions towards which Frege was striving are better understood nowadays from the point of view of the treatment of limited rationality, which imposes itself even in logical matters.
Context is a concept used by philosophers and scientists with many different definitions. Since Dummett we speak of the "context principle" in Frege and Wittgenstein: "an expression has a meaning only in the context of a sentence". The context principle finds an extension in some of Wittgenstein's ideas, especially in his famous passage where he says that "to understand a sentence is to understand a language". Given that Wittgenstein believes that "the" language does not exist but only language games exist, we should conclude that he is speaking of the need to consider any sentence always in the context of a language game. This general attitude is certainly attuned to the contemporary tendency to place contextual restrictions on the interpretations of our sentences. However, we find so many kinds and forms of restrictions that this general attitude is not enough to give us a viable tool for finding an order in the web of so many different theories of context. To look for an order, or at least a clarification, we may start with two contrasting paradigms of theories: the "objective" theory of contexts, where context is a set of features of the world, and the "subjective" theories of context, where context is the cognitive background of a speaker or agent with respect to a situation. We have here not only two different ways of using the term "context" but also two different conceptions of semantics and philosophy. The different conceptions are normally associated, respectively, with the classical paradigm of model-theoretic semantics (Kaplan, Lewis, Stalnaker) on the one hand and with the A.I. paradigm (McCarthy, Buvac, Giunchiglia) on the other. For the sake of simplicity I will restrict my attention mainly to Kaplan 1989 and to McCarthy 1993 and Giunchiglia 1993. The two different conceptions can be summarised with the following schema: a) context as a set of features of the world…
The goal of ethical veganism is a vegan world or, at least, a significantly vegan world. However, despite the hard work done by vegan activists, global meat consumption has been increasing (Saiidi 2019; Christen 2021). Vegan advocates have focused on ethics but have ignored the importance of tradition and identity. And the advent of veggie meat alternatives has promoted food that emulates animal products, thereby perpetuating the meat paradigm. I suggest that, in order to make significant changes toward ending animal exploitation, ethical vegans give more attention to tradition and identity. Furthermore, I propose that raw veganism is the most ethical diet and can be the best way to move away from animal-based food.
Substructural logics and their application to logical and semantic paradoxes have been extensively studied, but non-reflexive systems have been somewhat neglected. Here, we aim to fill this lacuna, at least in part, by presenting a non-reflexive logic and theory of naive consequence (and truth). We also investigate the semantics and the proof-theory of the system. Finally, we develop a compositional theory of truth (and consequence) in our non-reflexive framework.
When “Sinning Against Frege” was published in 1979 I thought it should have given a real turn to the discussion of Frege's ideas. Actually the impact was less than I imagined, and the problem was that, at the end of the story, Tyler Burge's interpretation should have cast a shadow on the direct reference theories and on the Millean criticism of descriptivist theories of proper names, being based on a criticism of the identification of Frege's notion of sense with linguistic meaning or connotation. In fact Burge (1979) claims that the identification of Frege's notion of sense with the notion of linguistic meaning is a «basic misunderstanding» of Frege's work. This claim implies that Fregean senses are not like Mill's connotations; therefore many direct-reference criticisms of Frege, which are grounded on Mill's claim that proper names have no connotation, lose their efficacy. Burge, in giving specifications, apparently accepts at least the idea that sense is an aspect of meaning, in particular «the aspect of meaning relevant to fixing the truth value of sentences». This feature is the “harmless” part of the assimilation of sense and linguistic meaning; but this assimilation becomes dangerous where context dependence is concerned. Revisiting Burge (1979), after more than two decades of debate on indexicals, may help to better understand the originality and the limitations of his claims.
The human person has been defined in a variety of ways throughout human history. As time goes on, reflections from philosophy, theology, psychology, anthropology, sociology, and history have made great contributions to understanding and explaining what the human person is. We have focused on the proposals that personalism has made so far, which hold that the human person should be the center and starting point of every reflection about the human being. Among all the characteristics of personalism, we find that the human person is above all a being for the encounter with others, and its first encounter is the filial relationship. Even though being children is more than evident for every single person, we usually do not live as such, or at least it does not seem that we fully understand this fact. This is why we try to shed light on the keys that could help men and women today to understand the value of being children, and on the possible causes that hinder us from living or accepting that condition in our own lives. We hope that this work helps convey the richness of being children, so as to facilitate comprehension of, and openness to, the Christian faith, which states that we are all children of God.
Rational Procedures. Carlo Penco - 2009 - The Dialogue: Yearbook of Philosophical Hermeneutics 4(1):137-153. LIT Verlag, Berlin.
In this paper I shall deal with the role of "understanding a thought" in the debate on the definition of the content of an assertion. I shall present a well-known tension in Frege's writings between a cognitive and a semantic notion of sense. This tension is at the source of some of the major contemporary discussions, mainly because of the negative influence of Wittgenstein's Tractatus, which did not give in-depth consideration to the tension found in Frege. However, many contemporary authors, after the first attempt by Carnap himself, have tried to make room for both a cognitive and a semantic aspect of meaning. I claim that at least some of these attempts (Dummett, Perry and Chalmers) are seriously flawed, mainly due to the difficulty of making a proper connection between the two different conceptions of sense. I shall outline an alternative project, which takes into consideration Frege's requirements of antipsychologism and of the objectivity of thought, while maintaining a close connection between the two aspects of sense.
⦿ In my dissertation I introduce, motivate, and take the first steps in the realization of the project of naturalising modal metaphysics: the transformation of the field into a chapter of the philosophy of science rather than speculative, autonomous metaphysics. ⦿ In the introduction, I explain the concept of naturalisation that I apply throughout the dissertation, which I argue to be an improvement on Ladyman and Ross' proposal for naturalised metaphysics. I also object to Williamson's proposal that modal metaphysics, or some view in the area, is already a quasi-scientific discipline. ⦿ Recently, some philosophers have argued that the notion of metaphysical modality is so ill defined as to be of little theoretical utility. In the second chapter I intend to contribute to such skepticism. First, I observe that each of the proposed marks of the concept, except for factivity, is highly controversial; thus, its logical structure is deeply obscure. With the failure of the "first principles" approach, I examine the paradigmatic intended applications of the concept, and argue that each makes it a device for a very specific and controversial project: a device, therefore, for which a naturalist will find no use. I conclude that there is no well-defined or theoretically useful notion of objective necessity other than logical or physical necessity, and I suggest that naturalising modal metaphysics can provide more stable methodological foundations. ⦿ In the third chapter I answer a possible objection against the in-principle viability of the project: that the concept of metaphysical modality cannot be understood through the philosophical analysis of any scientific theory, since metaphysical necessity "transcends" natural necessity, and science only deals with the latter. I argue that the most important arguments for this transcendence thesis fail or face problems that, as of today, remain unsolved.
⦿ Call the idea that science doesn't need modality "demodalism". Demodalism is a first step in a naturalistic argument for modal antirealism. In the fourth chapter I examine six versions of demodalism to explain why a family of formalisms that I call "spaces of possibility" are (i) used in a quasi-ubiquitous way in mathematised sciences (I provide examples from theoretical computer science to microeconomics), (ii) scientifically interpreted in modal terms, and (iii) used for at least six important tasks: (1) defining laws and theories; (2) defining important concepts from different sciences (I give several examples); (3) making essential classifications; (4) providing different types of explanations; (5) providing the connection between theory and statistics; and (6) understanding the transition between a theory and its successor (as is the case with quantisation). ⦿ In the fifth chapter I propose and defend a naturalised modal ontology. This is a realism about modal structure: my realism about constraints. The modal structure of a system consists of the relationships between its possible states, and between its possible states and those of other systems. It is given by the plurality of restrictions to which the system is subject. A constraint is a factor that explains the impossibility of a class of states; I explain this concept further. First, I defend my view by rejecting some of its main rivals: constructive empiricism, Humean conventionalism, and wave function realism, as they fail to make sense of quantum chaos. This is because the field requires the notion of objective modal structure, and the mentioned views have trouble explaining the modal facts of quantum dynamics. Then, I argue that constraint realism supersedes these views in the context of Bohm's standard theory and mechanics, and underpins the study of quantum chaos. Finally, I consider and reject two possible problems for my view.
⦿ A central concern of modal metaphysicians has been to understand the logical system that best characterises necessity. In the sixth chapter I intend to recover the logical project as applied to my naturalistic modal metaphysics. Scientists and philosophers of science accept different degrees of physical necessity, ranging from purely mathematically necessary facts that restrict physical behaviour, to kinetic principles, to particular dynamical constraints. I argue that this motivates a multimodal approach to modal logic, and that the time dependence of dynamics motivates a logic of historical necessity. I propose multimodal propositional (classical) logics for Bohmian mechanics and the Everettian theory of many divergent worlds, and I close with a criticism of Williamson's approach to the logic of state spaces of dynamical systems.
The idea that anyone, with the right critical knowledge and a certain amount of spare time and resources, could become a globally responsible citizen has been skeptically questioned at least since the time of Rousseau. But, during the last two decades, the specific concern that has troubled critical qualitative researchers has been the possible complicity of the active citizen with a neoliberal regime of governmentality, a regime that often uses the injunction to volunteer as a political tactic of responsibilization. The article seeks to address this latent concern through the study of a particularly marketized act of global citizenship: the immersive experience of volunteer travel. Through an innovative Foucauldian analysis and original qualitative method, designed to excavate deeply seated skeptical insights among returned volunteers in Australia, this study elucidates, first, how a personal sense of complicity actually surfaces within the market-mediated volunteer experience, and, second, how the ensuing predicament can be tackled, both from the perspective of the critical academic and of the citizen on the ground.
Much forensic inference based upon DNA evidence is made assuming Hardy-Weinberg equilibrium (HWE) for the genetic loci being used. Several statistical tests to detect and measure deviation from HWE have been devised, and their limitations become more obvious when testing for deviation within multiallelic DNA loci. The most popular methods, the chi-square and likelihood-ratio tests, are based on asymptotic results and cannot guarantee a good performance in the presence of low-frequency genotypes. Since the parameter space dimension increases at a quadratic rate in the number of alleles, some authors suggest applying sequential methods, where the multiallelic case is reformulated as a sequence of "biallelic" tests. However, in this approach it is not obvious how to assess the general evidence of the original hypothesis, nor is it clear how to establish the significance level for its acceptance/rejection. In this work, we introduce a straightforward method for the multiallelic HWE test which overcomes the aforementioned issues of sequential methods. The core theory for the proposed method is given by the Full Bayesian Significance Test (FBST), an intuitive Bayesian approach which does not assign positive probabilities to zero-measure sets when testing sharp hypotheses. We compare FBST performance to the chi-square, likelihood-ratio and Markov chain tests in three numerical experiments. The results suggest that FBST is a robust and high-performance method for the HWE test, even in the presence of several alleles and small sample sizes.
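For context, the classical biallelic chi-square test that the FBST is compared against can be written compactly. This is the textbook version (one locus, two alleles, df = 1), not the paper's multiallelic FBST procedure:

```python
from math import erfc, sqrt

def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Classical biallelic chi-square test for Hardy-Weinberg equilibrium.
    Returns the chi-square statistic and its p-value (df = 1)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_AA, n_Aa, n_aa]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # For df = 1 the chi-square survival function is erfc(sqrt(x / 2))
    p_value = erfc(sqrt(chi2 / 2))
    return chi2, p_value
```

The abstract's point is visible even here: the expected counts appear in the denominator, so low-frequency genotypes make the asymptotic approximation unreliable.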
We present a module-based criterion, i.e. a sufficient condition based on the absolute values of the matrix coefficients, for the convergence of the Gauss–Seidel method (GSM) for a square system of linear algebraic equations: the Generalized Line Criterion (GLC). We prove GLC to be the "most general" module-based criterion and derive, as GLC corollaries, some previously known and also some new criteria for GSM convergence. Although far more general than the previously known results, the proof of GLC is simpler. The results used here are related to recent research in the stability of dynamical systems and the control of manufacturing systems.
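A minimal sketch of the Gauss–Seidel iteration together with the classical line criterion (strict diagonal dominance by rows), of which the paper's GLC is a generalization; the matrix below is an arbitrary example:

```python
import numpy as np

def is_row_dominant(A):
    """Classical line criterion: strict diagonal dominance by rows,
    a special case of the Generalized Line Criterion."""
    A = np.asarray(A, dtype=float)
    diag = np.abs(np.diag(A))
    off = np.abs(A).sum(axis=1) - diag
    return bool(np.all(diag > off))

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=10_000):
    """Plain component-wise Gauss-Seidel iteration: each x[i] is updated
    using the already-updated components x[:i] of the current sweep."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([6.0, 8.0, 4.0])
x = gauss_seidel(A, b)
```

Row dominance guarantees convergence for this example; the GLC of the paper covers matrices that fail this simple check but still admit a convergent iteration.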
To estimate causal relationships, time series econometricians must be aware of spurious correlation, a problem first mentioned by Yule (1926). To deal with this problem, one can work either with differenced series or with multivariate models: VAR (VEC or VECM) models. These models usually include at least one cointegration relation. Although the Bayesian literature on VAR/VEC is quite advanced, Bauwens et al. (1999) highlighted that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results". The present article applies the Full Bayesian Significance Test (FBST), especially designed to deal with sharp hypotheses, to cointegration rank selection tests in VECM time series models. It shows the FBST implementation using both simulated and available (in the literature) data sets. As an illustration, standard non-informative priors are used.
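Yule's spurious-correlation problem is easy to reproduce: two independent random walks often show sizeable correlation in levels, while their differences do not. A minimal sketch (illustrative, not the paper's VECM setup):

```python
import numpy as np

rng = np.random.default_rng(7)

def corr(a, b):
    """Pearson correlation of two series."""
    return float(np.corrcoef(a, b)[0, 1])

# Two independent random walks (integrated series). Their levels can
# appear strongly correlated even though the innovations are independent;
# differencing recovers the underlying independence.
n = 2000
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

level_corr = corr(x, y)
diff_corr = corr(np.diff(x), np.diff(y))
```

This is why the abstract mentions differencing and cointegrated VAR/VECM models as the two standard remedies.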
In quantum theory every state can be diagonalized, i.e. decomposed as a convex combination of perfectly distinguishable pure states. This elementary structure plays a ubiquitous role in quantum mechanics, quantum information theory, and quantum statistical mechanics, where it provides the foundation for the notions of majorization and entropy. A natural question then arises: can we reconstruct these notions from purely operational axioms? We address this question in the framework of general probabilistic theories, presenting a set of axioms that guarantee that every state can be diagonalized. The first axiom is Causality, which ensures that the marginal of a bipartite state is well defined. Then, Purity Preservation states that the set of pure transformations is closed under composition. The third axiom is Purification, which allows one to assign a pure state to the composition of a system with its environment. Finally, we introduce the axiom of Pure Sharpness, stating that for every system there exists at least one pure effect occurring with unit probability on some state. For theories satisfying our four axioms, we show a constructive algorithm for diagonalizing every given state. The diagonalization result allows us to formulate a majorization criterion that captures the convertibility of states in the operational resource theory of purity, where random reversible transformations are regarded as free operations.
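In the standard quantum case, the diagonalization in question is the spectral decomposition of a density matrix: the eigenvalues are probabilities and the eigenvectors are perfectly distinguishable (orthogonal) pure states. A numpy sketch with an arbitrary qubit state:

```python
import numpy as np

def diagonalize_state(rho):
    """Spectral decomposition of a density matrix: eigenvalues are the
    probabilities of perfectly distinguishable (orthogonal) pure states."""
    vals, vecs = np.linalg.eigh(rho)
    return vals, vecs

# A valid qubit state: a mixture of |0><0| and |+><+| (arbitrary weights)
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.6 * np.outer(ket0, ket0) + 0.4 * np.outer(ketp, ketp)

probs, basis = diagonalize_state(rho)

# Reconstruct the state from its eigendecomposition: sum_i p_i |v_i><v_i|
reconstructed = (basis * probs) @ basis.conj().T
```

The abstract's contribution is to recover exactly this structure from operational axioms, without assuming the Hilbert-space machinery that makes the three lines above work.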
The study of cultural evolution has taken on an increasingly interdisciplinary and diverse approach to explicating phenomena of cultural transmission and adoption. Inspired by this computational movement, this study uses Bayesian network analysis, combining both the frequentist and the Hamiltonian Markov chain Monte Carlo (MCMC) approaches, to investigate the highly representative elements in the cultural evolution of a Vietnamese city's architecture in the early 20th century. With a focus on the façade design of 68 old houses in Hanoi's Old Quarter (based on 78 data lines extracted from 248 photos), the study argues that it is plausible to look at the aesthetics, architecture, and designs of the house façade to find traces of cultural evolution in Vietnam, which went through more than six decades of French colonization and centuries of sociocultural influence from China. The in-depth technical analysis, though refuting the presumed model of probabilistic dependency among the variables, yields several results, the most notable of which is the strong influence of Buddhism over the decorations of the house façade. In particular, in the top 5 networks with the best Bayesian Information Criterion (BIC) scores and p<0.05, the variable for decorations (DC) always has a direct probabilistic dependency on the variable B for Buddhism. The paper then checks the robustness of these models using the Hamiltonian MCMC method and finds that the posterior distributions of the models' coefficients all satisfy the technical requirements. Finally, this study suggests integrating Bayesian statistics in the social sciences in general, and in the study of cultural evolution and architectural transformation in particular.
Ivan Illich was a harsh critic of traditional schooling. His proposals were disregarded, perhaps too quickly, for various reasons. This paper, based on review research, aims to add to a current (re)reading of Illich, seeking to answer the following question: what is the relevance of Illich's proposal for a successful education in an increasingly digitalised society? The results of this research allow concluding, on the one hand, that Illich's proposal to replace strict schooling with (self)training networks in a society that is increasingly digitalised and linked by the internet may offer potential benefits, and it is worthy, at least, of an in-depth analysis. On the other hand, provocative scholars who allow us to get out of any ideologically and socially delimited system have the merit of helping to provide instruments that enable a better understanding of the present and, consequently, a rationale for the options for the future. Ivan Illich is one of these scholars.
R Core Team. (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing. Supplement to "Occipital and left temporal instantaneous amplitude and frequency oscillations correlated with access and phenomenal consciousness". The supplement moves from the features of the ERP characterized in "Occipital and Left Temporal EEG Correlates of Phenomenal Consciousness" (Pereira, 2015) towards the instantaneous amplitude and frequency of event-related changes correlated with a contrast in access and in phenomenology, and proceeds as follows. In the first section, empirical mode decomposition (EMD) with post-processed Ensemble Empirical Mode Decomposition (postEEMD) and the Hilbert-Huang Transform (HHT) are applied (Xie, G., Guo, Y., Tong, S., and Ma, L., 2014. Calculate excess mortality during heatwaves using Hilbert-Huang transform algorithm. BMC Medical Research Methodology, 14, 35). In the second section, the variance inflation factor (VIF) is calculated. In the third section, partial least squares regression (PLSR) is used to find the minimal root mean squared error of prediction (RMSEP). In the last section, PLSR yields the significance multivariate correlation (sMC) statistic.
This research employs the Bayesian network modeling approach, and the Markov chain Monte Carlo technique, to learn about the role of lies and violence in the teachings of major religions, using a unique dataset extracted from long-standing Vietnamese folktales. The results indicate that, although lying and violent acts augur negative consequences for those who commit them, their associations with core religious values diverge in the final outcome for the folktale characters. Lying that serves a religious mission of either Confucianism or Taoism (but not Buddhism) brings a positive outcome to a character (βT_and_Lie_O = 2.23; βC_and_Lie_O = 1.47). A violent act committed to serve Buddhist missions results in a happy ending for the committer (βB_and_Viol_O = 2.55). What is highlighted here is a glaring double standard in the interpretation and practice of the three teachings: the very virtuous outcomes being preached, whether that be compassion and meditation in Buddhism, societal order in Confucianism, or natural harmony in Taoism, appear to accommodate two universal vices—violence in Buddhism and lying in the latter two. These findings contribute to a host of studies aimed at making sense of contradictory human behaviors, adding the role of religious teachings alongside cognitive mechanisms such as belief maintenance and motivated reasoning in discounting counterarguments.
Folklore has a critical role as a cultural transmitter, all the while being a socially accepted medium for the expression of culturally contradicting wishes and conducts. In this study of Vietnamese folktales, through the use of Bayesian multilevel modeling and the Markov chain Monte Carlo technique, we offer empirical evidence for how the interplay between religious teachings (Confucianism, Buddhism, and Taoism) and deviant behaviors (lying and violence) could affect a folktale’s outcome. The findings indicate that characters who lie and/or commit violent acts tend to have bad endings, as intuition would dictate, but when they are associated with any of the above Three Teachings, the final endings may vary. Positive outcomes are seen in cases where characters associated with Confucianism lie and characters associated with Buddhism act violently. The results supplement the worldwide literature on discrepancies between folklore and real-life conduct, as well as on contradictory human behaviors vis-à-vis religious teachings. Overall, the study highlights the complexity of human decision-making, especially beyond the folklore realm.
The exponential growth of social data, both in volume and complexity, has increasingly exposed many of the shortcomings of the conventional frequentist approach to statistics. The scientific community has called for careful usage of the approach and its inference. Meanwhile, the alternative method, Bayesian statistics, still faces considerable barriers toward a more widespread application. The bayesvl R package is an open-source program designed for implementing Bayesian modeling and analysis using the Stan language’s no-U-turn sampler (NUTS). The package combines the ability to construct Bayesian network models using directed acyclic graphs (DAGs), the Markov chain Monte Carlo (MCMC) simulation technique, and the graphical capability of the ggplot2 package. As a result, it can improve the user experience and intuitive understanding when constructing and analyzing Bayesian network models. A case example is offered to illustrate the usefulness of the package for Big Data analytics and cognitive computing.
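The package above delegates sampling to Stan's NUTS algorithm. As a minimal, hedged illustration of the MCMC idea it builds on (not the package's actual sampler), here is a random-walk Metropolis sketch that estimates a coin's bias under a flat prior; all numbers are illustrative:

```python
import math
import random

random.seed(42)

def log_posterior(theta, heads, n):
    # Log posterior for a coin's bias theta under a flat Beta(1, 1) prior:
    # proportional to the Bernoulli log-likelihood.
    if not 0.0 < theta < 1.0:
        return -math.inf
    return heads * math.log(theta) + (n - heads) * math.log(1.0 - theta)

def metropolis(heads, n, steps=20_000, scale=0.1):
    # Random-walk Metropolis: propose a Gaussian step, accept with
    # probability min(1, posterior ratio); work on the log scale.
    theta, samples = 0.5, []
    for _ in range(steps):
        proposal = theta + random.gauss(0.0, scale)
        log_ratio = log_posterior(proposal, heads, n) - log_posterior(theta, heads, n)
        if math.log(random.random()) < log_ratio:
            theta = proposal
        samples.append(theta)
    return samples[steps // 2:]  # discard the first half as burn-in

draws = metropolis(heads=70, n=100)
posterior_mean = sum(draws) / len(draws)
```

NUTS explores the posterior far more efficiently than this random walk, but the accept/reject logic against the posterior ratio is the shared core of both samplers.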
This research employs the Bayesian network modeling approach, and the Markov chain Monte Carlo technique, to learn about the role of lies and violence in the teachings of major religions, using a unique dataset extracted from long-standing Vietnamese folktales. The results indicate that, although lying and violent acts augur negative consequences for those who commit them, their associations with core religious values diverge in the outcome for the folktale characters. Lying that serves a religious mission of either Confucianism or Taoism (but not Buddhism) brings a positive outcome to a character. A violent act committed to serve a Buddhist mission results in a happy ending for the committer.
Currently, under the conditions of permanent financial risks that hamper sustainable economic growth in the financial sector, the development of evaluation and risk management methods, both those regulated by Basel II and III and others, seems to be of special importance. Reputation risk is one of the significant risks affecting the reliability and credibility of commercial banks. The importance of reputation risk management and the quality of its assessment remain relevant, as the probability of a decrease in or loss of business reputation influences financial results and the degree of customers’, partners’ and stakeholders’ confidence. By means of simulation modeling based on Bayesian networks and fuzzy data analysis, the article characterizes the mechanism of reputation risk assessment and possible losses evaluation in banks by plotting normal and lognormal distribution functions. Monte Carlo simulation is used to calculate the probability of losses caused by reputation risks. The degree of standardized histogram similarity is determined on the basis of fuzzy data analysis applying the Hamming distance method. A tree-like hierarchy based on the OWA-operator is used to aggregate the data, with Fishburne's coefficients as the convolution scales. The mechanism takes into account the impact of criteria such as return on equity, goodwill value, the risk assets ratio, the share of productive assets in net assets, the efficiency ratio of interest-bearing liabilities, the risk ratio of credit operations, the funding ratio and the reliability index on the business reputation of the bank. The suggested methods and recommendations might be applied to develop a decision-making mechanism targeted at the implementation of a reputation risk management system in commercial banks, as well as to optimize risk management technologies.
Products made from animal fur and skin have been a major part of human civilization. However, in modern society, the unsustainable consumption of these products – often considered luxury goods – has many negative environmental impacts. This study explores how people’s perceptions of biodiversity affect their attitudes and behaviors toward consumption. To investigate the information process more deeply, we add the moderation of beliefs about biodiversity loss. Following the Bayesian Mindsponge Framework (BMF) analytics, we use mindsponge-based reasoning for constructing conceptual models and employ Bayesian analysis aided by Markov chain Monte Carlo (MCMC) algorithms on a dataset of 535 Vietnamese urban residents. The results show that people’s preference for using products made from animal skin/fur is negatively associated with the perceived consequences of biodiversity loss when they believe biodiversity loss is real and a major problem. In contrast, if urban residents believe biodiversity loss is unreal or not a significant issue, the association between perceived consequences of biodiversity loss and personal preference runs in the opposite direction. The same effects of biodiversity loss perception on people’s possession of skin/fur products were not found, indicating a more complex information process for behaviors compared to attitudes. Nevertheless, in the scenario where people believe biodiversity loss is not a significant issue, the higher the perceived consequences of biodiversity loss are, the more animal-based products they are likely to own. Our results suggest that policymakers should not neglect the factor of personal belief, besides knowledge and awareness, in environmental campaigns.
The LIBOR Market Model has become one of the most popular models for pricing interest rate products. It is commonly believed that Monte Carlo simulation is the only viable method available for the LIBOR Market Model. In this article, however, we propose a lattice approach to price interest rate products within the LIBOR Market Model by introducing a shifted forward measure and several novel fast drift approximation methods. This model should achieve the best performance without losing much accuracy. Moreover, the calibration is almost automatic, and the model is simple and easy to implement. Adding this model to the valuation toolkit is quite useful, especially for risk management or in cases where a quick turnaround is needed.
The adoption of the codex for literature in the Roman world was one of the most significant developments in the history of the book, yet it remains poorly understood. Physical evidence seems to contradict literary evidence from Martial's epigrams. The near-total adoption of the codex for early Christian works, even as the book roll dominated non-Christian book forms in the first centuries of our era, has led to endless speculation about possible ideological motives for adoption. What has gone unquestioned is the importance of Christian scribes in the surge of adoption from 300 C.E. onward. This article reexamines the foundation of many theories, the timeline for non-Christian adoption sketched by Roberts and Skeat in their study, The Birth of the Codex, and reevaluates it through the lens of “diffusion of innovations theory” in order to reconcile the evidence and elevate practical considerations once and for all over ideological motives.
Introduction: The Defining Issues Test (DIT) aimed to measure one’s moral judgment development in terms of moral reasoning. The Neo-Kohlbergian approach, an elaboration of Kohlbergian theory, focuses on the continuous development of postconventional moral reasoning, which constitutes the theoretical basis of the DIT. However, very few studies have directly tested the internal structure of the DIT, which would indicate its construct validity. Objectives: Using the DIT-2, a later revision of the DIT, we examined whether a bi-factor model or a 3-factor CFA model showed a better model fit. The Neo-Kohlbergian theory of moral judgment development, which constitutes the theoretical basis for the DIT-2, proposes that moral judgment development occurs continuously and that it can be better explained with a soft-stage model. Given these assertions, we assumed that the bi-factor model, which considers the Schema-General Moral Judgment (SGMJ), might be more consistent with Neo-Kohlbergian theory. Methods: We analyzed a large dataset collected from undergraduate students. We performed confirmatory factor analysis (CFA) via weighted least squares. A 3-factor CFA based on the DIT-2 manual and a bi-factor model were compared for model fit. The three factors in the 3-factor CFA were labeled as moral development schemas in Neo-Kohlbergian theory (i.e., personal interests, maintaining norms, and postconventional schemas). The bi-factor model included the SGMJ in addition to the three factors. Results: In general, the bi-factor model showed a better model fit compared with the 3-factor CFA model, although both models reported acceptable model fit indices. Conclusion: We found that the DIT-2 scale is a valid measure of the internal structure of moral reasoning development using both CFA and bi-factor models.
In addition, we conclude that the soft-stage model, posited by the Neo-Kohlbergian approach to moral judgment development, can be better supported with the bi-factor model tested in the present study.
The incremental risk charge (IRC) is a new regulatory requirement from the Basel Committee in response to the recent financial crisis. Notably, few models for IRC have been developed in the literature. This paper proposes a methodology consisting of two Monte Carlo simulations. The first Monte Carlo simulation simulates default, migration, and concentration in an integrated way. Combined with full re-valuation, the loss distribution at the first liquidity horizon for a subportfolio can be generated. The second Monte Carlo simulation performs random draws based on the constant level of risk assumption. It convolves copies of the single loss distribution to produce the one-year loss distribution. The aggregation of different subportfolios with different liquidity horizons is addressed. Moreover, the methodology for equity is also included, even though it is optional in IRC.
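The second step above, convolving the liquidity-horizon loss distribution into a one-year distribution under the constant level of risk assumption, can be sketched as follows. This is a toy illustration, not the paper's model: a single hypothetical position, a flat 2% default probability per quarterly horizon, and a fixed loss-given-default, all assumed numbers:

```python
import random

random.seed(7)

def liquidity_horizon_losses(n_paths=50_000, p_default=0.02, lgd=1_000_000):
    # Toy loss distribution for one position over one liquidity horizon
    # (e.g. three months): the obligor either defaults (loss = LGD) or not.
    return [lgd if random.random() < p_default else 0.0 for _ in range(n_paths)]

def one_year_losses(horizon_losses, n_paths=50_000, horizons_per_year=4):
    # Constant level of risk: the book is rebalanced back to the same risk at
    # each horizon, so a one-year loss is the sum of independent draws from
    # the same horizon distribution, i.e. a Monte Carlo convolution.
    return [
        sum(random.choice(horizon_losses) for _ in range(horizons_per_year))
        for _ in range(n_paths)
    ]

horizon = liquidity_horizon_losses()
annual = sorted(one_year_losses(horizon))
var_999 = annual[int(0.999 * len(annual))]  # 99.9% one-year VaR, as IRC requires
mean_loss = sum(annual) / len(annual)
```

In the paper's setting the horizon distribution would come from the first, integrated default/migration simulation with full re-valuation; here it is a two-point toy distribution so the convolution step stays visible.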
This study used partial least squares structural equation modelling (PLS-SEM) to estimate curriculum management's direct and indirect effects on university graduate programmes' viability. The study also examined the role of institutional effectiveness in mediating the nexus between the predictor and response variables. This is a correlational study with a factorial research design. The study's participants comprised 149 higher education administrators (23 Faculty Deans and 126 HODs) from two public universities in Nigeria. A structured questionnaire designed by the researchers was used for data collection. The questionnaire was duly validated, with acceptable scale and item content validity indices. The dimensionality of the instrument was determined using exploratory factor analysis (EFA). Convergent validity was based on Average Variance Extracted (AVE), whereas discriminant validity was based on the Fornell-Larcker criterion and the heterotrait-monotrait (HTMT) ratio. Acceptable composite reliability estimates of internal consistency were reached for the three sub-scales. Following ethical practices, the questionnaire was physically administered to respondents and retrieved afterwards. SmartPLS (version 3.2.9) and SPSS (version 26.0) were used for all the statistical analyses. This study uncovered significant direct and indirect effects of curriculum management on the viability of graduate programmes. Institutional effectiveness significantly impacted graduate programmes’ viability while mediating the nexus between curriculum management and graduate programmes’ viability. Curriculum management and institutional effectiveness jointly explained a significant proportion of the variance in graduate programmes’ viability. The results of this study show that graduate programmes’ viability depends, to a great extent, on how well the curriculum is managed and how effective institutions are with their services.
The results of this study can enable institutions seeking to run viable graduate programmes to re-evaluate their curriculum management practices and the effectiveness of their services.
This study assessed external debts and the financing of education in Nigeria using time series data obtained from the World Bank and the CBN Statistical Bulletin, covering a period of 31 years from 1988 to 2018. The model of the study was derived, and the data collected were analysed using Ordinary Least Squares. Diagnostic tests such as the Augmented Dickey-Fuller (ADF) unit root test, Johansen co-integration, Vector Error Correction (VEC) estimation, and Granger causality tests were all performed. Findings revealed a significant long-run relationship between external debts and the financing of education; external debts have a significant effect (F = 39.07055, p < .05) on the financing of education in Nigeria; external debt stock and external debt service payments have no significant effect on the financing of education; real GDP and the exchange rate have significant effects on the financing of education in Nigeria, respectively. Based on these findings, it was concluded that external debt is a big hindrance to the financing of education and, consequently, to the economic growth of Nigeria. It was recommended, amongst other things, that the government should use funds borrowed from external sources for productive capital projects or development initiatives such as investment in education and the eradication of illiteracy.
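The OLS estimation underlying studies like this one reduces, in the single-regressor case, to a closed form: slope = cov(x, y) / var(x). A minimal sketch with made-up numbers (hypothetical debt and education-financing figures, not the study's data):

```python
# Illustrative single-regressor OLS fit; both series are invented for the
# sketch and stand in for, e.g., external debt (x) and education financing (y).
xs = [2.1, 3.4, 4.0, 5.2, 6.8, 7.5, 8.1, 9.0]
ys = [1.9, 2.4, 2.6, 3.1, 3.9, 4.1, 4.5, 4.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary Least Squares closed form: slope = cov(x, y) / var(x),
# intercept chosen so the fitted line passes through the means.
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
sxx = sum((x - mean_x) ** 2 for x in xs)
slope = sxy / sxx
intercept = mean_y - slope * mean_x
```

In practice a study of this kind fits a multivariate model and runs the ADF, co-integration, and Granger tests first; the closed form above is only the core estimator those procedures build on.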
The purpose of this paper is to investigate different approaches to the measurement of the concept of dynamic capabilities. The paper focuses on a formative measurement model by Wilden and colleagues (2013) and a reflective model by Li and Liu (2014). The models were tested on Ukrainian firms in relation to their performance. Ukrainian and Russian translations of both measurements are introduced and tested. The proposed measurements were tested by applying a partial least squares algorithm using SmartPLS™ software. The sample contained 113 randomly selected firms from the city of Dnipropetrovs’k and the Dnipropetrovs’k region. The results showed that, first, both measurement approaches produced similar results: there were no statistically significant differences between the results received from these measurements. Second, organizational settings like firm size and firm type did not influence the results. Third, dynamic capabilities proved to be a reliable predictor of a firm’s performance. This study is unique in that it applies and compares two different measurement models (reflective and formative) and provides empirical data on the usefulness of both methods.
Due to the ethical lapses of leaders, interest in ethical leadership has grown, raising important questions about the responsibility of leaders in ensuring moral and ethical conduct. Research conducted on ethical leadership has failed to investigate the active role that ethical climate and organisational justice play in increasing or decreasing the influence of ethical leadership on employees’ ethical behaviour. Thus, this study examined the dual mediation of work ethical climate and organisational justice in the relation between ethical leadership and the ethical behaviour of employees. A total of 620 full-time employees from five Iraqi provinces working at 33 Iraqi organisations in the fields of manufacturing, retailing, medical services, insurance, information technology, legal services, finance, and telecommunications responded to the questionnaire survey. Structural equation modelling (SEM) was used to test the model, and data analysis was carried out using partial least squares structural equation modelling (PLS-SEM). The results revealed that there is a significant relationship between ethical leadership behaviour and the ethical behaviour of employees. Primarily, the study also found that ethical climate and organisational justice play a very significant mediating role between ethical leadership and employees’ ethical behaviour.
Purpose – The internet creates ample opportunities to start a mobile social commerce business. The literature confirms that the issue of customer trust for social commerce businesses is a challenge that must be addressed. Hence, this study aims to examine the antecedents of trust in mobile social commerce by applying linear and non-linear relationships based on partial least squares structural equation modeling and an artificial neural network model. Design/methodology/approach – This study applied a non-linear artificial neural network approach to provide a further understanding of the determinants of trust in mobile social commerce based on a non-linear and non-compensatory model. A questionnaire was distributed to 340 social commerce customers in Malaysia. Findings – The conceptual framework for investigating trust in mobile social commerce has various advantages and contributions for predicting consumer behaviour. The results of the study showed a positive and significant relationship between social support, presence and the unified theory of acceptance and use of technology 2 (UTAUT2). In addition, UTAUT2 fully mediated the relationship between social support, presence and trust in social commerce. Finally, the results concluded that the relationship between UTAUT2 and trust in social commerce is stronger when the diffusion of innovation is high and innovation resistance is low. Research limitations/implications – The current study provides a novel perspective on how customers can come to trust social m-commerce, offering managers real solutions for encouraging e-marketing among consumers. Practical implications – This paper shows how businesses can develop trust in social m-commerce in Malaysian markets. The findings of this study could probably be extended to other businesses in Asia or other countries, because trust in social e-commerce plays a dynamic role in consumer behaviour and intention to purchase.
Originality/value – This study provides a new perspective on mobile social commerce and pays particular attention to an investigation of this emerging form of commerce. The originality of this study lies in investigating an integrated model that includes different theories, presenting new directions for trust in mobile social commerce through social and behavioural determinants.
Previous research has interlinked alcohol consumption (AC), mental stress (MS), psychotic experiences (PE), and academic performance (AP) of students and the psychological behavior of the general population. The current study seems to be the first to consider the joint and partial mediation effects of MS and PE in linking AC to graduates’ job performance in specific areas such as teamwork (TW), communication competence (CC), customer service (CS), and job functions (JF). A virtual cross-section of 3,862 graduates with self-reported cases of having taken alcohol in the past participated in the study. These participants responded to an electronic questionnaire that was mailed to them. The instrument used for data collection had acceptable psychometric properties. The study used partial least squares structural equation modeling (PLS-SEM) to achieve its objectives. The inner and outer models were evaluated for quality and goodness of fit. Results showed a significant negative effect of AC and MS on graduates’ job performance in terms of TW, CC, CS, and JF, respectively. AC had a significant positive effect on MS and PE. MS had a significant positive effect on PE. A significant joint mediation effect of MS and PE was found in linking AC to graduates’ TW, CC, and CS, but not JF. MS partially mediated AC’s paths to all the graduates’ job performance indicators. PE was only a significant partial mediator of the connection between AC and JF, but not TW, CC, and CS. This study’s results can help improve graduates’ work effectiveness and have revealed some negative predictors. Therefore, it is recommended that graduates avoid alcohol, or consume only mild quantities of it, to enable them to discharge their services effectively at the workplace.
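The mediation logic described above, an indirect effect estimated as the product of the path into the mediator and the path out of it, can be sketched on simulated data. This is a hedged single-mediator toy (AC raises MS, MS lowers job performance), not the study's multi-mediator PLS-SEM model; all path values are assumed:

```python
import random

random.seed(1)

# Simulate a single mediation chain AC -> MS -> JP with a direct AC -> JP path.
# Assumed true paths: a = 0.6 (AC -> MS), b = -0.5 (MS -> JP), direct = -0.2.
n = 5000
ac = [random.gauss(0, 1) for _ in range(n)]
ms = [0.6 * x + random.gauss(0, 1) for x in ac]
jp = [-0.2 * x - 0.5 * m + random.gauss(0, 1) for x, m in zip(ac, ms)]

def centre(v):
    mu = sum(v) / len(v)
    return [u - mu for u in v]

def slope(y, x):
    # Simple-regression slope on centred variables.
    y, x = centre(y), centre(x)
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def partial_slopes(y, x1, x2):
    # Two-predictor OLS via the 2x2 normal equations on centred variables;
    # returns the coefficients on x1 and x2.
    y, x1, x2 = centre(y), centre(x1), centre(x2)
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

a_hat = slope(ms, ac)                            # path into the mediator
direct_hat, b_hat = partial_slopes(jp, ac, ms)   # direct path and path out
indirect_effect = a_hat * b_hat                  # mediated (indirect) effect
```

In a full PLS-SEM analysis these paths are estimated on latent composites and the indirect effect is tested by bootstrapping, but the product-of-paths quantity is the same.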
The purpose of this paper is to empirically explore how the balanced use of performance measurement systems mediates the effects of intellectual capital dimensions, including human, organizational and social capital, on firm performance. The data were collected from a survey of 448 Vietnamese managers in the Information and Communication Technology sector, and the proposed hypotheses were tested using partial least squares regression and a structural modeling technique appropriate for highly complex predictive models. Findings from the hypothesis tests indicated that firms with higher levels of the intellectual capital dimensions place a premium on the balanced use of performance measurement systems in a diagnostic and interactive style. Furthermore, the results also provide some evidence that intellectual capital dimensions affect firm performance indirectly through performance measurement systems.
This research work examined how major macroeconomic variables in Nigeria, such as Gross Domestic Product (GDP), Gross Fixed Capital Formation (GFCF) and National Savings (NS), reacted to International Monetary Fund (IMF) conditionality from 1986 to 2016. Many policy makers and researchers have questioned the benefits of IMF credit facilities to developing nations. This work therefore seeks to evaluate the impact of IMF conditionality, such as Reduction in Government Expenditure (TGE), Devaluation of Local Currencies (RER), and Trade Openness (TO), on the identified macroeconomic variables in Nigeria. The data for the analysis were sourced from the data bank of the World Bank. The Granger causality test and the Ordinary Least Squares (OLS) method were used to test the formulated hypotheses. The results revealed that IMF conditionality has a significant effect on the GDP, GFCF and NS of Nigeria. Devaluation of the local currency is the IMF conditionality that exerts the greatest negative influence on the economic growth of Nigeria. The work recommends, among other things, that instead of currency devaluation, protectionist policies via guided liberalization should be promoted, combined with the use of fiscal policy, in order to encourage local production and the usage of locally produced goods. TGE showed a significant positive effect on GDP, GFCF and NS, an indication that the government can positively influence the economic position of Nigeria with the use of fiscal policy.
Sustainability is a crucial issue for many sectors in Malaysia, including the manufacturing sector. Many businesses, especially in the chemical manufacturing industry, aim to achieve a sustainable business through the implementation of green practices. Green practices provide guidelines for employees to sustain the organisation in a sustainable manner while carrying out the required manufacturing activities. Focusing on that, this study aimed to examine the effects of green practices on corporate sustainability performance with Islamic work ethics, organisation size, and organisation age as moderators. Using the stratified random sampling technique, 344 chemical manufacturing organisations in Malaysia were invited to participate in a survey. Data from 130 completed questionnaire sets were subjected to partial least squares (PLS) analysis. The results demonstrated significant effects of green practices on corporate sustainability performance via Islamic work ethics and organisation size. However, organisation age was found to exhibit no moderation effect on the relationship between green practices and corporate sustainability performance. Conclusively, as part of the organisational strategies, the sustainability of chemical manufacturing organisations must involve the successful implementation of green practices, Islamic work ethics, and organisation size. This study offered several theoretical and practical contributions concerning green practices, Islamic work ethics, organisation size, organisation age, and corporate sustainability performance. Theoretically, this study extended the literature on the resource-based view theory, the natural-resource-based view theory, and stakeholder theory. The Al-Quran and hadith were used to support the links among the variables under study, particularly in the context of chemical manufacturing organisations in Malaysia.
Practically, this study was expected to assist chemical manufacturers in selecting the appropriate green practices to achieve corporate sustainability performance and good implementation of Islamic work ethics. Additionally, it is recommended that future research explore other types of industries in the manufacturing sector, given the focus of this study on the chemical manufacturing industry only.
Simple regression (regression analysis with a single explanatory variable) and multiple regression (regression models with multiple explanatory variables) typically correspond to very different biological questions. The former uses regression lines to describe univariate associations. The latter describes the partial, or direct, effects of multiple variables, conditioned on one another. We suspect that the superficial similarity of simple and multiple regression leads to confusion in their interpretation. A clear understanding of these methods is essential, as they underlie a large range of procedures in common use in biology. Beyond simple and multiple regression in their most basic forms, understanding the key principles of these procedures is critical to understanding, and properly applying, many methods, such as mixed models, generalised models, and causal inference using graphs (including path analysis and its extensions). A simple, but careful, look at the distinction between these two analyses is valuable in its own right, and can also be used to clarify widely held misconceptions about collinearity (correlations among explanatory variables). There is no general sense in which collinearity is a problem. We suspect that the perception of collinearity as a hindrance to analysis stems from misconceptions about the interpretation of multiple regression models, and so we discuss these misconceptions in this light. In particular, collinearity causes multiple regression coefficients to be less precisely estimated than corresponding simple regression coefficients. This should not be interpreted as a problem, as it is perfectly natural that direct effects should be harder to characterise than univariate associations. Purported solutions to the perceived problems of collinearity are detrimental to most biological analyses.
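The central claim above, that collinearity widens the sampling variability of partial (multiple-regression) coefficients relative to simple-regression coefficients while the partial coefficients remain the unbiased estimates of the direct effects, can be checked with a short simulation (all parameter values are illustrative):

```python
import random
import statistics

random.seed(3)

def dataset(n=200, rho=0.9):
    # Two collinear explanatory variables (correlation rho), each with a
    # true direct effect of 1 on the response.
    x1 = [random.gauss(0, 1) for _ in range(n)]
    x2 = [rho * a + (1 - rho ** 2) ** 0.5 * random.gauss(0, 1) for a in x1]
    y = [a + b + random.gauss(0, 1) for a, b in zip(x1, x2)]
    return x1, x2, y

def centre(v):
    mu = sum(v) / len(v)
    return [u - mu for u in v]

def simple_slope(y, x):
    # Univariate association of y with x (simple regression).
    y, x = centre(y), centre(x)
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def partial_slope_x1(y, x1, x2):
    # Coefficient on x1 in the two-predictor multiple regression,
    # via the 2x2 normal equations on centred variables.
    y, x1, x2 = centre(y), centre(x1), centre(x2)
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    return (s22 * s1y - s12 * s2y) / (s11 * s22 - s12 * s12)

simple_est, partial_est = [], []
for _ in range(300):
    x1, x2, y = dataset()
    simple_est.append(simple_slope(y, x1))
    partial_est.append(partial_slope_x1(y, x1, x2))

sd_simple = statistics.pstdev(simple_est)    # precise, but estimates 1 + rho
sd_partial = statistics.pstdev(partial_est)  # noisier, but centred on the true 1
```

The simple-regression slope is tightly estimated but converges on the univariate association (about 1 + rho here), while the partial slope is centred on the true direct effect of 1 at the cost of a larger standard deviation, exactly the trade-off the passage describes.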
Following speech act theory, we take hypotheses and assertions to be linguistic acts with different illocutionary forces. We assume that a hypothesis is justified if there is at least a scintilla of evidence for the truth of its propositional content, while an assertion is justified when there is conclusive evidence that its propositional content is true. Here we extend the logical treatment of assertions given by Dalla Pozza and Garola by outlining a pragmatic logic for assertions and hypotheses. On the basis of this extension we analyse the standard logical opposition relations for assertions and hypotheses. We formulate a pragmatic square of oppositions for assertions and a hexagon of oppositions for hypotheses. Finally, we give a mixed hexagon of oppositions to point out the opposition relations between assertions and hypotheses.
Pre-emption cases have been taken by almost everyone to imply the unviability of the simple counterfactual theory of causation. Yet there is ample motivation from scientific practice to endorse a simple version of the theory if we can. There is a way in which a simple counterfactual theory, at least if understood contrastively, can be supported even while acknowledging that intuition goes firmly against it in pre-emption cases—or rather, only in some of those cases. For I present several new pre-emption cases in which causal intuition does not go against the counterfactual theory, a fact that has been verified experimentally. I suggest an account of framing effects that can square the circle. Crucially, this account offers hope of theoretical salvation—but only to the counterfactual theory of causation, not to others. Again, there is experimental support for this account.
This paper explores, quite tentatively, possible consequences for the concept of semantics of two phenomena concerning meaning and interpretation, viz., radical interpretation and the normativity of meaning. Both, it will be argued, challenge the way in which meaning is conceived of in semantics, and thereby the status of the discipline itself. For several reasons it seems opportune to explore these issues. If one reviews the developments in semantics over the past two decades, one observes that quite a bit has changed, and one may well wonder how to assess these changes. This relates directly to the status of semantics. If semantics is an empirical discipline, one might expect that most changes are informed by empirical considerations. However, one may also note that the core notion of semantics, meaning, is today conceived of quite differently than in, say, the seventies. How can that be? How can what semantics is about be different now from what it was back then? Or is this perhaps an indication that semantics is not as empirical as it is often thought to be? Moreover, it seems that in some deep sense meaning as explicated in semantics and interpretation as studied in various philosophical approaches are strangely at odds. Meaning is what interpretation is concerned with: meaning is, at least so it seems, what in the process of interpretation language users try to recover (or, analogously, what they try to convey in production). Yet the way meaning is conceived of in semantics seems not to square all that neatly with how the process of interpretation is supposed to proceed. In particular, it seems to lack some of the intrinsic features that various approaches to interpretation assume it to have. Given these discrepancies, one wonders how the two can be incorporated within a single theory. And that such a theory is desirable goes, it may be presumed, without saying. These are the reasons that figure in this paper. In the background there are some others.
A longstanding orthodoxy holds that the Mohists regard the promotion of li (benefit, 利) as their ultimate normative criterion, meaning that they measure what is yi (just, 義) or buyi (unjust, 不義) depending on whether it maximizes li or not. This orthodoxy dates back at least to Joseph Edkins (1859), who saw Mozi as a utilitarian and an ally of Bentham. In this paper, we will argue that this orthodoxy should be reconsidered because it does not square with several passages from the Mozi. That the Mohists place strong weight on the promotion of 'li for the whole world (tianxia zhi li, 天下之利)' is uncontroversial. We argue, however, that in certain cases the Mohist moral calculus diverges in its rationale or outcome from li-promotionalism. This position rejects the orthodoxy by showing that Mohism and li-promotionalism are not entirely coterminous.
How about a different take on the rich and famous? First the obvious: the Harry Potter novels are primitive superstition that encourages children to believe in fantasy rather than take responsibility for the world, which is of course the norm. JKR is just as clueless about herself and the world as most people, but about 200 times as destructive as the average American and about 800 times more than the average Chinese. She has been responsible for the destruction of maybe 30,000 hectares of forest to produce these trash novels, and for all the ensuing erosion (not trivial, as it's at least 6 and maybe 12 tons/year of soil into the ocean for everyone on earth, or maybe 100 tons per American, and so about 5000 tons/year for Rowling's books and movies and her 3 children). The earth loses at least 1% of its topsoil every year, so as 2100 nears, most of its food-growing capacity will be gone. Then there is the huge amount of fuel burned and waste made to produce and distribute the books and films, plastic dolls, etc. She shows her lack of social responsibility by producing children rather than using her millions to encourage family planning or buy up the rain forest, and by promoting the conventional liberal stupidity of third-world supremacism that is destroying Britain, America, the world, and her descendants' future. Of course, she's not that different from the other 7.8 billion clueless, just noisier and more destructive.

It is the no-free-lunch problem writ large. The mob just can't see that there is no such thing as helping one person without harming others. Rights or privileges given to new entrants into an overcrowded world can only diminish those of others. In spite of the massive ecological disasters happening in front of them everywhere every day, they can't pin them to the unrestrained motherhood of "the diverse", which accounts for most of the population increase of the last century and all of it in this one.
They lack some combination of the intelligence, education, experience, and sanity required to extrapolate the daily assaults on the resources and functioning of society to the eventual collapse of industrial civilization. Each meal, each trip by car or bus, each pair of shoes is another nail in the earth's coffin. It has likely never crossed her mind that one seat on a plane from London to San Francisco produces about one ton of carbon, which melts about 3 square meters of sea ice, and as one of the overprivileged she has probably flown hundreds of such flights.

Not only the rich and famous, but nearly any public figure at all, including virtually all teachers, are pressured to be politically correct, which in the Western democracies now means social democratic (neo-Marxist, i.e., diluted communist) third-world supremacists working for the destruction of their own societies and their own descendants. So those whose lack of education, experience, intelligence, and basic common sense should prohibit them from making any public statements at all totally dominate all the media, creating the impression that the intelligent and civilized must favor democracy, diversity, and equality, while the truth is that these are the problems and not the solutions, and that they themselves are the prime enemies of civilization. See my 'Suicide by Democracy' 2nd ed (2019).

Those wishing a comprehensive, up-to-date framework for human behavior from the modern two-systems view may consult my book 'The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle' 2nd ed (2019). Those interested in more of my writings may see 'Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019' 3rd ed (2019), 'The Logical Structure of Human Behavior' (2019), and 'Suicidal Utopian Delusions in the 21st Century' 4th ed (2019).