"It is the purpose of this article to attempt to re-examine the account of Thrasymachus' doctrine in Plato's Republic, and to show how it can form a self-consistent whole. [...] In this paper it is maintained that Thrasymachus is holding a form of [natural right]." Note: Volume 40 = new series 9.
The paper is devoted to the sophistic method of "two-fold arguments" (antilogic). The traditional understanding of antilogic as an expression of the agonistic and eristic tendencies of the sophists has in recent decades, under the influence of G.B. Kerferd, been replaced by an understanding of antilogic as an independent argumentative technique with its own sources, essence, and goals. Following Kerferd's interpretation, according to which the foundation of antilogic is the opposition of two logoi resulting from contradictions or opposites necessarily associated with the contradictory character of the sensible world, the paper argues that the philosophical basis of antilogic should be sought in the presentation of the views attributed to Protagoras and the "adherents of flux" in Plato's dialogue Theaetetus.
Studies of Proverbs make it evident that the book has a number of authors and was compiled over an extended period of time. Bible scholars differ in their opinions concerning the authorship and date of compilation of the book. Some critics believe that references to the names of some authors of Proverbs are symbolic, and some believe that the final compilation date of the book was around the 2nd century B.C. On the other hand, there are those who believe that the names of the authors in question are literal and denote actual persons who really existed, and others who reject the 2nd-century date of final compilation and instead advocate a final compilation date of around the 7th century B.C. The evidence favours literal names for the authors in question and the earlier final compilation date of around the 7th century B.C. rather than symbolic names and a 2nd-century final compilation. It is the purpose of this paper to present that evidence.
This terse analysis of Christianity may help to provide a basis for understanding its true meaning and application. The foundational texts of 1 Corinthians 2:16 and Philippians 2:5, as well as Biblical Christian marriages, are used here as exemplars that illustrate the definitive elements of the phenomenon and its practice.
This brief, reflective research looks analytically at the impact of Greek philosophy on Christianity from three perspectives. They are: 1) the challenge that it presented to Christianity, 2) the signs of syncretism, and 3) Christian differentiation despite assimilation of aspects of Greek philosophy. Though not exhaustive because of its brevity, the study may help with discussions on the backgrounds of Christianity, and also stimulate an interest in the religion, politics, and history of the Levant in the first century.
The New Testament Writings and the Septuagint were possibly compiled in Hellenism's greatest period of influence. It is reasonable to say that the writings were influenced by Hellenism because they were written in the language of Hellenism. This study examines how the hegemony of Hellenism, the worldviews of Hellenists, and the current of anti-Semitism impacted the New Testament writers and shaped both why and how they wrote.
This is a three-part discussion of linguistic relativity and the New Testament, offering some perspectives for understanding the inter-relatedness of society, culture, and language as they would have impacted the writers of the New Testament. The ideas discussed should provide useful information for further research into the application of modern linguistics to New Testament hermeneutics, systematic theology, and biblical exegesis. The implications of linguistic relativity theory applied to this genre of literature are of great importance in light of the resurgence of interest and work in biblical languages and modern linguistics over the last quarter century. The tripartite analysis focuses on: the transcription of oral tradition, the influence of languages on the autographs, and the implications of linguistic relativity for exegesis.
Language sophistication denotes the development of a language that incorporates differentiation or diversity, constrained by an integration that facilitates organization or unity. This prelude provides the backdrop for discussing language sophistication. Of necessity, any language that was a part of the continuum of salvation history (Heilsgeschichte) should: 1) possess the sophistication necessary to re-define OT terminology, 2) have the hegemony to launch the NT church, 3) enjoy the universality that allowed for translation into contemporary languages, and 4) retain the portability that enables interpretation in post-modern languages.
For the NT writers, transcription was the process of recording the Christological/theological events of their time on papyrus or vellum. The aim here is to help the reader understand that the passage from spoken word to papyrus or vellum was not dictation or a simple copying process but a very arduous procedure, subject to the practices of oral expression and limited by the orthography of the target (writing) language. Writers were not simply copying; they were re-interpreting oral traditions for the purpose of making a literary record.
The grammatical forms and material of the book of Revelation suggest a complex interplay of Old Testament and 1st century literature and language. At the same time, the book has a peculiarity and character of its own that is unparalleled in the literate world. Various analytical tools, including historical-comparative methodologies, have been employed to reconstruct the linguistic paradigm of the book. Artificial intelligence and its derivatives provide alternate methods of probing this paradigm.
Instead of the usual dialectics that have now become very familiar in the evolution vs. creation polemic, this article examines the different views rationally, adopting an eclectic approach that draws on evidence from secular history, cosmology, existential philosophy, systematic theology, and Biblical manuscripts in order to better understand the mind of God and the cosmos.
The style, tone and tenor of the New Testament writers are unique and exceptional. Jesus of Nazareth, Hebraic roots, Old Testament literature, oral tradition, Hellenistic influence, Roman governance, 1st century socio-politics, and multifarious linguistic elements combined to immortalize their literary records and make them indelible in the minds of contemplative readers. This book acknowledges previous work and seeks to connect the thoughts gleaned from them to seminal ideas that have their locus in the inquiry of how language can influence thought and vice-versa. The information has been sourced from the fields of religion, biblical studies, linguistics, English literature, grammar learning, geo-politics, philology, psychology, philosophy, and artificial intelligence as they have to do with thought, language, and society. The carefully quoted and paraphrased opinions of writers from the related disciplines since the fourth century BCE are presented here. However, the volume reflects the very marked influence of 20th century thinkers who have had the benefit of perusing the work of their predecessors and 21st century pioneers who have at their disposal ultra-modern technology, computing power, and easy access to a global community to which the ideas may be applied for comparative analysis.
Fruitful dialogue between faith and science occurs when Christians and scientists meaningfully answer the questions that they pose to each other in a disciplined conversation that refines their perspectives iteratively, each respecting the diligent, conscientious work of the other. When questions are answered in this way, the answers form the basis for the continuation of the discussion, and agreed solutions chart principles and procedures for feedback and modification.
This paper introduces a new, expanded range of relevant cognitive psychological research on collaborative recall and social memory to the philosophical debate on extended and distributed cognition. We start by examining the case for extended cognition based on the complementarity of inner and outer resources, by which neural, bodily, social, and environmental resources with disparate but complementary properties are integrated into hybrid cognitive systems, transforming or augmenting the nature of remembering or decision-making. Adams and Aizawa, noting this distinctive complementarity argument, say that they agree with it completely: but they describe it as "a non-revolutionary approach" which leaves "the cognitive psychology of memory as the study of processes that take place, essentially without exception, within nervous systems." In response, we carve out, on distinct conceptual and empirical grounds, a rich middle ground between internalist forms of cognitivism and radical anti-cognitivism. Drawing both on extended cognition literature and on Sterelny's account of the "scaffolded mind" (this issue), we develop a multidimensional framework for understanding varying relations between agents and external resources, both technological and social. On this basis we argue that, independent of any more "revolutionary" metaphysical claims about the partial constitution of cognitive processes by external resources, a thesis of scaffolded or distributed cognition can substantially influence or transform explanatory practice in cognitive science. Critics also cite various empirical results as evidence against the idea that remembering can extend beyond skull and skin. We respond with a more principled, representative survey of the scientific psychology of memory, focussing in particular on robust recent empirical traditions for the study of collaborative recall and transactive social memory. We describe our own empirical research on socially distributed remembering, aimed at identifying conditions for mnemonic emergence in collaborative groups. Philosophical debates about extended, embedded, and distributed cognition can thus make richer, mutually beneficial contact with independently motivated research programs in the cognitive psychology of memory.
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697), to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa, which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--233; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid, whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p. 237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions": indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED 14 March 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th-century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently.
It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition. In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P, Q / R becomes P, ~R / ~Q or ~R, Q / ~P. Saccheri's work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word 'deduction' is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
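Because the reduction schema above is purely syntactic, it can be made concrete in a few lines of code. The sketch below only illustrates the schema as stated in the review; the string representation of formulas and the function names are mine, not the reviewer's.

```python
# Illustrative sketch of indirect reduction as a syntactic transformation on
# arguments, represented as (premises, conclusion). Formulas are plain
# strings; negate() produces the contradictory opposite.

def negate(p: str) -> str:
    """Return the contradictory opposite of a formula."""
    return p[1:] if p.startswith("~") else "~" + p

def indirect_reductions(premises, conclusion):
    """Yield each argument obtained by replacing one premise with the negated
    conclusion and concluding the negated replaced premise:
    P, Q / R  becomes  ~R, Q / ~P  or  P, ~R / ~Q."""
    for i, premise in enumerate(premises):
        new_premises = premises[:i] + [negate(conclusion)] + premises[i + 1:]
        yield new_premises, negate(premise)

for prems, concl in indirect_reductions(["P", "Q"], "R"):
    print(", ".join(prems), "/", concl)
# ~R, Q / ~P
# P, ~R / ~Q
```

Note that the transformation never adds an assumption: the new argument has exactly as many premises as the old, which is the contrast with indirect deduction drawn above.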
Practicing clinicians frequently think about behaviors both abstractly (i.e., in terms of symptoms, as in the Diagnostic and Statistical Manual of Mental Disorders, 5th ed., DSM–5; American Psychiatric Association, 2013) and concretely (i.e., in terms of individual clients, as in DSM–5 Clinical Cases; Barnhill, 2013). Does abstract/concrete framing influence clinical judgments about behaviors? Practicing mental health clinicians (N = 74) were presented with hallmark symptoms of 6 disorders framed abstractly versus concretely, and provided ratings of their biological and psychological bases (Experiment 1) and the likely effectiveness of medication and psychotherapy in alleviating them (Experiment 2). Clinicians perceived behavioral symptoms in the abstract to be more biologically and less psychologically based than when concretely described, and medication was viewed as more effective for abstractly than concretely described symptoms. These findings suggest a possible basis for miscommunication and misalignment of views between primarily research-oriented and primarily practice-oriented clinicians; furthermore, clinicians may accept new neuroscience research more strongly in the abstract than for individual clients.
Psychopaths routinely disregard social norms by engaging in selfish, antisocial, often violent behavior. Though psychopaths are commonly characterized as mentally disordered, recent evidence suggests that they are executing a well-functioning, if unscrupulous, strategy that historically increased reproductive success at the expense of others. Natural selection ought to have favored strategies that spared close kin from harm, however, because actions affecting the fitness of genetic relatives contribute to an individual's inclusive fitness. Conversely, there is evidence that mental disorders can disrupt psychological mechanisms designed to protect relatives. Thus, mental disorder and adaptation accounts of psychopathy generate opposing hypotheses: psychopathy should be associated with an increase in the victimization of kin in the former account but not in the latter. Contrary to the mental disorder hypothesis, we show here in a sample of 289 violent offenders that variation in psychopathy predicts a decrease in the genetic relatedness of victims to offenders; that is, psychopathy predicts an increased likelihood of harming non-relatives. Because nepotistic inhibition in violence may be caused by dispersal or kin discrimination, we examined the effects of psychopathy on (1) the dispersal of offenders and their kin and (2) sexual assault frequency (as a window on kin discrimination). Although psychopathy was negatively associated with coresidence with kin and positively associated with the commission of sexual assault, it remained negatively associated with the genetic relatedness of victims to offenders after removing cases of offenders who had coresided with kin and cases of sexual assault from the analyses. These results stand in contrast to models positing psychopathy as a pathology, and provide support for the hypothesis that psychopathy reflects an evolutionary strategy largely favoring the exploitation of non-relatives.
It is forecast that billions of devices will be connected to the Internet by 2020. These devices will produce a huge amount of data that will have to be handled rapidly and feasibly, and handling it while meeting security requirements and time constraints will become a challenge for real-time applications. The main strengths of cloud computing are on-demand service and scalability; therefore the data generated by IoT devices are generally handled in cloud infrastructure. However, handling IoT application requests exclusively in the cloud is not an efficient solution for some IoT applications, particularly time-sensitive ones. These issues can be addressed by a newer paradigm called fog computing, which has become a major field of research from both academic and industry perspectives. This paper surveys ongoing research on several issues in fog computing. Finally, it highlights open issues in fog computing with IoT that will determine future research directions for implementing the fog computing paradigm.
The New Evil Demon problem has been hotly debated since the case was introduced in the early 1980s (e.g. Lehrer and Cohen 1983; Cohen 1984), and there has been a recent surge of interest in the topic. In a forthcoming collection of papers on the New Evil Demon problem (Dutant and Dorsch, forthcoming), at least two of the papers, both by prominent epistemologists, attempt to resist the problem by appealing to the distinction between justification and excuses. My primary aim here is to critically evaluate this new excuse maneuver as a response to the New Evil Demon problem.

Their response attempts to give us reason to reject the idea that victims of the New Evil Demon have justification for believing as they do. I shall argue that this approach is ultimately unsuccessful, though much of value can be learned from these attempts. In particular, progress in the debate can be made by following those who advance the excuse maneuver in making explicit the connection between epistemic justification and epistemic norms. Doing so clarifies the questions being debated, as well as the methodology being used to attempt to answer them.
Varzi (2005) discussed 6 ways of symbolizing the sentence 'If Alf went to the movies then Beth went too, but only if she found a taxi-cab.' In the present reply, a seventh symbolization is offered, along with an analysis of the six alternatives discussed by Varzi.
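For orientation, let A = 'Alf went to the movies', B = 'Beth went', and T = 'Beth found a taxi-cab'. Two candidate symbolizations, offered purely as illustrations of the kind of ambiguity at issue (not as any of the seven discussed in the exchange), differ over how far the 'only if' rider reaches:

$$(A \to B) \land (B \to T) \qquad \text{versus} \qquad A \to \bigl(B \land (B \to T)\bigr)$$

The first attaches 'but only if she found a taxi-cab' as an independent conjunct; the second confines it to the consequent of the conditional.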
You may not know me well enough to evaluate me in terms of my moral character, but I take it you believe I can be evaluated: it sounds strange to say that I am indeterminate, neither good nor bad nor intermediate. Yet I argue that the claim that most people are indeterminate is the conclusion of a sound argument—the indeterminacy paradox—with two premises: (1) most people are fragmented (they would behave deplorably in many and admirably in many other situations); (2) fragmentation entails indeterminacy. I support (1) by examining psychological experiments in which most participants behave deplorably (e.g., by maltreating "prisoners" in a simulated prison) or admirably (e.g., by intervening in a simulated theft). I support (2) by arguing that, according to certain plausible conceptions, character evaluations presuppose behavioral consistency (lack of fragmentation). Possible reactions to the paradox include: (a) denying that the experiments are relevant to character; (b) upholding conceptions according to which character evaluations do not presuppose consistency; (c) granting that most people are indeterminate and explaining why it appears otherwise. I defend (c) against (a) and (b).
A uniform theory of conditionals is one which compositionally captures the behavior of both indicative and subjunctive conditionals without positing ambiguities. This paper raises new problems for the closest thing to a uniform analysis in the literature (Stalnaker, Philosophia, 5, 269–286 (1975)) and develops a new theory which solves them. I also show that this new analysis provides an improved treatment of three phenomena (the import-export equivalence, reverse Sobel-sequences and disjunctive antecedents). While these results concern central issues in the study of conditionals, broader themes in the philosophy of language and formal semantics are also engaged here. This new analysis exploits a dynamic conception of meaning where the meaning of a symbol is its potential to change an agent's mental state (or the state of a conversation) rather than being the symbol's content (e.g. the proposition it expresses). The analysis of conditionals is also built on the idea that the contrast between subjunctive and indicative conditionals parallels a contrast between revising and consistently extending some body of information.
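The dynamic conception of meaning invoked here admits a compact illustration. On one familiar test-style formulation (a generic example of the genre, not necessarily the clause this paper defends), updating a context c with an indicative conditional checks whether the antecedent-updated context already supports the consequent:

$$c[\text{if } A,\ B] \;=\; \begin{cases} c & \text{if } c[A][B] = c[A],\\ \emptyset & \text{otherwise.} \end{cases}$$

Meaning is thus a potential to change the state c rather than a proposition expressed.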
The disciplinary differentiation of the sciences attracted Leibniz's attention over a long period. From today's perspective his interest looks well grounded: in Leibniz's manuscripts a modern scholar finds key ideas of almost any research field, which tempts one to consider Leibniz a founder of that particular discipline. We argue that this is possible only in retrospect and would significantly distort the essence of Leibniz's epistemology. Our approach instead investigates Leibniz's doctrine of signs against the background of a related philosophical problem, that of expression. The choice of semiotics is justified by the fact that it occupied a central place in his theoretical constructions, in natural science as well as in philosophy. In Leibniz's system of knowledge, the concept of notes and signs served as the theoretical foundation of his most important and lifelong aspiration: to build up a practical science of universal characteristic. In his eyes this practical science was the science of sciences, and we can consider it the matrix for all possible scientific knowledge.
Edited proceedings of an interdisciplinary symposium on consciousness held at the University of Cambridge in January 1978. Includes a foreword by Freeman Dyson. Chapter authors: G. Vesey, R.L. Gregory, H.C. Longuet-Higgins, N.K. Humphrey, H.B. Barlow, D.M. MacKay, B.D. Josephson, M. Roth, V.S. Ramachandran, S. Padfield, and (editorial summary only) E. Noakes. A scanned pdf is available from this web site (philpapers.org), while alternative versions more suitable for copying text are available from https://www.repository.cam.ac.uk/handle/1810/245189.

Page numbering convention for the pdf version viewed in a pdf viewer is as follows: 'go to page n' accesses the pair of scanned pages 2n and 2n+1. Applicable licence: CC Attribution-NonCommercial-ShareAlike 2.0.
Minimal negation is defined within B+, the basic positive relevance logic in the relational ternary semantics. By defining a number of subminimal negations in the B+ context, principles of weak negation are shown to be isolable. Complete ternary semantics are offered for minimal negation in B+. Certain forms of reductio are conjectured to be undefinable without extending the positive logic, and complete semantics for such kinds of reductio in a properly extended positive logic are offered.
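As a point of orientation, negation clauses in this relational setting typically employ a binary (compatibility-style) relation alongside the ternary one; a clause of the following general shape is standard in the literature, though the paper's own clauses for subminimal negations may differ:

$$a \vDash \lnot A \quad\Longleftrightarrow\quad \forall b\,\bigl(Cab \Rightarrow b \nvDash A\bigr)$$

Weaker or stronger principles of negation then correspond to conditions imposed on the relation C.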
Although the word 'sustainability' is used broadly, scientific approaches to sustainability fall into one of two competing paradigms. Following the influential Brundtland report of 1987, some theorists identify sustainability with some form of resource availability, and develop indicators for sustainability that stress capital depletion. This approach has spawned debates about the intersubstitutivity of capitals, with many environmental theorists arguing that at some point, depletion of natural capital cannot be offset by increases in human or social capital. The alternative approach is grounded in stock-and-flow systems modeling, and defines sustainability through indicators that determine whether the system structure is robust (e.g. resists perturbation), resilient (recovers after disruption) and adaptive (capable of change in response to external conditions). Both paradigms have applications in economics and ecology.
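To make the second paradigm concrete, here is a minimal stock-and-flow sketch (all parameters invented for illustration): a renewable stock with logistic regrowth and a constant extraction flow, where recovery after a one-off shock plays the role of a toy resilience indicator.

```python
# A minimal stock-and-flow sketch (all parameters invented): a renewable
# stock with logistic regrowth and a constant extraction flow. Whether the
# stock returns to its equilibrium after a one-off shock is a toy version
# of a resilience indicator; its response to parameter changes would play
# the role of robustness and adaptiveness indicators.

K, r, harvest = 100.0, 0.25, 4.0  # carrying capacity, regrowth rate, outflow
stock = 80.0                      # stable equilibrium for these parameters

for t in range(60):
    if t == 10:
        stock *= 0.5              # perturbation: halve the stock
    stock += r * stock * (1 - stock / K) - harvest

print(f"stock after recovery period: {stock:.1f}")  # back near 80
```

Notice that the indicator tracks the system's behavior over time, not the level of any capital stock at a moment, which is the contrast with the depletion paradigm.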
The topic of this Handbook entry is the relationship between similarity and dimensional analysis, and some of the philosophical issues involved in understanding and making use of that relationship. It discusses the basics of the relationship between units, dimensions, and quantities, explains the significance of dimensionless parameters, and shows that similarity of physical systems is established by showing equality of a certain set of dimensionless parameters that characterizes the system behavior. Similarity is always relative -- to some system behavior. Other topics discussed: generalization of the notion of similarity; the difference between relative similarity and partial similarity; how the notion of similarity in science differs from similarity as it has been discussed in recent philosophy. Philosophers' views discussed: R. Giere, N. Goodman, P. Bridgman, and B. Ellis.
Every human activity tends to be done in a certain way, and the study of religion is no exception. Scholars have adopted various methods in studying religion. Some of these methods have been classified as unacademic, while others are academic and scientific. It is accepted that the proper way to study religion academically is through the scientific method, a systematic and objective analysis of religious phenomena (Kirkpatric ed. 1159). Other methods identified include the polymethodic, descriptive, speculative, and culture-area approaches, as well as the use of libraries, interviews, online sources and participant observation. This paper attempts to identify and critically evaluate the methods used by three scholars of African traditional religion and culture in presenting selected topics in their works. In the process, the merits and demerits of each will be highlighted and constructive suggestions given.
Three separate churches erected in Constantinople were all dedicated to the wisdom of Christ and built on the same site, one after the other. These churches were built between 360 and 537 AD by three different emperors: Constantius II, Theodosius the Younger, and Justinian I. The first two churches were consumed in flames after relatively short lives, but the final and greatest church still stands today, despite a history of extensive damage. This final edifice is the main focus of this paper, owing to its 1500-year longevity and unprecedented architecture. If the entire history of Justinian's church is to be considered, it is inaccurate to refer to it simply as a "church": although it remained a church for the first 900 years after its audacious construction, it was later converted into a mosque, and today it is a museum, in remembrance of its long history.
An analysis of the counter-intuitive properties of infinity as understood differently in mathematics, classical physics and quantum physics allows the consideration of various paradoxes under a new light (e.g. Zeno's dichotomy, Torricelli's trumpet, and the weirdness of quantum physics). It provides strong support for the reality of abstractness and mathematical Platonism, and a plausible reason why there is something rather than nothing in the concrete universe. The conclusions are far-reaching for science and philosophy.
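Torricelli's trumpet illustrates the counter-intuitive behavior compactly: rotating y = 1/x for x ≥ 1 about the x-axis gives a solid of finite volume but infinite surface area,

$$V=\pi\int_1^\infty \frac{dx}{x^2}=\pi, \qquad S=2\pi\int_1^\infty \frac{1}{x}\sqrt{1+\frac{1}{x^{4}}}\,dx \;\ge\; 2\pi\int_1^\infty \frac{dx}{x}=\infty.$$

So the horn could be 'filled with paint' yet never 'painted', exactly the sort of clash between mathematical and physical intuitions of infinity the article examines.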
Psychoanalytic self-psychology, as outlined by such depth psychologists as Jung, Fordham, Winnicott and Kohut, provides a framework for conceptualizing a relationship of complementarity between psychic and immune defence, as well as the loss of bodily and self integration in disease. Physicist Erwin Schrödinger's thesis that the so-called "arrow of time" does not necessarily deal a mortal blow to its creator is reminiscent of the concept of timeless dimensions of the unconscious mind and the Self in Analytical Psychology, manifest, for instance, in dream content and archetypal symbols. These notions are not only consistent with the concepts of timelessness and meaningful coincidence (synchronicity) in psychoanalysis. They are also implicitly spiritual, with intimations of a numinous dimension of the evolutionary process in which humanity participates, including the idea that an evolving God becomes conscious through and is completed by humankind in a process (Incarnational) theology which regards the numinous as both immanent and transcendent, and concepts of mind which transcend the individual in a transpersonal sense. The treatment of the psychophysical problem by depth psychologist Carl Jung and physicist Wolfgang Pauli, with their notion of the unconscious archetypes as timeless, cosmic ordering and regulating principles creating a bridge between mind and matter in a relationship of complementarity, is compatible with such a perspective on the numinous, which might in turn be useful for contemporary theology and spirituality.
In recent years, a trend in AI research has started to pursue human-level, general artificial intelligence (AGI). Although the AGI framework is characterised by different viewpoints on what intelligence is and how to implement it in artificial systems, it conceptualises intelligence as flexible, general-purpose, and capable of self-adapting to different contexts and tasks. Two important questions remain open: a) should AGI projects simulate the biological, neural, and cognitive mechanisms realising human intelligent behaviour? and b) what is the relationship, if any, between the concept of general intelligence adopted by AGI and that adopted by psychometricians, i.e., the g factor? In this paper, we address these questions and invite researchers in AI to open a discussion on the theoretical conceptions and practical purposes of the AGI approach.
According to the universal moral law first articulated in systematic form by Immanuel Kant, the most important and influential philosopher of the Enlightenment, one must act in such a way that one's conduct never turns a human being into a mere means. This principle, purified of culture, social conditions, tradition and religious influence, is famous for the fact that its imperative recognizes no condition or limit. Yet people, for reasons both individual and social, constantly find themselves in tension with this formal principle. On Kant's conception of morality, ethical values must not be understood through any cultural, social or political pressure or interest. These principles are universal; they cannot be regarded as confined to a national or local framework. However, the values that societies living within nationally bounded territories distill from their own cultures and traditions are not always in harmony with these universal principles. When the tension between them grows, resolution is usually sought in favor of the universal principles; that is, local values are reinterpreted. Many of the factors that play a role in the organization of societies are set aside precisely because they evidently multiply problems rather than solve them. The sectarian conflicts that raged across Europe throughout the Middle Ages, for example, are the greatest obstacle to defining moral principles through religious values. It is evident that Kant's moral philosophy is not built upon Christian ideals: at the root of the Good lies neither sin and merit nor the authority of sacred texts. One reason for this is that the religious conflicts observed in medieval Europe prevented the universalization of moral principles. Surrendering the good or the right to the priorities of a particular belief system strengthened fanaticism rather than weakening it; the view that being a good Christian comes before being a morally good person is thus open to criticism. In short, the approach places reason, rather than a religious foundation, at the basis of morality, and the religious conflicts made this choice easier. At this point national values create even greater difficulties, because the events recorded by history suggest that there is no homogeneous unity of values. In conclusion, while a widening gap between the need for a universal moral principle and the realities of the age we live in increases the danger of top-down imposition, the closing of that gap contributes to the legitimacy of social progress and development. This paper questions the possibility of universal principles giving meaning to local values, taking Kant's moral philosophy as its model.
Ordinary perceiving relies heavily on our sensing the spatial properties of objects, e.g., their shapes, sizes, and locations. Such spatial perception is central in everyday life. We safely cross a street by seeing and hearing the locations of oncoming vehicles. And we often identify objects by seeing and feeling their distinctive shapes.

To understand how we perceive spatial properties, we must explain the nature of the mental states figuring in spatial perception. The experience one has when seeing a cube, e.g., differs from the experiences one has when seeing other shapes, e.g., spheres and pyramids. We must explain how such experiences differ to fully understand how we perceive differences in the spatial properties of objects. This presents a challenge often overlooked in philosophy and cognitive science. Whereas we can differentiate physical objects by their spatial properties, we cannot differentiate the experiences involved in perception in respect of their own spatial properties. Experiences are mental states, not physical objects, so they do not themselves have spatial properties; a visual experience of a 50 ft. tall cube, e.g., isn't itself 50 ft. tall or cubical. So we must differentiate our perceptual experiences of those objects some other way, in terms of their own properties.

I argue the experiences figuring in spatial perception have mental properties distinct from, but analogous to, the spatial properties we perceive. The experience one has when seeing a square, e.g., has a property that resembles and differs from other such mental properties in ways parallel to the ways physical squares resemble and differ from other shapes. Just as squares are more similar to rectangles than triangles, the mental property of an experience of a square is more similar to that of an experience of a rectangle than that of an experience of a triangle.

I show how this theory helps solve several problems in philosophy and cognitive science: explaining change blindness, accounting for our ability to perceive combinations of distinct properties, e.g., color and shape, and determining whether the properties of experiences pertaining to the same spatial properties in different sensory modalities are themselves the same.
As the title, The Entangled State of God and Humanity, suggests, this lecture dispenses with the pre-Copernican, patriarchal, anthropomorphic image of God while presenting a case for a third-millennium theology illuminated by insights from archetypal depth psychology, quantum physics, neuroscience and evolutionary biology. It attempts to smash the conceptual barriers between science and religion and in so doing may contribute to a Copernican revolution reconciling two perspectives that have been apparently irreconcilable opposites since the sixteenth century. The published work of C.G. Jung, Wolfgang Pauli, David Bohm and Teilhard de Chardin outlines a process whereby matter evolves in increasing complexity from sub-atomic particles to the human brain and the emergence of a reflective consciousness leading to a noosphere evolving towards an Omega point. The noosphere is the envelope of consciousness and meaning superimposed upon the biosphere, a concept central to the evolutionary thought of visionary Jesuit palaeontologist Pierre Teilhard de Chardin (The Phenomenon of Man).

His central ideas, like those of Jung with his archetypes, in particular that of the Self, provide intimations of a numinous principle implicit in cosmology and the discovery that in and through humanity, evolution becomes not only conscious of itself but also directed and purposive. Although in Jung's conception it was a "late-born offspring of the unconscious soul", consciousness has become the mirror which the universe has evolved to reflect upon itself and in which its very existence is revealed. Without consciousness, the universe would not know itself. The implication for process theology is that God and humanity are in an entangled state, so that the evolution of God cannot be separated from that of humankind.

A process (Incarnational) theology inseminated by the theory of evolution is one in which humankind completes the individuation of God towards the wholeness represented, for instance, in cosmic mandala symbols (Jung, Collected Works, vol. 11). Jung believed that God needs humankind to become conscious, whole and complete, a thesis explored in my book The Individuation of God: Integrating Science and Religion (Wilmette, IL: Chiron Publications 2012). This process theology, like that implicit in the work of Teilhard de Chardin, is panentheistic, so that God is immanent in nature though not identical with it (Atmanspacher 2014: 284).
Don Ihde has characterized his philosophy as "phenomenology + pragmatism." This article argues that Ihde's pragmatism can be understood as consistency with two philosophical commitments from the first generation of American pragmatists (e.g. Peirce, James, Dewey and Addams). First, Ihde's notion of embodiment relations for tools and techniques is consistent with the organism-environment relational epistemology of these thinkers. Second, his desire to dissociate himself from romantic and neo-idealist readings of the phenomenological tradition links him with their naturalism.
We can distinguish two concepts of respect for persons: appraisal respect, an attitude based on a positive appraisal of a person's moral character, and recognition respect, the practice of treating persons with consideration based on the belief that they deserve such treatment. After engaging in an extended analysis of these concepts, I examine two "truisms" about them: (1) we justifiably believe of some persons that they have good character and thus deserve our esteem; (2) frequently it pays to be disrespectful, e.g., insulting those who insult us may put them in their place. By using empirical results from social and personality psychology and techniques from decision theory in addition to conceptual considerations, I argue that, surprisingly, these two "truisms" are false. Extensive psychological evidence indicates that most persons are indeterminate---overall neither good nor bad nor intermediate---and that our information about specific persons almost never distinguishes those who are indeterminate from those who are not. The strategy of habitually avoiding disrespectful behavior maximizes long-term expected utility. In sum, we have good pragmatic reason to treat persons respectfully, but we have good epistemic reason to avoid esteeming or despising them.
To follow the legacy of Dr. B.R. Ambedkar, a RUSA-sponsored One-Day Faculty Development Programme on "Dr. B.R. Ambedkar, Indian Constitution and Indian Society", organised by the Department of Philosophy and the P.G. Department of Public Administration and held on 20th January 2016, was a creative and fruitful effort to bring together scholars and academicians from several disciplines to participate in deliberations on the conceptual understanding and insights of the philosophy of Dr. B.R. Ambedkar.
The purpose of this paper is to provide a detailed technical protocol analysis of chess masters' evaluative expertise, paying particular attention to the structure of their memory process in evaluating foreseen possibilities in games of dynamic equilibrium. The paper has two purposes: first, to publish a results chapter from my DPhil thesis (in revised journal-article form) attending to the measurement of foresight in chess masters' evaluation process, testing alternative theories of cognitive expertise in the domain of chess; and second, to provide a subset of the technical graphical analysis that corresponds to that measurement, preserving this protocol analysis for access in the academic domain for future studies of expert memory and foresight (e.g., Ericsson & Simon, 1993). The step-by-step protocol analysis consists of: (i) an introduction to foresight cognition as hypothesis testing, (ii) a theoretical review of chess masters' expertise according to the frameworks in that field proposing hypotheses relevant to chess masters' evaluative skill processes, and (iii) summary tables and non-parametric statistical analysis corroborating chunking-theory frameworks of expert cognition (e.g., DeGroot, 1965; Newell & Simon, 1972; Gobet, 1998; Gobet et al., 2004) and refuting the alternative search-evaluation models (e.g., Holding & Reynolds, 1982). Moreover, the article espouses the preservation of the traditional protocol analysis method core to the field of expert cognition (DeGroot, 1969; Kotov, 1971). The full protocol analysis can be found in monograph form on my SSRN profile in 'The role of falsification in hypothesis testing'. It takes the form of a specialist population study (e.g., detailed case study work; Luria, 1987). Thus the outline consists of a short introduction; a theoretical-methodological review discussing protocol analysis methods for specialist population studies in cognition (with particular attention to the preservation of protocol analysis methods for chess studies in cognition and expert memory, and with a fresh angle on the foresight process); and the full set of protocol analyses with corresponding problem behaviour graphs. A subset of the main results has been published elsewhere (e.g., Cowley & Byrne, 2004; Cowley 2006), receiving scientific and journalistic acclaim (e.g., Nature Online News 2004).
According to the mental model theory, causes and enablers differ in meaning, and therefore in their logical consequences (Goldvarg & Johnson-Laird, 2001). They are consistent with different possibilities. Recent psychological studies have argued to the contrary, and suggested that linguistic cues guide this distinction (Kuhnmünch & Beller, 2005). The issue is important because neither British nor American law recognizes this distinction (e.g., Roberts & Zuckerman, 2004). Yet, in our view, it is central to human conceptions of causality. Hence, in two experiments, we examined our participants' ability to distinguish between causes and enablers in scenarios describing the actions of two agents and a subsequent outcome, e.g.: 'Mary threw a lighted cigarette into a bush. Just as the cigarette was going out, Laura deliberately threw petrol on it. The resulting fire burnt down her neighbor's house.' Here Mary enabled the fire to occur, whereas Laura caused the fire to occur.
There are two competing theoretical frameworks with which cognitive science examines how people reason. These frameworks are broadly categorized into logic and probability. This paper reports two applied experiments testing which framework better explains how people reason about evidence in criminal cases. Logical frameworks predict that people derive conclusions from the presented evidence to endorse an absolute value of certainty such as 'guilty' or 'not guilty' (e.g., Johnson-Laird, 1999). Probabilistic frameworks predict that people derive conclusions from the presented evidence using knowledge of prior instances, endorsing conclusions of guilt that vary in certainty (e.g., Tenenbaum, Griffiths, & Kemp, 2006). Experiment 1 showed that reasoning about evidence of prior instances, such as disclosed prior convictions, affected participants' underlying ratings of guilt: participants' guilt ratings increased in certainty with the number of disclosed prior convictions. Experiment 2 showed that participants' reasoning about evidence of prior convictions and some forensic evidence tended to lead them to endorse biased 'guilty' verdicts when rationally the evidence does not prove guilt. Both results are predicted by probabilistic frameworks. The paper considers the implications for logical and probabilistic frameworks for reasoning in the real world.
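The probabilistic framework's qualitative prediction can be sketched in a few lines (the numbers below are invented for illustration and are not the paper's data): treating each disclosed prior conviction as evidence with a likelihood ratio above 1 yields guilt ratings that rise gradually rather than flipping between absolute values.

```python
# A minimal sketch (invented numbers, not the paper's data) of the
# probabilistic framework's qualitative prediction: each disclosed prior
# conviction, treated as evidence with a likelihood ratio above 1, raises
# the rated probability of guilt by a graded amount rather than flipping
# a binary guilty/not-guilty value.

def update(prior: float, likelihood_ratio: float) -> float:
    """One Bayesian update in odds form: posterior odds = LR * prior odds."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = likelihood_ratio * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

p = 0.50  # neutral starting point (an assumption of this sketch)
for n in range(4):
    print(f"{n} disclosed prior convictions: P(guilt) = {p:.2f}")
    p = update(p, likelihood_ratio=1.5)  # assumed LR per conviction
```

A logical framework, by contrast, would map the same evidence onto one of the discrete endorsements 'guilty' or 'not guilty', which is the contrast the experiments exploit.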
DNA evidence is one of the most significant modern advances in the search for truth since cross-examination, but its format as a random match probability makes it difficult for people to assign an appropriate probative value (Koehler, 2001). While frequentist theories propose that presenting the match as a frequency rather than a probability facilitates more accurate assessment (e.g., Slovic et al., 2000), exemplar-cueing theory predicts that the subjective weight assigned may be affected by the frequency or probability format, and by how easily examples of the event, i.e., 'exemplars', are generated from linguistic cues that frame the match in light of further evidence (Koehler & Macchi, 2004). This paper presents two juror research studies examining the difficulties that jurors have in assigning appropriate probative value to DNA evidence when contradictory evidence is presented. Study 1 showed that refuting evidence significantly reduced guilt judgments when exemplars were linguistically cued, even when the probability match and the refuting evidence had the same objective probative value. Moreover, qualitative reason-for-judgment responses revealed that interpreting refuting evidence is complex and not necessarily reductive; refutation was found indicative of innocence or guilt depending on whether exemplars had been cued or not. Study 2 showed that the introduction of judges' directions to linguistically cue exemplars did not increase the impact of refuting evidence beyond its objective probative value, but fewer guilty verdicts were returned when jurors were instructed to consider all possible explanations of the evidence. The results are discussed in light of the contradictory frequentist and exemplar-cueing theoretical positions, and their real-world consequences.
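A worked example (with invented numbers) shows both presentation formats at issue and why assigning probative value is hard: in Bayesian terms a reported match carries a likelihood ratio of roughly 1/RMP, which with a large pool of possible sources still leaves a modest posterior.

```python
# A hedged illustration (all numbers invented) of the two presentation
# formats and of the probative value a reported DNA match carries in
# Bayesian terms, where the likelihood ratio is roughly 1/RMP.

rmp = 1e-6              # random match probability (assumed)
population = 1_000_000  # size of the pool of possible sources (assumed)

print(f"Probability format: P(match | not the source) = {rmp:.0e}")
print(f"Frequency format: about {rmp * population:.0f} in {population:,} "
      f"people would match by chance")

prior_odds = 1 / (population - 1)   # uniform prior over the pool
posterior_odds = prior_odds / rmp   # multiply by LR = 1/rmp
p_source = posterior_odds / (1 + posterior_odds)
print(f"P(suspect is the source | match) = {p_source:.2f}")  # about 0.50
```

The frequency line makes an exemplar ('one other person would match') easy to generate, whereas the probability line does not, which is the format effect the exemplar-cueing account predicts.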
Colistin is the drug of choice for treatment of carbapenem-resistant Acinetobacter baumannii infections, but increasing colistin resistance (Col-R) has emerged across the globe. In this study, we collected 187 A. baumannii isolates from specimens of 240 patients admitted to intensive care units (ICUs) of two hospitals in Kerman, Iran during 2017-2018. Among the isolates, four isogenic extensively drug-resistant (XDR) strains with minimum inhibitory concentration (MIC) ≥4 µg/mL against colistin were selected for further study. All the Col-R isolates harbored the intrinsic blaOXA-51 and the blaOXA-23 carbapenemase genes. They were resistant to all antibiotic classes except tigecycline and ampicillin-sulbactam. The Col-R isolates belonged to clonal complex 2 and a new sequence type, ST1752, and had identical high-quality RAPD-PCR fingerprints. Phylogenetic tree analysis of PmrA/B suggested that the Col-R A. baumannii emerged by endogenous mutations rather than acquisition of a preexisting clone. In addition, DNA sequencing of the Col-R genes showed three different nonsynonymous substitutions in LpxA (N136→K), LpxC (P293→Q) and the PmrB transmembrane motif (V21→F and S28→R) of strains 1 and 3. Interestingly, these strains showed high-level MICs against colistin (MIC 32 µg/mL). Analysis of gene expression by relative quantitative real-time PCR (qRT-PCR) revealed 8- and 7.1-fold increases in the transcription levels of the pmrB and pmrC genes in strain 1 when cells were grown in the presence of 16 µg/mL colistin (p≤0.01). In conclusion, these results provide valuable insights into the mechanism of Col-R in A. baumannii and the expression of the relevant genes.
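The abstract does not say how the 8- and 7.1-fold increases were computed, but relative qRT-PCR expression is conventionally derived by the 2^(-ΔΔCt) method; the sketch below reconstructs an 8-fold result from invented Ct values purely for illustration.

```python
# A plausible reconstruction (Ct values invented; the abstract does not
# report them) of a relative-expression calculation by the 2^-ddCt method
# of Livak & Schmittgen, the conventional way qRT-PCR fold changes such
# as "8-fold" are derived.

def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Relative expression by the 2^-ddCt method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# pmrB in cells grown with 16 ug/mL colistin vs. without (values assumed)
print(f"fold change = {fold_change(22.0, 18.0, 25.0, 18.0):.1f}")  # 8.0
```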
Logical anti-exceptionalism is the view that logic is not special among the sciences. In particular, anti-exceptionalists claim that logical theory choice is effected on the same bases as any other theory choice procedure, i.e., by abduction: by weighing the pros and cons of rival views and judging which theory scores best on a given set of parameters. In this paper, we first present the anti-exceptionalists' favourite method for logical theory choice. After highlighting important features of the method, we discuss how they lead to trouble when the subject matter of choice is logic itself. The major difficulty we find concerns the role of the logic employed to evaluate theory choice, or, more specifically, the role of the metalanguage employed to run the abductive method. When rival logical theories are being evaluated and compared, we argue, it is difficult not to beg some important questions; the metalanguage introduces biases that are difficult to avoid. These difficulties seem to be inherent to the method described. We suggest that they put constraints on the scope of application of the method of abductive theory choice in logic and on the kinds of disputes the anti-exceptionalist may plausibly expect to solve with it. We end the paper with some suggestions for how the anti-exceptionalist may address these issues.
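The weighting procedure the anti-exceptionalist favours can be pictured as a simple score sheet (criteria, weights, and scores below are all invented for illustration); the paper's worry is that the metalanguage in which such scores are assigned can itself bias the outcome.

```python
# An illustrative score sheet (criteria, weights, and scores all invented)
# for abductive logical theory choice: rate rival logics on shared
# theoretical virtues, weight the virtues, and pick the best total.

weights = {"simplicity": 0.3, "strength": 0.3, "fit_with_evidence": 0.4}

scores = {
    "classical logic":      {"simplicity": 0.9, "strength": 0.9, "fit_with_evidence": 0.6},
    "paraconsistent logic": {"simplicity": 0.6, "strength": 0.7, "fit_with_evidence": 0.8},
}

for theory, virtue_scores in scores.items():
    total = sum(weights[v] * virtue_scores[v] for v in weights)
    print(f"{theory}: {total:.2f}")
```

Nothing in the arithmetic is contentious; the contested step is the assignment of the scores, which must be carried out in some metalanguage with a logic of its own.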