This dissertation examines aspects of the interplay between computing and scientific practice. The appropriate foundational framework for such an endeavour is real computability rather than classical computability theory, because the physical sciences, engineering, and applied mathematics mostly employ functions defined on continuous domains. But, contrary to the case of computation over the natural numbers, there is no universally accepted framework for real computation; rather, there are two incompatible approaches, computable analysis and the BSS model, both claiming to formalise algorithmic computation and to offer foundations for scientific computing. The dissertation consists of three parts. In the first part, we examine what notion of 'algorithmic computation' underlies each approach and how it is respectively formalised. It is argued that the very existence of the two rival frameworks indicates that 'algorithm' is not one unique concept in mathematics, but is used in more than one way. We test this hypothesis for consistency with mathematical practice as well as with key foundational works that aim to define the term. As a result, new connections between certain subfields of mathematics and computer science are drawn, and a distinction between 'algorithms' and 'effective procedures' is proposed. In the second part, we focus on the second goal of the two rival approaches to real computation, namely, to provide foundations for scientific computing. We examine both frameworks in detail, what idealisations they employ, and how they relate to the floating-point arithmetic systems used in real computers. We explore the limitations and advantages of both frameworks, and answer questions about which one is preferable for computational modelling and which one for addressing general computability issues. In the third part, analog computing and its relation to analogue (physical) modelling in science are investigated. Based on some paradigmatic cases of the former, a certain view about the nature of computation is defended, and the indispensable role of representation in it is emphasized and accounted for. We also propose a novel account of the distinction between analog and digital computation and, based on it, compare analog computational modelling to physical modelling. It is concluded that the two practices, despite their apparent similarities, are orthogonal.
This paper argues that the idea of a computer is unique. Calculators and analog computers are not different ideas about computers, and nature does not compute by itself. Computers, once clearly defined in all their terms and mechanisms, rather than enumerated by behavioral examples, can be more than instrumental tools in science, and more than a source of analogies and taxonomies in philosophy. They can help us understand semantic content and its relation to form. This can be achieved because they have the potential to do more than calculators, which are computers that are designed not to learn. Today's computers are not designed to learn; rather, they are designed to support learning; therefore, any theory of content tested by computers that currently exist must be of an empirical, rather than a formal, nature. If they are someday designed to learn, we will see a change in roles, requiring an empirical theory about the Turing architecture's content, using the primitives of learning machines. This way of thinking, which I call the intensional view of computers, avoids the problems of analogies between minds and computers. It focuses on the constitutive properties of computers, for example by showing clearly how they can help us avoid the infinite regress in interpretation, and how we can clarify the terms of the suggested mechanisms to facilitate a useful debate. Within the intensional view, syntax and content in the context of computers become two ends of physically realizing correspondence problems in various domains.
In this article, after presenting the basic idea of causal accounts of implementation and the problems they are supposed to solve, I sketch the model of computation preferred by Chalmers and argue that it is too limited to do full justice to computational theories in cognitive science. I also argue that it does not suffice to replace Chalmers' favorite model with a better abstract model of computation; it is necessary to acknowledge the causal structure of physical computers that is not accommodated by the models used in computability theory. Additionally, an alternative mechanistic proposal is outlined.
Information Theory, Evolution and The Origin of Life: The Origin and Evolution of Life as a Digital Message: How Life Resembles a Computer, Second Edition. Hubert P. Yockey, 2005, Cambridge University Press, Cambridge: 400 pages, index; hardcover, US $60.00; ISBN: 0-521-80293-8. "The reason that there are principles of biology that cannot be derived from the laws of physics and chemistry lies simply in the fact that the genetic information content of the genome for constructing even the simplest organisms is much larger than the information content of these laws." Yockey in his previous book (1992, 335). In this new book, Information Theory, Evolution and The Origin of Life, Hubert Yockey points out that the digital, segregated, and linear character of the genetic information system has a fundamental significance. If inheritance blended rather than segregated, Darwinian evolution would not occur. If inheritance were analog instead of digital, evolution would likewise be impossible, because it would be impossible to remove the effect of noise. In this way, life is guided by information, and so information is a central concept in molecular biology. The author presents a picture of how the main concepts of the genetic code were developed. He shows that, despite Francis Crick's belief that the Central Dogma is only a hypothesis, the Central Dogma is a mathematical consequence of the redundant nature of the genetic code. The redundancy arises from the fact that the DNA and mRNA alphabet is formed by triplets of 4 nucleotides, so the number of letters (triplets) is 64, whereas the proteome alphabet has only 20 letters (20 amino acids); the translation from the larger alphabet to the smaller one is therefore necessarily redundant. Except for Tryptophan and Methionine, all amino acids are coded by more than one triplet; therefore, it is undecidable which source code letter was actually sent from mRNA. This proof has a corollary stating that there are no such mathematical constraints for protein-protein communication. With this clarification, Yockey contributes to diminishing the widespread confusion surrounding such a central concept as the Central Dogma. Thus the Central Dogma prohibits the origin of life "proteins first." Proteins cannot be generated by "self-organization." Understanding this property of the Central Dogma will have a serious impact on research on the origin of life.
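A minimal sketch of the counting point behind this redundancy argument (my own illustration, not taken from the book), using a small, well-known fragment of the standard genetic code: with a 4-letter nucleotide alphabet, codons of length 3 give 4^3 = 64 source symbols mapping onto only 20 amino acids, so inverting the translation is generally ambiguous.

```python
# Toy illustration of why codon-to-amino-acid translation is many-to-one.
# Only a small fragment of the standard genetic code is listed here.

NUCLEOTIDES = "ACGU"
print(len(NUCLEOTIDES) ** 3)   # 64 possible codons, versus only 20 amino acids

CODON_TABLE_FRAGMENT = {
    "UUU": "Phe", "UUC": "Phe",   # phenylalanine: more than one codon
    "UUA": "Leu", "UUG": "Leu",   # leucine: more than one codon
    "AUG": "Met",                 # methionine: unique codon
    "UGG": "Trp",                 # tryptophan: unique codon
}

# Invert the map: each amino acid gathers every codon that encodes it.
inverse = {}
for codon, amino_acid in CODON_TABLE_FRAGMENT.items():
    inverse.setdefault(amino_acid, []).append(codon)

for amino_acid, codons in inverse.items():
    status = "recoverable" if len(codons) == 1 else "not recoverable"
    print(f"{amino_acid}: {codons} -> source codon {status}")
```

Given "Leu", the receiver cannot decide which codon was sent; only for the single-codon amino acids (here Met and Trp) is the source letter recoverable, which is the sense in which the inverse translation is undecidable.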
Since the early eighties, computationalism in the study of the mind has been "under attack" by several critics of the so-called "classic" or "symbolic" approaches in AI and cognitive science. Computationalism was generically identified with such approaches. For example, it was identified with both Allen Newell and Herbert Simon's Physical Symbol System Hypothesis and Jerry Fodor's theory of the Language of Thought, usually without taking into account the fact that such approaches are very different as to their methods and aims. Zenon Pylyshyn, in his influential book Computation and Cognition, claimed that both Newell and Fodor deeply influenced his ideas on cognition as computation. This probably added to the confusion, as many people still consider Pylyshyn's book paradigmatic of the computational approach in the study of the mind. Since then, cognitive scientists, AI researchers and also philosophers of mind have been asked to take sides on different "paradigms" that have from time to time been proposed as opponents of (classic or symbolic) computationalism. Examples of such oppositions are: computationalism vs. connectionism, computationalism vs. dynamical systems, computationalism vs. situated and embodied cognition, computationalism vs. behavioural and evolutionary robotics. Our preliminary claim in section 1 is that computationalism should not be identified with what we would call the "paradigm (based on the metaphor) of the computer" (in the following, PoC). PoC is the (rather vague) statement that the mind functions "as a digital computer". Actually, PoC is a restrictive version of computationalism, and nobody has ever seriously upheld it, except in some rough versions of the computational approach and in some popular discussions about it. Usually, PoC is used as a straw man in many arguments against computationalism. In section 1 we look in some detail at PoC's claims and argue that computationalism cannot be identified with PoC. In section 2 we point out that certain anticomputationalist arguments are based on this misleading identification. In section 3 we suggest that the view of the levels of explanation proposed by David Marr could clarify certain points of the debate on computationalism. In section 4 we touch on a controversial issue, namely the possibility of developing a notion of analog computation, similar to the notion of digital computation. A short conclusion follows in section 5.
In the brain, the relations between free neurons and conditioned ones establish the constraints for informational neural processes. These constraints reflect the system-environment state, i.e. the dynamics of homeocognitive activities. The constraints allow us to define a cost function in the phase space of free neurons so as to trace the trajectories of the possible configurations at minimal cost while respecting the constraints imposed. Since the space of the free states is a manifold, or a non-orthogonal space, the minimum distance is not a straight line but a geodesic. The minimum condition is expressed by a set of ordinary differential equations (ODEs) that in general are not linear. In the brain there is no algorithm or physical field that regulates the computation, so we must consider an emergent process coming out of the neural collective behavior triggered by synaptic variability. We define neural computation as the study of the classes of trajectories on a manifold geometry defined under suitable constraints. The cost function supervises pseudo-equilibrium thermodynamic effects that manage the computational activities from beginning to end and realizes an optimal control through constraints and geodesics. The task of this work is to establish a connection between the geometry of neural computation and cost functions. To illustrate the essential mathematical aspects we will use as a toy model a Network of Resistors with Adaptive Memory (memristors). The information geometry defined here is an analog computation; therefore it does not suffer the limits of Turing computation and seems to respond to the demand for greater biological plausibility. The model of brain optimal control proposed here can be a good foundation for implementing the concept of "intentionality", according to the suggestion of W. Freeman. Indeed, geodesics in the space of brain states can produce suitable behavior to realize wanted functions and invariants as neural expressions of cognitive intentions.
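The claim that minimal-cost paths on a curved state space are geodesics obtained by integrating ODEs can be pictured with a standard textbook toy, sketched below on the unit sphere with a deliberately crude Euler scheme. This is my own illustration of the general idea, not the paper's memristor model.

```python
import math

# Geodesic equations on the unit sphere in spherical coordinates (theta, phi),
# from the metric ds^2 = d(theta)^2 + sin(theta)^2 d(phi)^2:
#   theta'' =  sin(theta) * cos(theta) * phi'^2
#   phi''   = -2 * (cos(theta) / sin(theta)) * theta' * phi'
# Integrated with a plain Euler scheme purely for illustration.

def integrate_geodesic(theta, phi, dtheta, dphi, dt=1e-3, steps=2000):
    for _ in range(steps):
        ddtheta = math.sin(theta) * math.cos(theta) * dphi ** 2
        ddphi = -2.0 * (math.cos(theta) / math.sin(theta)) * dtheta * dphi
        theta += dt * dtheta
        phi += dt * dphi
        dtheta += dt * ddtheta
        dphi += dt * ddphi
    return theta, phi

# Start on the equator heading "north-east": the resulting path bends in
# (theta, phi) coordinates, i.e. the shortest route is not a straight line.
print(integrate_geodesic(math.pi / 2, 0.0, 0.5, 0.5))
```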
This article argues that if panpsychism is true, then there are grounds for thinking that digitally based artificial intelligence may be incapable of having coherent macrophenomenal conscious experiences. Section 1 briefly surveys research indicating that neural function and phenomenal consciousness may both be analog in nature. We show that physical and phenomenal magnitudes—such as rates of neural firing and the phenomenally experienced loudness of sounds—appear to covary monotonically with the physical stimuli they represent, forming the basis for an analog relationship among the three. Section 2 then argues that if this is true and micropsychism—the panpsychist view that phenomenal consciousness or its precursors exist at a microphysical level of reality—is also true, then human brains must somehow manipulate fundamental microphysical-phenomenal magnitudes in an analog manner that renders them phenomenally coherent at a macro level. However, Section 3 argues that because digital computation abstracts away from microphysical-phenomenal magnitudes—representing cognitive functions non-monotonically in terms of digits—digital computation may be inherently incapable of realizing coherent macroconscious experience. Thus, if panpsychism is true, digital AI may be incapable of achieving phenomenal coherence. Finally, Section 4 briefly examines our argument's implications for Tononi's Integrated Information Theory of consciousness, which we contend may need to be supplanted by a theory of macroconsciousness as analog microphysical-phenomenal information integration.
Cognition is commonly taken to be computational manipulation of representations. These representations are assumed to be digital, but it is not usually specified what that means and what relevance it has for the theory. I propose a specification for being a digital state in a digital system, especially a digital computational system. The specification shows that identification of digital states requires functional directedness, either for someone or for the system of which the state is a part. In the case of digital representations, the state must be a token of a representational type whose function is to represent. [An earlier version of this paper was discussed in the web conference "Interdisciplines": https://web.archive.org/web/20100221125700/http://www.interdisciplines.org/adaptation/papers/7]
Advancements in computing, instrumentation, robotics, digital imaging, and simulation modeling have changed science into a technology-driven institution. Government, industry, and society increasingly exert their influence over science, raising questions of values and objectivity. These and other profound changes have led many to speculate that we are in the midst of an epochal break in scientific history. This edited volume presents an in-depth examination of these issues from philosophical, historical, social, and cultural perspectives. It offers arguments both for and against the epochal break thesis in light of historical antecedents. Contributors discuss topics such as: science as a continuing epistemological enterprise; the decline of the individual scientist and the rise of communities; the intertwining of scientific and technological needs; links to prior practices and ways of thinking; the alleged divide between mode-1 and mode-2 research methods; the commodification of university science; and the shift from the scientific to a technological enterprise. Additionally, they examine the epochal break thesis using specific examples, including the transition from laboratory to real-world experiments; the increased reliance on computer imaging; how analog and digital technologies condition behaviors that shape the object and beholder; the cultural significance of humanoid robots; the erosion of scientific quality in experimentation; and the effect of computers on prediction at the expense of explanation. Whether these events represent a historic break in scientific theory, practice, and methodology is disputed. What they do offer is an important occasion for philosophical analysis of the epistemic, institutional and moral questions affecting current and future scientific pursuits.
Enhancement technologies may someday grant us capacities far beyond what we now consider humanly possible. Nick Bostrom and Anders Sandberg suggest that we might survive the deaths of our physical bodies by living as computer emulations. In 2008, they issued a report, or "roadmap," from a conference where experts in all relevant fields collaborated to determine the path to "whole brain emulation." Advancing this technology could also aid philosophical research. Their roadmap defends certain philosophical assumptions required for this technology's success, so by determining the reasons why it succeeds or fails, we can obtain empirical data for philosophical debates regarding our mind and selfhood. The scope ranges widely, so I merely survey some possibilities: I argue that this technology could help us determine (1) whether the mind is an emergent phenomenon, (2) whether analog technology is necessary for brain emulation, and (3) whether neural randomness is so wild that a complete emulation is impossible.
There is much discussion about whether the human mind is a computer, whether the human brain could be emulated on a computer, and whether all physical entities are computers (pancomputationalism). These discussions, and others, require criteria for what is digital. I propose that a state is digital if and only if it is a token of a type that serves a particular function - typically a representational function - for the system. This proposal is made on a syntactic level, assuming three levels of description (physical, syntactic, semantic). It suggests that being digital is a matter of discovery, or rather a matter of how we wish to describe the world, if a functional description can be assumed. Given the criterion provided and the necessary empirical research, we should be in a position to decide for a given system (e.g. the human brain) whether it is a digital system and can thus be reproduced in a different digital system (since digital systems allow multiple realization).
Digital practices in design, together with computer-assisted manufacturing (CAM), have inspired the reflection of philosophers, theorists, and historians over the last decades. Gilles Deleuze's The Fold: Leibniz and the Baroque (1988) presents one of the first and most successful concepts created to think about these new design and manufacturing practices. Deleuze proposed a new concept of the technological object, inspired by Bernard Cache's digital design practices and computer-assisted manufacturing. Deleuze compared Cache's practices to Leibniz's differential-calculus-based notion of the parametric curve. From this perspective, the object is no longer an essential form: rather, it is functional, defined by a family of parametric curves. In Deleuze's terms, it is an objectile. This definition grasps a particular aspect of modernity in the technological object, rendering possible the industrial production of "the unique object" (la pièce unique), while making obsolete the homogeneity that comes with industrial standardization. What follows introduces developmental biology into this design matrix, interrogating the relationship between parametric design and computer-assisted manufacturing on the one hand and biological morphogenetic processes on the other. Is one necessary for the other? Does an architect need computation in order to render morphogenetic shapes? In answering these questions, this chapter displaces the centrality of digital technology in the design of shape-shifting forms within architecture and calls for a rethinking of design by way of analog prototypes. I argue that "thinking through analogs" offers another means of generating parametrically based morphogenetic forms.
Advocates of dynamic systems have suggested that higher mental processes are based on continuous representations. In order to evaluate this claim, we first define the concept of representation, and rigorously distinguish between discrete representations and continuous representations. We also explore two important bases of representational content. Then, we present seven arguments that discrete representations are necessary for any system that must discriminate between two or more states. It follows that higher mental processes require discrete representations. We also argue that discrete representations are more influenced by conceptual role than continuous representations. We end by arguing that the presence of discrete representations in cognitive systems entails that computationalism (i.e., the view that the mind is a computational device) is true, and that cognitive science should embrace representational pluralism.
There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning theory, has introduced measures of capacity that control (in part) the expected risk of classifiers [3]. These capacities quantify the expectations regarding future data that learning algorithms embed into classifiers. Solomonoff and Hutter have applied algorithmic information to prove remarkable results on universal induction. Shannon information provides the mathematical foundation for communication and coding theory. However, both approaches have shortcomings. Algorithmic information is not computable, severely limiting its practical usefulness. Shannon information refers to ensembles rather than actual events: it makes no sense to compute the Shannon information of a single string – or rather, there are many answers to this question depending on how a related ensemble is constructed. Although there are asymptotic results linking algorithmic and Shannon information, it is unsatisfying that there is such a large gap – a difference in kind – between the two measures. This note describes a new method of quantifying information, effective information, that links algorithmic information to Shannon information, and also links both to capacities arising in statistical learning theory [4, 5]. After introducing the measure, we show that it provides a non-universal analog of Kolmogorov complexity. We then apply it to derive basic capacities in statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. A nice byproduct of our approach is an interpretation of the explanatory power of a learning algorithm in terms of the number of hypotheses it falsifies [6], counted in two different ways for the two capacities. We also discuss how effective information relates to information gain, Shannon and mutual information.
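The contrast the note draws between the first two measures can be made concrete with a small sketch of my own (not the paper's effective-information measure): Shannon surprisal is defined only relative to an ensemble, while Kolmogorov complexity assigns a length to a single string but is uncomputable, so in practice one falls back on crude stand-ins such as compressed length.

```python
import math
import os
import zlib

def surprisal(p):
    """Shannon surprisal of an event drawn from a known ensemble: -log2 p(event)."""
    return -math.log2(p)

print(surprisal(0.5), surprisal(1 / 256))   # 1.0 bit versus 8.0 bits

# Kolmogorov complexity is uncomputable; compressed length is only a crude,
# compressor-dependent stand-in, but it does assign a number to a single
# string, which Shannon information (defined over ensembles) cannot do.
regular = b"ab" * 500            # highly patterned string
irregular = os.urandom(1000)     # incompressible with overwhelming probability
print(len(zlib.compress(regular)), len(zlib.compress(irregular)))
```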
This paper develops a theory of analog representation. We first argue that the mark of the analog is to be found in the nature of a representational system's interpretation function, rather than in its vehicles or contents alone. We then develop the rulebound structure theory of analog representation, according to which analog systems are those that use interpretive rules to map syntactic structural features onto semantic structural features. The theory involves three degree-theoretic measures that capture three independent ways in which a system can be more or less analog. We explain how our theory improves upon prior accounts of analog representation, provides plausible diagnoses for novel challenge cases, extends to hybrid systems that are partially analog and partially symbolic, and accounts for some of the advantages and disadvantages of representing analogically versus symbolically.
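One way to picture an interpretation rule that maps syntactic structure onto semantic structure is a thermometer-style toy of my own (not the authors' formalism; the numbers are made up): a column whose height is read off by a structure-preserving rule, contrasted with a digit-string reading in which the shape of the vehicle bears no structural relation to its content.

```python
# Toy contrast between an analog-style and a symbol-style interpretation
# function; the scale factors and readings below are illustrative only.

def analog_interpretation(height_mm: float) -> float:
    """Map column height to temperature. Structure-preserving: taller column
    means higher temperature, and equal height differences correspond to
    equal temperature differences."""
    return -10.0 + 0.5 * height_mm

def symbolic_interpretation(digits: str) -> float:
    """Read a digit string. The physical shape of the numerals carries no
    structural relation to the temperature they denote."""
    return float(digits)

heights = [20.0, 40.0, 60.0]
temps = [analog_interpretation(h) for h in heights]
# Relations among vehicles (heights) mirror relations among contents (temps).
print(temps, temps[1] - temps[0] == temps[2] - temps[1])
print(symbolic_interpretation("21.5"))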
We analyze the properties of optimum portfolios, the price of which is considered a new quantum variable, and derive a quantum analog of the Black-Scholes formula for the price of financial variables under the assumption that the market dynamics can be considered as its continuous weak measurement under a no-arbitrage condition.
Intelligent Tutoring Systems (ITS) have a wide influence on the exchange rate, education, health, training, and educational programs. In this paper we describe an intelligent tutoring system that helps students study computer networks. The ITS provides intelligent presentation of educational content appropriate for each student, taking into account factors such as the degree of knowledge, the desired level of detail, assessment, student level, and familiarity with the subject. Our intelligent tutoring system was developed using the ITSB authoring tool for building ITSs. A preliminary evaluation of the ITS was carried out by a group of students and teachers, and the results were acceptable.
The Computational Theory of Mind (CTM) holds that cognitive processes are essentially computational, and hence computation provides the scientific key to explaining mentality. The Representational Theory of Mind (RTM) holds that representational content is the key feature in distinguishing mental from non-mental systems. I argue that there is a deep incompatibility between these two theoretical frameworks, and that the acceptance of CTM provides strong grounds for rejecting RTM. The focal point of the incompatibility is the fact that representational content is extrinsic to formal procedures as such, and the intended interpretation of syntax makes no difference to the execution of an algorithm. So the unique 'content' postulated by RTM is superfluous to the formal procedures of CTM. And once these procedures are implemented in a physical mechanism, it is exclusively the causal properties of the physical mechanism that are responsible for all aspects of the system's behaviour. So once again, postulated content is rendered superfluous. To the extent that semantic content may appear to play a role in behaviour, it must be syntactically encoded within the system, and just as in a standard computational artefact, so too with the human mind/brain - it's pure syntax all the way down to the level of physical implementation. Hence 'content' is at most a convenient meta-level gloss, projected from the outside by human theorists, which itself can play no role in cognitive processing.
The development of technology is unbelievably rapid. From limited local networks to high-speed Internet, from crude computing machines to powerful semiconductors, the world has changed drastically compared to just a few decades ago. In the constantly renewing process of adapting to such an unnaturally high-entropy setting, innovations, as well as entirely new concepts, are often born. In the business world, one such phenomenon was the creation of a new type of entrepreneurship. This paper proposes a new academic discipline of computational entrepreneurship, which centers on: (i) exponentially growing (and less expensive) computing power, to the extent that almost everybody in a modern society can own and use it; (ii) omnipresent high-speed Internet connectivity, wired or wireless, representing our modern day's economic connectomics; (iii) growing concern with exploiting "serendipity" for a strategic commercial advantage; and (iv) growing capabilities of lay people to perform the calculations needed for informed decisions when seizing fast-moving entrepreneurial opportunities. Computational entrepreneurship has slowly become a new mode of operation for business ventures and will likely bring the academic discipline of entrepreneurship back to mainstream economics.
This book addresses key conceptual issues relating to the modern scientific and engineering use of computer simulations. It analyses a broad set of questions, from the nature of computer simulations to their epistemological power, including the many scientific, social and ethical implications of using computer simulations. The book is written in an easily accessible narrative, one that weaves together philosophical questions and scientific technicalities. It will thus appeal equally to academic scientists, engineers, and researchers in industry interested in questions related to the general practice of computer simulations.
This chapter presents a re-understanding of the contents of our analog magnitude representations (e.g., approximate duration, distance, number). The approximate number system (ANS) is considered, which supports numerical representations that are widely described as fuzzy, noisy, and limited in their representational power. The contention is made that these characterizations are largely based on misunderstandings—that what has been called "noise" and "fuzziness" is actually an important epistemic signal of confidence in one's estimate of the value. Rather than the ANS having noisy or fuzzy numerical content, it is suggested that the ANS has exquisitely precise numerical content that is subject to epistemic limitations. Similar considerations will arise for other analog representations. The chapter discusses how this new understanding of ANS representations recasts the learnability problem for number and the conceptual changes that children must accomplish in the number domain.
Since the sixties, computational modeling has become increasingly important in both the physical and the social sciences, particularly in physics, theoretical biology, sociology, and economics. Since the eighties, philosophers too have begun to apply computational modeling to questions in logic, epistemology, philosophy of science, philosophy of mind, philosophy of language, philosophy of biology, ethics, and social and political philosophy. This chapter analyzes a selection of interesting examples in some of those areas.
Computer-Assisted Argument Mapping (CAAM) is a new way of understanding arguments. While still embryonic in its development and application, CAAM is being used increasingly as a training and development tool in the professions and government. Inroads are also being made in its application within education. CAAM claims to be helpful in an educational context, as a tool for students in responding to assessment tasks. However, to date there is little evidence from students that this is the case. This paper outlines the use of CAAM as an educational tool within an Economics and Commerce Faculty in a major Australian research university. Evaluation results are provided from students from a CAAM pilot within an upper-level Economics subject. Results indicate promising support for the use of CAAM and its potential for transferability within the disciplines. If shown to be valuable with further studies, CAAM could be included in capstone subjects, allowing computer technology to be utilised in the service of generic skill development.
This paper explores how the Leviathan that projects power through nuclear arms exercises a unique nuclearized sovereignty. In the case of nuclear superpowers, this sovereignty extends to wielding the power to destroy human civilization as we know it across the globe. Nuclearized sovereignty depends on a hybrid form of power encompassing human decision-makers in a hierarchical chain of command, and all of the technical and computerized functions necessary to maintain command and control at every moment of the sovereign's existence: this sovereign power cannot sleep. This article analyzes how the form of rationality that informs this hybrid exercise of power historically developed to be computable. By definition, computable rationality must be able to function without any intelligible grasp of the context or the comprehensive significance of decision-making outcomes. Thus, maintaining nuclearized sovereignty requires the capacity to execute momentous life-and-death decisions without the type of sentience we usually associate with ethical individual and collective decisions.
In most accounts of the realization of computational processes by physical mechanisms, it is presupposed that there is a one-to-one correspondence between the causally active states of the physical process and the states of the computation. Yet such proposals either stipulate that only one model of computation is implemented, or they do not reflect upon the variety of models that could be implemented physically. In this paper, I claim that mechanistic accounts of computation should allow for a broad variation of models of computation. In particular, some non-standard models should not be excluded a priori. The relationship between mathematical models of computation and mechanistically adequate models is studied in more detail.
The paper analyses six ethical challenges posed by cloud computing, concerning ownership, safety, fairness, responsibility, accountability and privacy. The first part defines cloud computing on the basis of a resource-oriented approach, and outlines the main features that characterise such technology. Following these clarifications, the second part argues that cloud computing reshapes some classic problems often debated in information and computer ethics. To begin with, cloud computing makes possible a complete decoupling of ownership, possession and use of data, and this helps to explain the problems occurring when different providers of cloud computing retain or relinquish the right to use or own users' data. The problem of safety in cloud computing is coupled to that of reliability, insofar as users have to trust providers to preserve their data, applications and content in a reliable manner. It is argued that, in this context, data insurance could play an important role. Regarding fairness, the paper argues that cloud computing is already reshaping the nature of the Digital. Responsibility, accountability and privacy close the ethical analysis of cloud computing. In this case, the thesis is that the necessity to account for the actions of cloud computing users imposes delicate trade-offs between users' privacy and the traceability of their operations.
The purpose of this paper is to argue against the claim that morphological computation is substantially different from other kinds of physical computation. I show that some (but not all) purported cases of morphological computation do not count as specifically computational, and that those that do are solely physical computational systems. These latter cases are not, however, specific enough: all computational systems, not only morphological ones, may (and sometimes should) be studied in various ways, including their energy efficiency, cost, reliability, and durability. Second, I critically analyze the notion of "offloading" computation to the morphology of an agent or robot, by showing that, literally, computation is sometimes not offloaded but simply avoided. Third, I point out that while the morphology of any agent is indicative of the environment that it is adapted to, or informative about that environment, it does not follow that every agent has access to its morphology as the model of its environment.
Computers can mimic human intelligence, sometimes quite impressively. This has led some to claim that (a) computers can actually acquire intelligence, and/or (b) the human mind may be thought of as a very sophisticated computer. In this paper I argue that neither of these inferences is sound. The human mind and computers, I argue, operate on radically different principles.
Traditionally, computational theory (CT) and dynamical systems theory (DST) have presented themselves as opposed and incompatible paradigms in cognitive science. There have been some efforts to reconcile these paradigms, mainly by assimilating DST to CT at the expense of its anti-representationalist commitments. In this paper, building on Piccinini's mechanistic account of computation and the notion of functional closure, we explore an alternative conciliatory strategy. We try to assimilate CT to DST by dropping its representationalist commitments, and by inviting CT to recognize the functionally closed nature of some computational systems.
Argument mapping is a way of diagramming the logical structure of an argument to explicitly and concisely represent reasoning. The use of argument mapping in critical thinking instruction has increased dramatically in recent decades. This paper overviews the innovation and provides a procedural approach for new teachers wanting to use argument mapping in the classroom. A brief history of argument mapping is provided at the end of this paper.
European Computing and Philosophy conference, 2–4 July, Barcelona. The Seventh ECAP (European Computing and Philosophy) conference was organized by Jordi Vallverdu at the Autonomous University of Barcelona. The conference started with the IACAP (The International Association for Computing and Philosophy) presidential address by Luciano Floridi, focusing on mechanisms of knowledge production in informational networks. The first keynote, delivered by Klaus Mainzer, set a frame for the rest of the conference by elucidating the fundamental role of the complexity of informational structures, which can be analyzed on different levels of organization, leaving room for a variety of possible approaches that converge in this cross-disciplinary and multi-disciplinary research field. Keynotes by Kevin Warwick on the re-embodiment of rats' neurons into robots, Raymond Turner on syntax and semantics in programming languages, Roderic Guigo on biocomputing sciences and Francesco Subirada on the past and future of supercomputing presented different topics of philosophical as well as practical aspects of computing. Conference tracks included: Philosophy of Information (Patrick Allo), Philosophy of Computer Science (Raymond Turner), Computer and Information Ethics (Johnny Søraker and Alison Adam), Computational Approaches to the Mind (Ruth Hagengruber), IT and Cultural Diversity (Jutta Weber and Charles Ess), Crossroads (David Casacuberta), Robotics, AI & Ambient Intelligence (Thomas Roth-Berghofer), Biocomputing, Evolutionary and Complex Systems (Gordana Dodig Crnkovic and Søren Brier), E-learning, E-science and Computer-Supported Cooperative Work (Annamaria Carusi) and Technological Singularity and Acceleration Studies (Amnon Eden).
Context: At present, we lack a common understanding of both the process of cognition in living organisms and the construction of knowledge in embodied, embedded cognizing agents in general, including future artifactual cognitive agents under development, such as cognitive robots and softbots. Purpose: This paper aims to show how the info-computational approach (IC) can reinforce constructivist ideas about the nature of cognition and knowledge and, conversely, how constructivist insights (such as that the process of cognition is the process of life) can inspire new models of computing. Method: The info-computational constructive framework is presented for the modeling of cognitive processes in cognizing agents. Parallels are drawn with other constructivist approaches to cognition and knowledge generation. We describe how cognition as a process of life itself functions based on info-computation and how the process of knowledge generation proceeds through interactions with the environment and among agents. Results: Cognition and knowledge generation in a cognizing agent is understood as interaction with the world (potential information), which by processes of natural computation becomes actual information. That actual information, after integration, becomes knowledge for the agent. Heinz von Foerster is identified as a precursor of natural computing, in particular biocomputing. Implications: IC provides a framework for the unified study of cognition in living organisms (from the simplest ones, such as bacteria, to the most complex ones) as well as in artifactual cognitive systems. Constructivist content: It supports the constructivist view that knowledge is actively constructed by cognizing agents and shared in a process of social cognition. IC argues that this process can be modeled as info-computation.
This work addresses a broad range of questions belonging to four fields: computation theory, general philosophy of science, philosophy of cognitive science, and philosophy of mind. Dynamical systems theory provides the framework for a unified treatment of these questions. The main goal of this dissertation is to propose a new view of the aims and methods of cognitive science: the dynamical approach. According to this view, the object of cognitive science is a particular set of dynamical systems, which I call "cognitive systems". The goal of a cognitive study is to specify a dynamical model of a cognitive system, and then use this model to produce a detailed account of the specific cognitive abilities of that system. The dynamical approach does not limit a priori the form of the dynamical models which cognitive science may consider. In particular, this approach is compatible with both computational and connectionist modeling, for both computational systems and connectionist networks are special types of dynamical systems. To substantiate these methodological claims about cognitive science, I deal first with two questions in two different fields: What is a computational system? What is a dynamical explanation of a deterministic process? Intuitively, a computational system is a deterministic system which evolves in discrete time steps, and which can be described in an effective way. In chapter 1, I give a formal definition of this concept which employs the notions of isomorphism between dynamical systems and of Turing computable function. In chapter 2, I propose a more comprehensive analysis based on a natural generalization of the concept of Turing machine. The goal of chapter 3 is to develop a theory of the dynamical explanation of a deterministic process. By a "dynamical explanation" I mean the specification of a dynamical model of the system or process which we want to explain. I start from the analysis of a specific type of explanandum, dynamical phenomena, and I then use this analysis to shed light on the general form of a dynamical explanation. Finally, I analyze the structure of those theories which generate explanations of this form, namely dynamical theories.
The relationship between abstract formal procedures and the activities of actual physical systems has proved to be surprisingly subtle and controversial, and there are a number of competing accounts of when a physical system can properly be said to implement a mathematical formalism and hence perform a computation. I defend an account wherein computational descriptions of physical systems are high-level normative interpretations motivated by our pragmatic concerns. Furthermore, the criteria of utility and success vary according to our diverse purposes and pragmatic goals. Hence there is no independent or uniform fact of the matter, and I advance the 'anti-realist' conclusion that computational descriptions of physical systems are not founded upon deep ontological distinctions, but rather upon interest-relative human conventions. Hence physical computation is a 'conventional' rather than a 'natural' kind.
This chapter draws an analogy between computing mechanisms and autopoietic systems, focusing on the non-representational status of both kinds of system (computational and autopoietic). It will be argued that the role played by input and output components in a computing mechanism closely resembles the relationship between an autopoietic system and its environment, and in this sense differs from the classical understanding of inputs and outputs. The analogy helps to make sense of why we should think of computing mechanisms as non-representational, and might also facilitate reconciliation between computational and autopoietic/enactive approaches to the study of cognition.
This paper connects information with computation and cognition via the concept of agents, which appear at a variety of levels of organization of physical/chemical/cognitive systems – from elementary particles to atoms, molecules, life-like chemical systems, and cognitive systems starting with living cells, up to organisms and ecologies. In order to obtain this generalized framework, the concepts of information, computation and cognition are generalized. In this framework, nature can be seen as an informational structure with computational dynamics, where an (info-computational) agent is needed for the potential information of the world to actualize. Starting from the definition of information as the difference in one physical system that makes a difference in another physical system – which combines Bateson's and Hewitt's definitions – the argument is advanced for natural computation as a computational model of the dynamics of the physical world, where information processing is constantly going on at a variety of levels of organization. This setting helps us to elucidate the relationships between computation, information, agency and cognition within a common conceptual framework, with special relevance for biology and robotics.
Any computer can create a model of reality. The hypothesis that a quantum computer can generate such a model, designated as quantum, which coincides with the modeled reality, is discussed. Its grounds are the theorems about the absence of "hidden variables" in quantum mechanics. Quantum modeling requires the axiom of choice. The following conclusions are deduced from the hypothesis. A quantum model, unlike a classical model, can coincide with reality. Reality can be interpreted as a quantum computer. Physical processes represent computations of the quantum computer. Quantum information is the real fundament of the world. The conception of the quantum computer unifies physics and mathematics and thus the material and the ideal world. The quantum computer is a non-Turing machine in principle. Any quantum computation can be interpreted as an infinite classical computational process of a Turing machine. The quantum computer introduces the notion of an "actually infinite computational process". The discussed hypothesis is consistent with all of quantum mechanics. The conclusions address a form of neo-Pythagoreanism: unifying the mathematical and the physical, the quantum computer is situated in an intermediate domain of their mutual transformation.
In this paper a possible general framework for the representation of concepts in cognitive artificial systems and cognitive architectures is proposed. The framework is inspired by the so-called proxytype theory of concepts and combines it with the heterogeneity approach to concept representation, according to which concepts do not constitute a unitary phenomenon. The contribution of the paper is twofold: on the one hand, it aims to provide a novel theoretical hypothesis for the debate about concepts in the cognitive sciences by drawing unexplored connections between different theories; on the other hand, it aims to sketch a computational characterization of the problem of concept representation in cognitively inspired artificial systems and in cognitive architectures.
Critical in the computationalist account of the mind is the phenomenon called computational or computer simulation of human thinking, which is used to establish the theses that human thinking is a computational process and that computing machines are thinking systems. Accordingly, if human thinking can be simulated computationally then human thinking is a computational process; and if human thinking is a computational process then its computational simulation is itself a thinking process. This paper shows that the said phenomenon—the computational simulation of human thinking—is ill-conceived, and that, as a consequence, the theses that it intends to establish are problematic. It is argued that what is simulated computationally is not human thinking as such but merely its behavioral manifestations; and that a computational simulation of these behavioral manifestations does not necessarily establish that human thinking is computational, as it is logically possible for a non-computational system to exhibit behaviors that lend themselves to a computational simulation.
We present a simple example that disproves the universality principle. Unlike previous counter-examples to computational universality, it does not rely on extraneous phenomena, such as the availability of input variables that are time varying, computational complexity that changes with time or order of execution, physical variables that interact with each other, uncertain deadlines, or mathematical conditions among the variables that must be obeyed throughout the computation. In the most basic case of the new example, all that is used is a single pre-existing global variable whose value is modified by the computation itself. In addition, our example offers a new dimension for separating the computable from the uncomputable, while illustrating the power of parallelism in computation.
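The general shape of such a scenario can be pictured with a toy of my own devising (not the authors' construction): a task over a shared global variable that every individual operation perturbs, so that step-by-step access cannot reproduce what simultaneous access can.

```python
# Toy sketch, not the paper's example: record the current value of a shared
# global variable into n cells, where every write also perturbs the variable.

def sequential_record(n, g0):
    """One processor writing one cell at a time; each write changes g."""
    g = g0
    cells = []
    for _ in range(n):
        cells.append(g)   # read the current value ...
        g += 1            # ... but the act of recording perturbs it
    return cells

def simultaneous_record(n, g0):
    """An idealized n-processor machine writes all cells in one step,
    before any perturbation takes effect."""
    return [g0] * n

print(sequential_record(4, 10))    # [10, 11, 12, 13] -- cannot all equal 10
print(simultaneous_record(4, 10))  # [10, 10, 10, 10]
```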
Is the mathematical function being computed by a given physical system determined by the system's dynamics? This question is at the heart of the indeterminacy of computation phenomenon (Fresco et al. [unpublished]). A paradigmatic example is a conventional electrical AND-gate that is often said to compute conjunction, but it can just as well be used to compute disjunction. Despite the pervasiveness of this phenomenon in physical computational systems, it has been discussed in the philosophical literature only indirectly, mostly with reference to the debate over realism about physical computation and computationalism. A welcome exception is Dewhurst's ([2018]) recent analysis of computational individuation under the mechanistic framework. He rejects the idea of appealing to semantic properties for determining the computational identity of a physical system. But Dewhurst seems to be too quick to pay the price of giving up the notion of computational equivalence. We aim to show that the mechanist need not pay this price. The mechanistic framework can, in principle, preserve the idea of computational equivalence even between two different enough kinds of physical systems, say, electrical and hydraulic ones.
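The gate example can be made concrete with a short sketch (the naming and encoding choices below are mine): one and the same voltage-level behavior counts as conjunction under the labeling "high voltage = true" and as disjunction under the inverse labeling.

```python
# One fixed physical input-output profile over voltage levels ...
def gate(v1: str, v2: str) -> str:
    return "high" if v1 == "high" and v2 == "high" else "low"

# ... read under two different mappings from voltage levels to truth values.
high_is_true = {"high": True, "low": False}   # high = True  -> gate computes AND
high_is_false = {"high": False, "low": True}  # high = False -> gate computes OR

for v1 in ("high", "low"):
    for v2 in ("high", "low"):
        out = gate(v1, v2)
        print(v1, v2, "->", out,
              "| high=True reads:", high_is_true[out],
              "| high=False reads:", high_is_false[out])
```

Running this prints the same physical truth table twice over: under the first reading the output is true exactly when both inputs are true (conjunction), under the second it is true exactly when at least one input is true (disjunction).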
To study climate change, scientists employ computer models, which approximate target systems with various levels of skill. Given the imperfection of climate models, how do scientists use simulations to generate knowledge about the causes of observed climate change? Addressing a similar question in the context of biological modelling, Levins (1966) proposed an account grounded in robustness analysis. Recent philosophical discussions dispute the confirmatory power of robustness, raising the question of how the results of computer modelling studies contribute to the body of evidence supporting hypotheses about climate change. Expanding on Staley's (2004) distinction between evidential strength and security, and Lloyd's (2015) argument connecting variety-of-evidence inferences and robustness analysis, I address this question with respect to recent challenges to the epistemology of robustness analysis. Applying this epistemology to case studies of climate change, I argue that, despite imperfections in climate models, and epistemic constraints on variety-of-evidence reasoning and robustness analysis, this framework accounts for the strength and security of evidence supporting climatological inferences, including the finding that global warming is occurring and its primary causes are anthropogenic.
In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results.
The articles in this volume present a selection of works from the Symposium on Natural/Unconventional Computing at the AISB/IACAP (British Society for the Study of Artificial Intelligence and the Simulation of Behaviour and The International Association for Computing and Philosophy) World Congress 2012, held at the University of Birmingham to celebrate the Turing centenary. This book is about nature considered as the totality of physical existence, the universe. By physical we mean all phenomena - objects and processes - that are possible to detect either directly by our senses or via instruments. Historically, there have been many ways of describing the universe (cosmic egg, cosmic tree, theistic universe, mechanistic universe), and a particularly prominent contemporary approach is the computational universe.
In this paper, building on previous works, we propose to go deeper into the understanding of crowd behavior by proposing an approach which integrates ontological models of crowd behavior and dedicated computer vision algorithms, with the aim of recognizing targeted complex events happening in the playground from the observation of spectator crowd behavior. In order to do that, we first propose an ontology encoding available knowledge on spectator crowd behavior, built as a specialization of the DOLCE foundational ontology, which allows the representation of categories belonging both to the physical and to the social realms. We then propose a simplified and tractable version of this ontology in a new temporal extension of a description logic, which is used for temporally coupling events happening on the playground and spectator crowd behavior. Finally, computer vision algorithms provide the input information concerning what is observed on the stands, and ontological reasoning delivers the output necessary to perform complex event recognition.
Detractors of Searle's Chinese Room Argument have arrived at a virtual consensus that the mental properties of the Man performing the computations stipulated by the argument are irrelevant to whether computational cognitive science is true. This paper challenges that virtual consensus in order to argue for the first of the two main theses of the persons reply, namely, that the mental properties of the Man are what matter. It does this by challenging many of the arguments and conceptions put forth by the systems and logical replies to the Chinese Room, either reducing them to absurdity or showing how they lead, on the contrary, to conclusions the persons reply endorses. The paper's position on the Chinese Room Argument is based on additional philosophical considerations, the foundations of the theory of computation, and theoretical and experimental psychology. The paper purports to show how all these dimensions tend to support the proposed thesis of the persons reply.
In the last decades a growing body of literature in Artificial Intelligence (AI) and Cognitive Science (CS) has approached the problem of narrative understanding by means of computational systems. Narrative, in fact, is a ubiquitous element in our everyday activity, and the ability to generate and understand stories, and their structures, is a crucial cue of our intelligence. However, despite the fact that, from a historical standpoint, narrative (and narrative structures) has been an important topic of investigation in both these areas, a more comprehensive approach coupling them with narratology, digital humanities and literary studies was still lacking. With the aim of filling this empty space, in the last years a multidisciplinary effort has been made to create an international meeting open to computer scientists, psychologists, digital humanists, linguists, narratologists, etc. This event has been named CMN (for Computational Models of Narrative) and was launched in 2009 by the MIT scholars Mark A. Finlayson and Patrick H. Winston.
Physical Computation is the summation of Piccinini’s work on computation and mechanistic explanation over the past decade. It draws together material from papers published during that time, but also provides additional clarifications and restructuring that make this the definitive presentation of his mechanistic account of physical computation. This review will first give a brief summary of the account that Piccinini defends, followed by a chapter-by-chapter overview of the book, before finally discussing one aspect of the account in more critical detail.
Arguments for extended cognition and the extended mind are typically directed at human-centred forms of cognitive extension—forms of cognitive extension in which the cognitive/mental states/processes of a given human individual are subject to a form of extended or wide realization. The same is true of debates and discussions pertaining to the possibility of Web-extended minds and Internet-based forms of cognitive extension. In this case, the focus of attention concerns the extent to which the informational and technological elements of the online environment form part of the machinery of the (individual) human mind. In this paper, we direct attention to a somewhat different form of cognitive extension. In particular, we suggest that the Web allows human individuals to be incorporated into the computational/cognitive routines of online systems. These forms of computational/cognitive extension highlight the potential of the Web and Internet to support bidirectional forms of computational/cognitive incorporation. The analysis of such bidirectional forms of incorporation broadens the scope of philosophical debates in this area, with potentially important implications for our understanding of the foundational notions of extended cognition and the extended mind.
In this paper, we deal with the computation of Lie derivatives, which are required, for example, in some numerical methods for the solution of differential equations. One common way of computing them is symbolic computation. Computer algebra software, however, might fail if the function is complicated, and cannot be used at all if an explicit formulation of the function is not available and we have only an algorithm for its computation. An alternative way to address the problem is automatic differentiation. In this case, we only need an implementation, in a programming language, of the algorithm that evaluates the function in terms of its analytic expression, but we cannot use this approach if we have only a compiled version of the function. In this paper, we present a novel approach for calculating the Lie derivative of a function, even in the case where its analytical expression is not available, based on the Infinity Computer arithmetic. A comparison with symbolic and automatic differentiation shows the potential of the proposed technique.
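For a scalar function h and vector field f, the first-order Lie derivative is L_f h(x) = ∇h(x) · f(x). The sketch below is my own minimal forward-mode automatic-differentiation example (dual numbers), showing how such a derivative can be computed from code alone without a symbolic expression; it is not the paper's Infinity Computer method.

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 = 0 (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def lie_derivative(h, f, x):
    """L_f h(x) = grad(h)(x) . f(x); the gradient is obtained by seeding
    one dual direction per coordinate."""
    fx = f(x)
    total = 0.0
    for i in range(len(x)):
        seeded = [Dual(xj, 1.0 if j == i else 0.0) for j, xj in enumerate(x)]
        total += h(seeded).dot * fx[i]
    return total

# Example: h(x, y) = sin(x) * y, f(x, y) = (y, x)
h = lambda v: sin(v[0]) * v[1]
f = lambda v: (v[1], v[0])
print(lie_derivative(h, f, [1.0, 2.0]))   # = y*cos(x)*y + sin(x)*x at (1, 2)
```

The call evaluates cos(1)*4 + sin(1), roughly 3.00, matching the hand-computed gradient-dot-field value; only an executable evaluation of h is needed, which is the advantage automatic differentiation has over symbolic computation here.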