The ordinance INDESP-IO4 of October 14, 1998, established the federal software certification and verification requirements for gaming machines in Brazil. The authors present the rationale behind these criteria, whose basic principles can be applied in several other software authentication settings.
This paper is concerned with the construction of theories of software systems yielding adequate predictions of their target systems’ computations. It is first argued that mathematical theories of programs are not able to provide predictions that are consistent with observed executions. Empirical theories of software systems are here introduced semantically, in terms of a hierarchy of computational models that are supplied by formal methods and testing techniques in computer science. Both deductive top-down and inductive bottom-up approaches to the discovery of semantic software theories are rejected, in favour of the abductive process of hypothesising and refining models at each level in the hierarchy, until they become satisfactorily predictive. Empirical theories of computational systems are required to be modular, since most software verification and testing activities are themselves modular. We argue that logic relations must thereby be defined among models representing different modules in a semantic theory of a modular software system. We argue that scientific structuralism is unable to define the module relations needed in software modular theories. The algebraic Theory of Institutions is finally introduced to specify the logic structure of modular semantic theories of computational systems.
Quantum computing is of high interest because it promises to perform at least some kinds of computations much faster than classical computers. Arute et al. 2019 (informally, “the Google Quantum Team”) report the results of experiments that purport to demonstrate “quantum supremacy” – the claim that the performance of some quantum computers is better than that of classical computers on some problems. Do these results close the debate over quantum supremacy? We argue that they do not. In the following, we provide an overview of the Google Quantum Team’s experiments, then identify some open questions in the quest to demonstrate quantum supremacy.
Over the last decade, multi-agent systems have come to form one of the key technologies for software development. The Formal Approaches to Multi-Agent Systems (FAMAS) workshop series brings together researchers from the fields of logic, theoretical computer science and multi-agent systems in order to discuss formal techniques for specifying and verifying multi-agent systems. FAMAS addresses the issues of logics for multi-agent systems, formal methods for verification, for example model checking, and formal approaches to cooperation, multi-agent planning, communication, coordination, negotiation, games, and reasoning under uncertainty in a distributed environment. In 2007, the third FAMAS workshop, FAMAS'007, was one of the agent workshops gathered together under the umbrella of Multi-Agent Logics, Languages, and Organisations - Federated Workshops, MALLOW'007, taking place from 3 to 7 September 2007 in Durham. This current special issue of the Logic Journal of the IGPL gathers together the revised and updated versions of the five best FAMAS'007 contributions.
Software is a ubiquitous artifact, yet not much has been done to understand its ontological nature. A few accounts of the nature of software have been offered so far. I argue that none of them gives a plausible picture of the nature of software. I draw attention to the striking similarities between software and musical works. These similarities motivate a closer look at the discussions regarding the nature of musical works. With the lessons drawn from the ontology of musical works, I offer a novel account of the nature of software. On this account, software is an abstract artifact. I elaborate on the conditions under which software comes into existence, how it persists, and how and on which entities its existence depends.
Every person has his/her own unique signature that is used mainly for the purposes of personal identification and the verification of important documents or legal transactions. There are two kinds of signature verification: static and dynamic. Static (off-line) verification is the process of verifying an electronic or document signature after it has been made, while dynamic (on-line) verification takes place as a person creates his/her signature on a digital tablet or a similar device. Offline signature verification is inefficient and slow for a large number of documents. To overcome the drawbacks of offline signature verification, we have seen a growth in online biometric personal verification methods such as fingerprints and eye scans. In this paper we create a CNN model for offline signatures using Python; after training and validation, the testing accuracy was 99.70%.
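A minimal sketch of the kind of binary genuine-versus-forged classifier the paper describes, written in Python with Keras. The input size, layer sizes, and training call below are illustrative assumptions, not the authors' reported architecture, and the 99.70% accuracy is their result, not this sketch's.

```python
# Sketch of a CNN for offline signature verification (genuine vs. forged).
# Input shape and layers are illustrative assumptions, not the paper's model.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_signature_cnn(input_shape=(150, 220, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),        # grayscale signature image
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                    # regularize a small dataset
        layers.Dense(1, activation="sigmoid"),  # genuine (1) vs. forged (0)
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_signature_cnn()
model.summary()
# With image arrays and labels prepared, training would look like:
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```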
Interactive theorem provers might seem particularly impractical in the history of philosophy. Journal articles in this discipline are generally not formalized. Interactive theorem provers involve a learning curve for which the payoffs might seem minimal. In this article I argue that interactive theorem provers have already demonstrated their potential as a useful tool for historians of philosophy; I do this by highlighting examples of work where this has already been done. Further, I argue that interactive theorem provers can continue to be useful tools for historians of philosophy in the future; this claim is defended through a more conceptual analysis of what historians of philosophy do that identifies argument reconstruction as a core activity of such practitioners. It is then shown that interactive theorem provers can assist in this core practice by a description of what interactive theorem provers are and can do. If this is right, then computer verification for historians of philosophy is in the offing.
This article explores some open questions related to the problem of the verification of theories in the context of the empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a) Neo-empiricism and the gambling metaphor; (b) Popperian falsificationism and the scientific tribunal metaphor; (c) Cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method for testing scientific hypotheses, respectively: (a) decision-theoretic Bayesian statistics and Bayes factors; (b) frequentist statistics and p-values; (c) constructive Bayesian statistics and e-values. This article examines with special care the Zero Probability Paradox (ZPP), related to the verification of sharp or precise hypotheses. Finally, it makes some remarks on Lakatos’ view of mathematics as a quasi-empirical science.
The new phase of the evolution of science is characterized by the merging of the subject and object of cognition and technology (high-hume). As a result, modern «human-dimensional» natural sciences exhibit a network structure in their disciplinary matrix, together with the formation of several paradigmatic «nuclei» (attractors); the structure of the disciplinary matrix thus becomes more complex. In the process of social verification, scientific theories are integrated into the existing system of mental and value options of spiritual culture. The attribute of classical science – the ethical neutrality of scientific knowledge – becomes an unattainable ideal. One trend in the evolution of theoretical epistemology is the study of the mechanisms by which the generation of scientific knowledge migrates from the sphere of the logic and methodology of science proper into the field of sociology – the consideration of this process as the result of a system of interactions among social structures and institutions. The resulting ideas and attitudes become the dominant worldview of philosophical and technological civilization.
Artefacts do not always do what they are supposed to, due to a variety of reasons, including manufacturing problems, poor maintenance, and normal wear-and-tear. Since software is an artefact, it should be subject to malfunctioning in the same sense in which other artefacts can malfunction. Yet, whether software is on a par with other artefacts when it comes to malfunctioning crucially depends on the abstraction used in the analysis. We distinguish between “negative” and “positive” notions of malfunction. A negative malfunction, or dysfunction, occurs when an artefact token either does not or cannot do what it is supposed to. A positive malfunction, or misfunction, occurs when an artefact token may do what it is supposed to but, at least occasionally, also yields some unintended and undesirable effects. We argue that software, understood as type, may misfunction in some limited sense, but cannot dysfunction. Accordingly, one should distinguish software from other technical artefacts, in view of a design that makes dysfunction impossible for the former, while possible for the latter.
Software piracy is older than the PC and has been the subject of several studies, which have found it to be a widespread phenomenon in general, and among university students in particular. An earlier study by Cohen and Cornwell from a decade ago is replicated, adding questions about downloading music from the Internet. The survey includes responses from 224 students in entry-level courses at two schools, a nondenominational suburban university and a Catholic urban college with similar student profiles. The study found that there have been few if any changes in student opinions regarding the unauthorized duplication of copyrighted materials. Students generally felt that copying commercial software and downloading music from the Internet was acceptable, and there was no significant correlation between student attitudes and their school’s religious affiliation or lack thereof. Additionally, the study found that a small but significant percentage of respondents considered other questionable behaviors ethically acceptable. Finally, the reasons for these attitudes are discussed, as well as what colleges can do to correct the situation.
The integration of information resources in the life sciences is one of the most challenging problems facing bioinformatics today. We describe how Language and Computing nv, originally a developer of ontology-based natural language understanding systems for the healthcare domain, is developing a framework for the integration of structured data with unstructured information contained in natural language texts. L&C’s LinkSuite™ combines the flexibility of a modular software architecture with an ontology based on rigorous philosophical and logical principles that is designed to comprehend the basic formal relationships that structure both reality and the ways humans perceive and communicate about reality.
In this section, we will start with an influential attempt to define ‘intelligence’, and then we will move to a consideration of how human intelligence is to be investigated on the machine model. The last part of the section will discuss the relation between the mental and the biological.
Computational chemistry grew in a new era of “desktop modeling,” which coincided with a growing demand for modeling software, especially from the pharmaceutical industry. Parameterization of models in computational chemistry is an arduous enterprise, and we argue that this activity leads, in this specific context, to tensions among scientists regarding the epistemic opacity or transparency of parameterized methods and the software implementing them. We relate one flame war from the Computational Chemistry mailing list in order to assess in detail the relationships between modeling methods, parameterization, software and the various forms of their enclosure or disclosure. Our claim is that parameterization issues are an important and often neglected source of epistemic opacity and that this opacity is entangled in methods and software alike. Models and software must be addressed together to understand the epistemological tensions at stake.
Plagiarism is malpractice: the use of others’ “ideas or work” published without the proper permission or citation of the original contributors. Plagiarism is detected with software such as Turnitin before any research data are published. The present survey study assesses whether academicians, researchers, and scholars around the world perceive this software as a creator or a destroyer of new thoughts and ideas. The survey was conducted with academicians, researchers, and scholars around the globe; there were 1100 respondents, including 688 teaching professionals, 347 non-teaching professionals, and 65 others. The present study finds that 82.7 per cent of research professionals mentioned that plagiarism can be avoided by appropriate citation, 76.7 per cent suggested that plagiarism is completely avoidable, and 72.4 per cent proposed that it be punishable. The study also finds that plagiarism software is perceived as a good, efficient, and effective creator of new ideas.
Interactions between an intelligent software agent (ISA) and a human user are ubiquitous in everyday situations such as access to information, entertainment, and purchases. In such interactions, the ISA mediates the user’s access to the content, or controls some other aspect of the user experience, and is not designed to be neutral about the outcomes of user choices. Like human users, ISAs are driven by goals, make autonomous decisions, and can learn from experience. Using ideas from bounded rationality, we frame these interactions as instances of an ISA whose reward depends on actions performed by the user. Such agents benefit by steering the user’s behaviour towards outcomes that maximise the ISA’s utility, which may or may not be aligned with that of the user. Video games, news recommendation engines, and fitness trackers can all be instances of this general case. Our analysis facilitates distinguishing various subcases of interaction, as well as second-order effects that might include the possibility for adaptive interfaces to induce behavioural addiction and/or change in user belief. We present these types of interaction within a conceptual framework, and review current examples of persuasive technologies and the issues that arise from their use. We argue that the nature of the feedback commonly used by learning agents to update their models and subsequent decisions could steer the behaviour of human users away from what benefits them, and in a direction that can undermine autonomy and cause further disparity between actions and goals, as exemplified by addictive and compulsive behaviour. We discuss some of the ethical, social and legal implications of this technology and argue that it can sometimes exploit and reinforce weaknesses in human beings.
Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that communities of open-source software continue to rely mainly on hierarchy (reserving write-access for higher echelons), which substitutes for (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. Trust (i.e. full write-access) is ‘backgrounded’ by means of a permanent mobilization of Wikipedians to monitor incoming edits. Computational approaches have been developed for the purpose, yielding both sophisticated monitoring tools that are used by human patrollers, and bots that operate autonomously. Measures of reputation are also under investigation within Wikipedia; their incorporation in monitoring efforts, as an indicator of the trustworthiness of editors, is envisaged. These collective monitoring efforts are interpreted as focusing on avoiding possible damage being inflicted on Wikipedian spaces, thereby allowing the discretionary powers of editing to be kept intact for all users. Further, the essential differences between backgrounding and substituting trust are elaborated. Finally it is argued that the Wikipedian monitoring of new edits, especially in its heavy reliance on computational tools, raises a number of moral questions that need to be answered urgently.
Hacker communities of the 1970s and 1980s developed a quite characteristic work ethos. Its norms are explored and shown to be quite similar to those which Robert Merton suggested govern academic life: communism, universalism, disinterestedness, and organized scepticism. In the 1990s the Internet multiplied the scale of these communities, allowing them to create successful software programs like Linux and Apache. After renaming themselves the ‘open source software’ movement, with an emphasis on software quality, they succeeded in gaining corporate interest. As one of the main results, their ‘open’ practices have entered industrial software production. The resulting clash of cultures, between the more academic CUDOS norms and their corporate counterparts, is discussed and assessed. In all, the article shows that software practices are a fascinating seedbed for the genesis of work ethics of various kinds, depending on their societal context.
Computational reproducibility possesses its own dynamics and narratives of crisis. Alongside the difficulties of computing as a ubiquitous yet complex scientific activity, computational reproducibility suffers from a naive expectation of total reproducibility and a moral imperative to embrace the principles of free software as a non-negotiable epistemic virtue. We argue that the epistemic issues at stake in actual practices of computational reproducibility are best unveiled by focusing on software as a pivotal concept, one that is surprisingly often overlooked in accounts of reproducibility issues. Software is not only about designing and coding but also about maintaining, supporting, distributing, licensing, and governance; it is not only about developers but also about users. We focus on openness debates among computational chemists involved in molecular modeling software packages as empirical grounding for our argument. We then identify and analyse four epistemic characteristics as key to the role of software in computational reproducibility.
English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them, its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the ‘coactivity’ in use between humans and bots, this research ‘discloses’ the moral issues that emerge from the combined patrolling by humans and bots. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical layer. Further, surveillance exhibits several troubling features: questionable profiling practices, the use of the controversial measure of reputation, ‘oversurveillance’ where quantity trumps quality, and a prospective loss of the required moral skills whenever bots take over from humans. The most troubling aspect, though, is that Wikipedia has become a Janus-faced institution. One face is the basic platform of MediaWiki software, transparent to all. Its other face is the anti-vandalism system, which, in contrast, is opaque to the average user, in particular as a result of the algorithms and neural networks in use. Finally it is argued that this secrecy impedes a much needed discussion from unfolding; a discussion that should focus on a ‘rebalancing’ of the anti-vandalism system and the development of more ethical information practices towards contributors.
A. J. Ayer’s empiricist criterion of meaning was supposed to have sorted all statements into nonsense on the one hand, and tautologies or genuinely factual statements on the other. Unfortunately for Ayer, it follows from classical logic that his criterion is trivial—it classifies all statements as either tautologies or genuinely factual, but none as nonsense. However, in this paper, I argue that Ayer’s criterion of meaning can be defended from classical proofs of its triviality by the adoption of a relevant logic—an idea which is motivated because, according to Ayer, the genuinely factual statements are those which observation is relevant to.
Very often, we count on science to save us. The relationship we establish with it can suggest that science speaks directly of things, that it validates something like immediate evidence. Scientific knowledge is inferential. If it has an object, which strictly speaking philosophy does not, it must nevertheless distance itself from that object in order to become science. It thus validates abstract schemata, which are scientific only to that extent. We will take the measure of the fact that, in helping us recover the concrete, intuition is a better guide than science; we will then call upon certain ideas of the algebraist and metaphysician Alfred North Whitehead, whose continuing relevance never ceases to astonish, when he inverts direct access to the real, against the received vulgate concerning the relations between science and religion.
We investigate claims about the frequency of "know" made by philosophers. Our investigation has several overlapping aims. First, we aim to show what is required to confirm or disconfirm philosophers’ claims about the comparative frequency of different uses of philosophically interesting expressions. Second, we aim to show how using linguistic corpora as tools for investigating meaning is a productive methodology, in the sense that it yields discoveries about the use of language that philosophers would have overlooked if they remained in their "armchairs of an afternoon", to use J.L. Austin’s phrase. Third, we discuss facts about the meaning of "know" that so far have been ignored in philosophy, with the aim of reorienting discussions of the relevance of ordinary language for philosophical theorizing.
Software patents are commonly criticised for being fuzzy, context-sensitive, and often granted for trivial inventions. More often than not, these shortcomings are said to be caused by the abstract nature of software, with little further analysis offered. Drawing on Plato’s Parmenides, this paper will argue (1) that the reason why software patents seem to be elusive is that patent law suggests thinking about algorithms in terms of paradigmatic examples, and (2) that Plato’s distinction between two modes of predication and the role of competence in his account of knowledge are helpful not only for conceptualising knowledge of algorithms, but also for understanding the limits of software patent regimes.
The comprehensive nature of the information and insight available in the Upanishads, the Indian philosophical systems such as Advaita philosophy, Sabdabrahma Siddhanta, Sphota Vada and the Shaddarsanas, in relation to the ideas of human consciousness, mind and its functions, cognitive science and the scheme of human cognition and communication, is presented. All this is highlighted with a vivid classification of the conscious, cognitive and functional states of mind; by differentiating cognition as a combination of cognitive agent, cognizing element and cognized element; by the formation, form and structure of cognition; the instruments and means of cognition; the validity of cognition; and the nature of the energy/matter which facilitates, and is also the medium of, the cognition-cognizing process, as well as the container and content of cognition. The human communication process, which is just the reverse of the cognizing process, is also presented with the necessary description and detail. The sameness of the cognitive/communicative process during language acquisition and communication, and the modes of language acquisition and communication, are also given. The hardware and software of human cognition, language acquisition and communication as envisaged in Indian spiritual and philosophical expressions are given. In the light of the information and insight obtained as above, axioms for human cognition and communication are formed, framed and presented. A brain-wave frequency modulation/demodulation model of human cognition, communication and the language acquisition process, based on Upanishadic expressions, the Shaddarsanas and Sabdabrahma Siddhanta, is defined and discussed here. The use of these theories, models and axioms in mind-machine modelling and in the natural language comprehension branch of artificial intelligence is put forward and highlighted.
According to Karl Popper, science cannot verify its theories empirically, but it can falsify them, and that suffices to account for scientific progress. For Popper, a law or theory remains a pure conjecture, with probability equal to zero, however massively corroborated empirically it may be. But it just does seem to be the case that science does empirically verify laws and theories. We trust our lives to such verifications when we fly in aeroplanes, cross bridges and take modern medicines. We can do some justice to this apparent capacity of science to verify if we make a number of improvements to Popper’s philosophy of science. The key step is to recognize that physics, in accepting unified theories only, thereby makes a big metaphysical assumption about the nature of the universe. The outcome is a conception of scientific method which facilitates the criticism and improvement of the metaphysical assumptions of physics. This view provides, not verification, but a perfect simulacrum of verification, indistinguishable from the real thing.
The terms ‘verification’ and ‘validation’ are widely used in science, both in the natural and the social sciences. They are extensively used in simulation, often associated with the need to evaluate models in different stages of the simulation development process. Frequently, terminological ambiguities arise when researchers conflate, along the simulation development process, the technical meanings of both terms with other meanings found in the philosophy of science and the social sciences. This article considers the problem of verification and validation in social science simulation along five perspectives: the reasons to address terminological issues in simulation; the meaning of the terms in the philosophical sense of the problem of “truth”; the observation that some debates about these terms in simulation are inadvertently more terminological than epistemological; the meaning of the terms in the technical context of the simulation development process; and finally, a comprehensive outline of the relation between terminology used in simulation, different types of models used in the development process and different epistemological perspectives.
Modern computing is generally taken to consist primarily of symbol manipulation. But symbols are abstract, and computers are physical. How can a physical device manipulate abstract symbols? Neither Church nor Turing considered this question. My answer is that the bit, as a hardware-implemented abstract data type, serves as a bridge between materiality and abstraction. Computing also relies on three other primitive—but more straightforward—abstractions: Sequentiality, State, and Transition. These physically-implemented abstractions define the borderline between hardware and software and between physicality and abstraction. At a deeper level, asking how a physical device can interact with abstract symbols is the wrong question. The relationship between symbols and physical devices begins with the realization that human beings already know what it means to manipulate symbols. We build and program computers to do what we understand to be symbol manipulation. To understand what that means, consider a light switch. A light switch doesn’t turn a light on or off. Those are abstractions. Light switches don’t operate with abstractions. We build light switches so that, when flipped, the world is changed in such a way that we understand the light to be on or off. Similarly, we build computers to perform operations that we understand as manipulating symbols.
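The paper's central notion can be made concrete with an illustrative Python sketch (not from the paper): a bit modelled as an abstract data type. Client code touches only the abstract interface, while the hidden "physical" representation behind it is arbitrary and replaceable, which mirrors the light-switch point that interpretation is what makes a physical change count as symbol manipulation.

```python
# Illustrative sketch: the bit as an abstract data type. The class names and
# the voltage representation are hypothetical, chosen only to show the idea.
class Bit:
    """Abstract interface: the only operations clients may use."""
    def set(self): ...
    def clear(self): ...
    def flip(self): ...
    def read(self) -> int: ...

class VoltageBit(Bit):
    """One hypothetical 'hardware' realization: a stored voltage level."""
    def __init__(self):
        self._volts = 0.0              # physical state, invisible to clients
    def set(self):   self._volts = 5.0
    def clear(self): self._volts = 0.0
    def flip(self):  self._volts = 0.0 if self._volts > 2.5 else 5.0
    def read(self) -> int:
        # The act of interpretation: a voltage becomes the symbol 0 or 1.
        return 1 if self._volts > 2.5 else 0

b = VoltageBit()
b.set(); b.flip()
print(b.read())   # -> 0: we *understand* the physical change as symbol manipulation
```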
According to the Argument for Autonomous Mental Disorder (AAMD), mental disorder can occur in the absence of brain disorder, just as software problems can occur in the absence of hardware problems in a computer. This paper argues that the AAMD is unsound. I begin by introducing the ‘natural dysfunction analysis’ of disorder, before outlining the AAMD. I then analyse the necessary conditions for realiser-autonomous dysfunction. Building on this, I show that software functions dissociate from hardware functions in a way that mental functions do not dissociate from brain functions. It follows that mental disorders are necessarily brain disorders.
A very widespread picture for understanding computer software represents it as a “black box”: what really matters is not which parts compose it internally, but which results are obtained from it given certain input values. In doing so, many philosophical problems are hidden, denied, or simply misunderstood. This article discusses three units of analysis of computer software, namely specifications, algorithms, and computational processes. The central aim is to understand the scientific and engineering practices behind each unit of software, as well as to analyse their methodology, ontology, and epistemology.
Two property regimes for software development may be distinguished. Within corporations, on the one hand, a Private Regime obtains which excludes all outsiders from access to a firm's software assets. It is shown how the protective instruments of secrecy and both copyright and patent have been strengthened considerably during the last two decades. On the other hand, a Public Regime among hackers may be distinguished, initiated by individuals, organizations or firms, in which source code is freely exchanged. It is argued that copyright is put to novel use here: claiming their rights, authors write ‘open source licenses’ that allow public usage of the code, while at the same time regulating the inclusion of users. A ‘regulated commons’ is created. The analysis focuses successively on the most important open source licenses to emerge, the problem of possible incompatibility between them (especially as far as the dominant General Public License is concerned), and the fragmentation into several user communities that may result.
The study was conducted to compare manual and computerized software techniques of data management and analysis in educational research. Specifically, the study investigated whether there was a significant difference in the results of Pearson correlation, independent t-test and ANOVA obtained from using manual and computerized software techniques of data analysis. Three null hypotheses were formulated accordingly to guide the study. The study adopted a quasi-experimental research design in which several datasets were generated by the researchers and analyzed using manual and computerized software techniques. The data were generated to suit the required data of each statistical method of analysis. A CASIO fx-991ES PLUS NATURAL DISPLAY scientific calculator and statistical tables were used for manual analysis, while the data analysis tool pack of Microsoft Excel version 2013 was used for computerized software analysis. The results of the analysis revealed that both manual and computerized software techniques yielded the same results for Pearson correlation, independent t-test and ANOVA. It was concluded that though both manual and computerized techniques are reliable and dependable, the computerized technique is faster and more efficient in managing and analyzing data than the manual technique. It was recommended, among other things, that either technique should be used without fear when computing Pearson correlation, independent t-test and one-way ANOVA, as the same results will be obtained.
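For concreteness, the three analyses the study compares can also be computed with SciPy; on any fixed dataset, the calculator route, Excel's Analysis ToolPak, and the sketch below should agree to rounding. The data here are placeholders, not the researchers' generated datasets.

```python
# Sketch of the three statistics the study compares, computed with SciPy.
# The numbers below are placeholder data for illustration only.
from scipy import stats

x = [12, 15, 14, 10, 18, 20, 16]
y = [22, 27, 25, 18, 30, 33, 28]

r, p_r = stats.pearsonr(x, y)        # Pearson correlation coefficient
t, p_t = stats.ttest_ind(x, y)       # independent-samples t-test

g1, g2, g3 = [5, 7, 6, 9], [8, 10, 9, 11], [12, 14, 13, 15]
f, p_f = stats.f_oneway(g1, g2, g3)  # one-way ANOVA across three groups

print(f"Pearson r = {r:.4f} (p = {p_r:.4f})")
print(f"t = {t:.4f} (p = {p_t:.4f})")
print(f"F = {f:.4f} (p = {p_f:.4f})")
```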
The scalar approach to negative polarity item (NPI) licensing assumes that NPIs are allowable in contexts in which the introduction of the NPI leads to proposition strengthening (e.g., Kadmon & Landman 1993, Krifka 1995, Lahiri 1997, Chierchia 2006). A straightforward processing prediction from such a theory is that NPIs facilitate inference verification from sets to subsets. Three experiments are reported that test this proposal. In each experiment, participants evaluated whether inferences from sets to subsets were valid. Crucially, we manipulated whether the premises contained an NPI. In Experiment 1, participants completed a metalinguistic reasoning task, and Experiments 2 and 3 tested reading times using a self-paced reading task. Contrary to expectations, no facilitation was observed when the NPI was present in the premise compared to when it was absent. In fact, the NPI significantly slowed down reading times in the inference region. Our results therefore favor those scalar theories that predict that the NPI is costly to process (Chierchia 2006), or other, nonscalar theories (Giannakidou 1998, Ladusaw 1992, Postal 2005, Szabolcsi 2004) that likewise predict NPI processing cost but, unlike Chierchia (2006), expect the magnitude of the processing cost to vary with the actual pragmatics of the NPI.
In the last two decades, software process modeling has been an area of interest within both academia and industry. Software process modeling aims at defining and representing software processes in the form of models. A software process model represents the medium that allows better understanding, management and control of the software process. Software process metamodeling, rather, provides standard metamodels that enable the definition of customized software process models for a specific project at hand by instantiation. Several software process modeling/meta-modeling languages have been introduced to formalize software process models. Nonetheless, none of them has managed to introduce a compatible yet precise language that includes all necessary concepts and information for software process modeling. This paper presents Software Process Meta-Modeling and Notation (SP2MN), a meta-modeling language that provides simple and expressive graphical notations for the aim of software process modeling. SP2MN has been evaluated based upon the well-known ISPW-6 process example, a standard benchmark problem for software process modeling. SP2MN has proved to be a valid and expressive software process modeling language.
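SP2MN itself is a graphical notation, but the metamodel-to-model relationship the paper builds on can be illustrated in code. The Python sketch below shows instantiation only, with hypothetical concept names (Role, Artifact, Activity) rather than SP2MN's actual metamodel elements; the activity name loosely echoes the ISPW-6 example.

```python
# Illustrative sketch of metamodel -> model instantiation, not SP2MN itself.
# The metamodel level fixes the concepts; a concrete process model for a
# given project is obtained by instantiating those concepts.
from dataclasses import dataclass, field

# --- metamodel level: concepts every process model is built from ---
@dataclass
class Role:
    name: str

@dataclass
class Artifact:
    name: str

@dataclass
class Activity:
    name: str
    performed_by: Role
    consumes: list = field(default_factory=list)
    produces: list = field(default_factory=list)

# --- model level: one customized process, by instantiation ---
designer = Role("Designer")
spec     = Artifact("Requirements spec")
design   = Artifact("Design document")

modify_design = Activity("Modify design", performed_by=designer,
                         consumes=[spec], produces=[design])

print(f"{modify_design.name}: {modify_design.consumes[0].name} -> "
      f"{modify_design.produces[0].name} ({modify_design.performed_by.name})")
```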
We comment on some recent results obtained by using a Clifford bare bone skeleton of quantum mechanics in order to formulate the conclusion that quantum mechanics has its origin in logic and relates conceptual entities. Such results directly touch the basic problem about the structure of our cognitive and conceptual dynamics and thus of our mind. The problem of exploring consciousness consequently turns out to be strongly linked to them. This is the reason why studies on quantum mechanics applied to this matter are so important for neurologists and psychologists. Under this profile, we present some experimental results showing violation of the Bell inequality during the MBTI test in an investigation of C.G. Jung’s theory of personality.
B. Plawgo, A. Grabska, M. Klimczuk-Kochańska, A. Klimczuk, J. Kierklo, J. Żynel-Etel, Startery podlaskiej gospodarki. Analiza gospodarczych obszarów wzrostu i innowacji województwa podlaskiego: sektor produkcji oprogramowania komputerowego (Podlasie economy starters. Analysis of economic growth and innovation areas of Podlaskie: the software production sector), Wojewódzki Urząd Pracy w Białymstoku, Białystok 2011.
Although successive generations of digital technology have become increasingly powerful in the past 20 years, digital democracy has yet to realize its potential for deliberative transformation. The undemocratic exploitation of massive social media systems continued this trend, but it only worsened an existing problem of modern democracies, which were already struggling to develop deliberative infrastructure independent of digital technologies. There have been many creative conceptions of civic tech, but implementation has lagged behind innovation. This article argues for implementing one such vision of digital democracy through the establishment of a public corporation. Modeled on the Corporation for Public Broadcasting in the United States, this entity would foster the creation of new digital technology by providing a stable source of funding to nonprofit technologists, interest groups, civic organizations, government, researchers, private companies, and the public. Funded entities would produce and maintain software infrastructure for public benefit. The concluding sections identify what circumstances might create and sustain such an entity.
We create a novel network model in which mobile ad hoc network (MANET) nodes and actors in wireless sensor networks collaborate on event processing. There are two stages in the development of the distributed algorithms: setup and negotiation. The first uses weighted proportional max-min fairness to initially allocate MANET nodes across event zones, whereas the latter uses a market-based method to re-distribute the number of MANET nodes based on existing and new events. We also propose a detection technique for malicious packet dropping attacks in MANETs. The mechanism of the suggested approach is a Collaborative Convolutional Neural Network (CCNN), which is based on the reputation value computed for a node by its neighbours. A node's reputation value is determined by its packet-forwarding behaviour in the network. The reputation information is collected, saved, and transferred between nodes before being calculated under various scenarios. A network simulator was used to test the proposed protocol, and the simulation results demonstrate the effectiveness of its performance. Even in the presence of cryptographic procedures, our method incurs negligible network bandwidth and latency costs. Moreover, we demonstrate that the protection remains effective in the presence of misbehaving nodes and routing changes caused by mobility. While further research is needed to thoroughly evaluate our method, we believe that the concept of collaborative security in MANETs is a promising future area.
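A minimal Python sketch of the reputation idea the abstract describes: a node's reputation is derived from its observed packet-forwarding behaviour and can be shared among neighbours. The update rule and threshold here are illustrative assumptions; the paper's CCNN classifier and simulator experiments are not reproduced.

```python
# Sketch of reputation-based suspicion of packet-dropping nodes. The ratio
# rule and the drop threshold are illustrative assumptions, not the paper's.
from collections import defaultdict

class ReputationTable:
    def __init__(self, drop_threshold=0.4):
        self.forwarded = defaultdict(int)  # packets the node was seen forwarding
        self.received  = defaultdict(int)  # packets handed to the node to forward
        self.drop_threshold = drop_threshold

    def observe(self, node, was_forwarded):
        """Record one forwarding opportunity observed by a neighbour."""
        self.received[node] += 1
        if was_forwarded:
            self.forwarded[node] += 1

    def reputation(self, node):
        if self.received[node] == 0:
            return 1.0                     # no evidence yet: trust by default
        return self.forwarded[node] / self.received[node]

    def is_suspected_dropper(self, node):
        return self.reputation(node) < 1 - self.drop_threshold

table = ReputationTable()
for _ in range(8):  table.observe("n7", was_forwarded=True)
for _ in range(12): table.observe("n7", was_forwarded=False)
print(table.reputation("n7"), table.is_suspected_dropper("n7"))  # 0.4 True
```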
Karl Popper rightly contests the possibility of a verification of basic statements. At the same time he strictly believes in the possibility of the growth of empirical knowledge. Knowledge growth, however, is only possible if empirical theories can be falsified. This raises the question of how theories can be falsified if a verification of those statements that falsify theories – i.e. basic statements – is not possible. This problem is often referred to as the “basic problem” or “problem of the empirical basis”. In this paper I show that – from a logical point of view – a falsification of theories is possible without a verification of basic statements. Furthermore I show that knowledge growth in the empirical sciences is possible if two assumptions are valid. These assumptions can neither be proven nor falsified. However, they have to be postulated by everybody in everyday life.
Sabdabrahma Siddhanta, popularized by Patanjali and Bhartruhari, will be scientifically analyzed. Sphota Vada, proposed and nurtured by the Sanskrit grammarians, will be interpreted from modern physics and communication engineering points of view. The insight about the theory of language and the modes of language acquisition and communication available in the Brahma Kanda of Vakyapadeeyam will be translated into modern computational terms. A flowchart of language processing in humans will be given. A gross model of the human language acquisition, comprehension and communication process, forming the basis for developing software for relevant mind-machine modeling, will be presented. The implications of such a model for artificial intelligence and the cognitive sciences will be discussed. The essentiality and necessity of physics, communication engineering, biophysical and biochemical insight as both complementary and supplementary to the use of mathematical and computational methods in delineating the theory of the Sanskrit language is put forward. Natural language comprehension, as distinct and different from natural language processing, is pointed out.
The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, the analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets, and may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST (Full Bayesian Significance Test), whether there is a relevant price difference between essentially the same financial asset traded at two distinct locations. Specifically, we investigate and compare the behavior of call options on the BOVESPA Index traded at (a) the Equities Segment and (b) the Derivatives Segment of BM&FBovespa. Our results seem to point out significant statistical differences. To what extent this evidence is actually the expression of perennial arbitrage opportunities is still an open question.
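As a hedged illustration of the paper's main tool, the FBST e-value can be estimated by Monte Carlo on a toy problem (a normal mean with the sharp hypothesis H: mu = 0) rather than on the paper's Variance Gamma option-price model. The e-value supporting H is one minus the posterior probability of the tangential set, the region where the posterior density exceeds its maximum over H.

```python
# Toy FBST e-value sketch: normal mean, known variance, sharp H: mu = 0.
# The data, prior, and model are placeholders, not the paper's setup.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=0.3, scale=1.0, size=50)   # placeholder observations

# Conjugate posterior for mu with known sigma^2 = 1 and a N(0, 10^2) prior.
sigma2, prior_var = 1.0, 100.0
n, xbar = len(data), data.mean()
post_var  = 1.0 / (1.0 / prior_var + n / sigma2)
post_mean = post_var * (n * xbar / sigma2)
posterior = stats.norm(post_mean, np.sqrt(post_var))

# Sharp hypothesis H: mu = 0, so the H-constrained density maximum is at 0.
f_star = posterior.pdf(0.0)

# Monte Carlo estimate of the tangential set's posterior probability.
draws = posterior.rvs(size=100_000, random_state=rng)
ev_against = np.mean(posterior.pdf(draws) > f_star)
print("e-value supporting H:", 1.0 - ev_against)
```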
Scientists hope that in the future they will be able to test black holes by observing the effects caused by a strong nearby gravitational field, such as gravitational lensing. There are already observations of weak gravitational lensing, in which light rays are deflected by only a few arcseconds, but never directly for a black hole. There are several candidates for such an effect in orbit around Sagittarius A*. Several ad hoc conjectures have been introduced to better explain the observations of astronomical black hole candidates that are identical but have different operating mechanisms: the gravastar, the black star (semiclassical gravity), the dark-energy star, etc. DOI: 10.13140/RG.2.2.27478.06723.
The article considers how the results of quantum experiments can change metaphysical conceptions of reality. Experimental tests of the Bell, Leggett, and Leggett-Garg inequalities, as well as delayed-choice and quantum-eraser experiments, confirm that classical realism must be abandoned for quantum objects. However, the competition between quantum anti-realism and quantum realism continues.
This is an explanation of a possible new insight into the halting problem, provided in the language of software engineering. Technical computer science terms are explained using software engineering terms. No knowledge of the halting problem is required. It is based on fully operational software executed in the x86utm operating system. The x86utm operating system (based on an excellent open source x86 emulator) was created to study the details of the halting problem proof counter-examples at the much higher level of abstraction of C/x86.
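For readers without the x86utm background, the classical counterexample construction that this work studies can be sketched in a few lines of Python instead of C/x86; `halts` stands in for any claimed halt decider and is, of course, hypothetical.

```python
# The classical counterexample at the heart of the halting problem proof,
# sketched in Python. `halts` is a hypothetical halt decider; `d` is built
# to contradict whatever `halts` predicts about it.
def halts(program, arg) -> bool:
    """Hypothetical: returns True iff program(arg) halts.
    No correct implementation can exist; this stub is for illustration."""
    raise NotImplementedError

def d(program):
    # Do the opposite of whatever the decider predicts about program(program).
    if halts(program, program):
        while True:          # predicted to halt -> loop forever
            pass
    return                   # predicted to loop -> halt immediately

# Asking the decider about d applied to itself is self-contradictory:
# halts(d, d) == True  forces d(d) to loop forever, and
# halts(d, d) == False forces d(d) to halt.
```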
The utopian character of modern scientific theories that take human nature as their subject is an inevitable consequence of the presence of an imperative component in transdisciplinary, human-dimensional scientific knowledge. Its social function is the adaptation of the descriptive component of the theory to the given socio-cultural type, which eases the process of the theory's social verification. The genesis of bioethics can be seen as one of the basic premises for the actualization of the anthropic principle in ontology, which thereby acquires axiological and epistemological sense.