Results for 'data standards'

980 found
  1. (1 other version) Open data, open review and open dialogue in making social sciences plausible. Quan-Hoang Vuong - 2017 - Nature: Scientific Data Updates 2017.
    Nowadays, protecting trust in social sciences also means engaging in open community dialogue, which helps to safeguard robustness and improve efficiency of research methods. The combination of open data, open review and open dialogue may sound simple but implementation in the real world will not be straightforward. However, in view of Begley and Ellis’s (2012) statement that, “the scientific process demands the highest standards of quality, ethics and rigour,” they are worth implementing. More importantly, they are feasible to (...)
    16 citations
  2. HL7 RIM: An incoherent standard. Barry Smith & Werner Ceusters - 2006 - Studies in Health Technology and Informatics 124 (Proceedings of MIE 2006):133–138.
    The Health Level 7 Reference Information Model (HL7 RIM) is lauded by its authors as ‘the foundation of healthcare interoperability’. Yet even after some 10 years of development work, the RIM is still subject to a variety of logical and ontological flaws which have placed severe obstacles in the way of those who are called upon to develop implementations. We offer evidence that these obstacles are insurmountable and that the time has come to abandon an unworkable paradigm.
    13 citations
  3. The Relationship between Performance Standards and Achieving the Objectives of Supervision at the Islamic University in Gaza. Ashraf A. M. Salama, Mazen Al Shobaki, Samy S. Abu-Naser, Abed Alfetah M. AlFerjany & Youssef M. Abu Amuna - 2018 - International Journal of Engineering and Information Systems (IJEAIS) 1 (10):89-101.
    The aim of the research is to identify the relationship between the performance criteria and the achievement of the objectives of supervision which is represented in the performance of the job at the Islamic University in Gaza Strip. To achieve the objectives of the research, the researchers used the descriptive analytical approach to collect information. The questionnaire consisted of (22) paragraphs distributed to three categories of employees of the Islamic University (senior management, faculty members, their assistants and members of the (...)
    13 citations
  4. Varying Evidential Standards as a Matter of Justice. Ahmad Elabbar - forthcoming - British Journal for the Philosophy of Science.
    The setting of evidential standards is a core practice of scientific assessment for policy. Persuaded by considerations of inductive risk, philosophers generally agree that the justification of evidential standards must appeal to non-epistemic values but debate whether the balance of non-epistemic reasons favours varying evidential standards versus maintaining fixed high evidential standards in assessment, as both sets of standards promote different and important political virtues of advisory institutions. In this paper, I adjudicate the evidential (...) debate by developing a novel argument from justice, drawing on the IPCC’s assessment of climate impacts as a case study. I argue that in assessments marked by background evidential inequality, maintaining fixed high evidential standards results in an unequal distribution of ‘epistemic power’ among stakeholders, producing a ‘powerful assessment’ for the data-rich (a high rate of findings) and a ‘weak assessment’ for the data-poor (a low rate of findings). Where such inequalities of epistemic power disadvantage those in data-poor regions with respect to fundamental interests, such as basic human rights, we have decisive reasons of justice to vary evidential standards.
    1 citation
  5. AI-Driven Synthetic Data Generation for Financial Product Development: Accelerating Innovation in Banking and Fintech through Realistic Data Simulation. Debasish Paul Rajalakshmi Soundarapandiyan, Praveen Sivathapandi - 2022 - Journal of Artificial Intelligence Research and Applications 2 (2):261-303.
    The rapid evolution of the financial sector, particularly in banking and fintech, necessitates continuous innovation in financial product development and testing. However, challenges such as data privacy, regulatory compliance, and the limited availability of diverse datasets often hinder the effective development and deployment of new products. This research investigates the transformative potential of AI-driven synthetic data generation as a solution for accelerating innovation in financial product development. Synthetic data, generated through advanced AI techniques such as Generative Adversarial (...)
  6. No wisdom in the crowd: genome annotation at the time of big data - current status and future prospects. Antoine Danchin - 2018 - Microbial Biotechnology 11 (4):588-605.
    Science and engineering rely on the accumulation and dissemination of knowledge to make discoveries and create new designs. Discovery-driven genome research rests on knowledge passed on via gene annotations. In response to the deluge of sequencing big data, standard annotation practice employs automated procedures that rely on majority rules. We argue this hinders progress through the generation and propagation of errors, leading investigators into blind alleys. More subtly, this inductive process discourages the discovery of novelty, which remains essential in (...)
    2 citations
  7. “Who Should I Trust with My Data?” Ethical and Legal Challenges for Innovation in New Decentralized Data Management Technologies. Haleh Asgarinia, Andrés Chomczyk Penedo, Beatriz Esteves & Dave Lewis - 2023 - Information (Switzerland) 14 (7):1-17.
    News about personal data breaches or data abusive practices, such as Cambridge Analytica, has questioned the trustworthiness of certain actors in the control of personal data. Innovations in the field of personal information management systems to address this issue have regained traction in recent years, also coinciding with the emergence of new decentralized technologies. However, only with ethically and legally responsible developments will the mistakes of the past be avoided. This contribution explores how current data management (...)
    1 citation
  8. Critical Provocations for Synthetic Data. Daniel Susser & Jeremy Seeman - 2024 - Surveillance and Society 22 (4):453-459.
    Training artificial intelligence (AI) systems requires vast quantities of data, and AI developers face a variety of barriers to accessing the information they need. Synthetic data has captured researchers’ and industry’s imagination as a potential solution to this problem. While some of the enthusiasm for synthetic data may be warranted, in this short paper we offer critical counterweight to simplistic narratives that position synthetic data as a cost-free solution to every data-access challenge—provocations highlighting ethical, political, (...)
  9. Is semantic information meaningful data? Luciano Floridi - 2005 - Philosophy and Phenomenological Research 70 (2):351-370.
    There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not (...)
    103 citations
  10. From data to semantic information. Luciano Floridi - 2003 - Entropy 5:125–145.
    There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene (...)
    3 citations
  11. Critical remarks on current practices of data article publishing: Issues, challenges, and recommendations. Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - 2024 - Data Science and Informetrics 4 (2):1-14.
    The contribution of the data paper publishing paradigm to the knowledge generation and validation processes is becoming substantial and pivotal. In this paper, through the information-processing perspective of Mindsponge Theory, we discuss how the data article publishing system serves as a filtering mechanism for quality control of the increasingly chaotic datasphere. The overemphasis on machine-actionality and technical standards presents some shortcomings and limitations of the data article publishing system, such as the lack of consideration of humanistic (...)
  12. Are publicly available (personal) data “up for grabs”? Three privacy arguments. Elisa Orrù - 2024 - In Paul De Hert, Hideyuki Matsumi, Dara Hallinan, Diana Dimitrova & Eleni Kosta, Data Protection and Privacy, Volume 16: Ideas That Drive Our Digital World. London: Hart. pp. 105-123.
    The re-use of publicly available (personal) data for originally unanticipated purposes has become common practice. Without such secondary uses, the development of many AI systems like large language models (LLMs) and ChatGPT would not even have been possible. This chapter addresses the ethical implications of such secondary processing, with a particular focus on data protection and privacy issues. Legal and ethical evaluations of secondary processing of publicly available personal data diverge considerably both among scholars and the general (...)
  13. CIDO, a community-based ontology for coronavirus disease knowledge and data integration, sharing, and analysis. Oliver He, John Beverley, Gilbert S. Omenn, Barry Smith, Brian Athey, Luonan Chen, Xiaolin Yang, Junguk Hur, Hsin-hui Huang, Anthony Huffman, Yingtong Liu, Yang Wang, Edison Ong & Hong Yu - 2020 - Scientific Data 181 (7):5.
    Ontologies, as the term is used in informatics, are structured vocabularies comprised of human- and computer-interpretable terms and relations that represent entities and relationships. Within informatics fields, ontologies play an important role in knowledge and data standardization, representation, integration, sharing and analysis. They have also become a foundation of artificial intelligence (AI) research. In what follows, we outline the Coronavirus Infectious Disease Ontology (CIDO), which covers multiple areas in the domain of coronavirus diseases, including etiology, transmission, epidemiology, pathogenesis, (...)
    1 citation
  14. SNOMED CT standard ontology based on the ontology for general medical science. Shaker El-Sappagh, Francesco Franda, Ali Farman & Kyung-Sup Kwak - 2018 - BMC Medical Informatics and Decision Making 76 (18):1-19.
    Background: Systematized Nomenclature of Medicine—Clinical Terms (SNOMED CT, hereafter abbreviated SCT) is a comprehensive medical terminology used for standardizing the storage, retrieval, and exchange of electronic health data. Some efforts have been made to capture the contents of SCT as Web Ontology Language (OWL), but these efforts have been hampered by the size and complexity of SCT. Method: Our proposal here is to develop an upper-level ontology and to use it as the basis for defining the terms in (...)
    1 citation
  15. Intermediate Role of Operations Standard in the Relationship between the Focus on Benefiting Students and Students Satisfaction in Palestinian Universities. Suliman A. El Talla, Mazen J. Al Shobaki, Samy S. Abu-Naser & Ahmed M. A. FarajAllah - 2019 - International Journal of Academic Multidisciplinary Research (IJAMR) 3 (5):86-100.
    The study aimed to identify the intermediate role of the standard of operations in the relationship between the focus on students and beneficiaries in achieving satisfaction of students in Palestinian universities. The study used the analytical descriptive method. The study was conducted on university leadership in Al-Azhar, Islamic and Al-Aqsa Universities. The study sample consisted of (200) individuals, 182 of whom responded, and the questionnaire was used in collecting the data. The results of the study were as follows: - (...)
  16. Advanced AI Algorithms for Automating Data Preprocessing in Healthcare: Optimizing Data Quality and Reducing Processing Time. Muthukrishnan Muthusubramanian Praveen Sivathapandi, Prabhu Krishnaswamy - 2022 - Journal of Science and Technology (Jst) 3 (4):126-167.
    This research paper presents an in-depth analysis of advanced artificial intelligence (AI) algorithms designed to automate data preprocessing in the healthcare sector. The automation of data preprocessing is crucial due to the overwhelming volume, diversity, and complexity of healthcare data, which includes medical records, diagnostic imaging, sensor data from medical devices, genomic data, and other heterogeneous sources. These datasets often exhibit various inconsistencies such as missing values, noise, outliers, and redundant or irrelevant information that necessitate (...)
  17. Are Different Standards Warranted to Evaluate Psi? George Williams - 2016 - Journal of Parapsychology 79 (2):186-202.
    Throughout the debate on psi, skeptics have almost universally insisted on different standards for evaluating the evidence, claiming that psi represents a radical departure from our current scientific understanding. Thus, there is considerable ambiguity about what standard of evaluation psi must meet. Little attention has been paid to the possible harm to the integrity of scientific investigation from this resulting inconsistency in testing standards. Some have proposed using a Bayesian framework as an improvement on this dilemma in order (...)
  18. Ethical Standards in Higher Education. Eutychus Gichuru - 2023 - KIU Journal of Education 3 (2):98-114.
    A study was conducted regarding ways in which higher education institutions can improve ethics. Theoretical frameworks used included: Virtue ethics, deontological and environmental ethics theories. The total sampled written texts were 94. Non-probability sampling was used. The type that was used was online convenience sampling through web scraping. Philosophical assumption that guided this study was interpretivism and the approach was Qualitative. Case study was used as a design and content analysis as a method of data analysis. Some of the (...)
  19. Would you mind being watched by machines? Privacy concerns in data mining. Vincent C. Müller - 2009 - AI and Society 23 (4):529-544.
    "Data mining is not an invasion of privacy because access to data is only by machines, not by people": this is the argument that is investigated here. The current importance of this problem is developed in a case study of data mining in the USA for counterterrorism and other surveillance purposes. After a clarification of the relevant nature of privacy, it is argued that access by machines cannot warrant the access to further information, since the analysis will (...)
    1 citation
  20. Horizontal Integration of Warfighter Intelligence Data: A Shared Semantic Resource for the Intelligence Community. Barry Smith, Tatiana Malyuta, William S. Mandrick, Chia Fu, Kesny Parent & Milan Patel - 2012 - In Proceedings of the Conference on Semantic Technology in Intelligence, Defense and Security (STIDS), CEUR. pp. 1-8.
    We describe a strategy that is being used for the horizontal integration of warfighter intelligence data within the framework of the US Army’s Distributed Common Ground System Standard Cloud (DSC) initiative. The strategy rests on the development of a set of ontologies that are being incrementally applied to bring about what we call the ‘semantic enhancement’ of data models used within each intelligence discipline. We show how the strategy can help to overcome familiar tendencies to stovepiping of intelligence (...)
    7 citations
  21. Building Scalable Data Warehouses for Financial Analytics in Large Enterprises. Vijayan Naveen Edapurath - 2024 - International Journal of Innovative Research and Creative Technology 10 (3):1-10.
    In today's digital era, large enterprises face the daunting task of managing and analyzing vast volumes of financial data to inform strategic decision-making and maintain a competitive edge. Traditional data warehousing solutions often fall short in addressing the scale, complexity, and performance demands of modern financial analytics. This paper explores the architectural principles, technological strategies, and best practices essential for building scalable data warehouses tailored to the needs of financial analytics in large organizations. It delves into (...) integration techniques, performance optimization methods, security measures, and compliance with regulatory standards. Through in-depth analysis and real-world case studies, the paper provides a comprehensive roadmap for practitioners aiming to design and implement robust, scalable, and secure data warehousing solutions.
  22. Teachers’ Attitudes to the Implementation of Bhutan Professional Standards (BPST) for Teachers in Bhutan. Karma Wangda - 2023 - Universal Journal of Educational Research 2 (3):268-280.
    Implementation of the Bhutan Professional Standards for Teachers in Bhutan is a key impetus for teachers across the country to enhance teacher competency. Studies on teachers’ attitudes towards professional standards show positive results, as there was a significant correlation between teachers’ competencies, learners’ academic achievement, and the quality of education. However, the Bhutan Professional Standards for Teachers are relatively new and few studies exist relative to Bhutan. The study on Teachers’ Attitudes to the Implementation of Bhutan (...)
    1 citation
  23. Ontology-based knowledge representation of experiment metadata in biological data mining. Scheuermann Richard, Kong Megan, Dahlke Carl, Cai Jennifer, Lee Jamie, Qian Yu, Squires Burke, Dunn Patrick, Wiser Jeff, Herb Hagler, Barry Smith & David Karp - 2009 - In Chen Jake & Lonardi Stefano, Biological Data Mining. Chapman Hall / Taylor and Francis. pp. 529-559.
    According to the PubMed resource from the U.S. National Library of Medicine, over 750,000 scientific articles have been published in the ~5000 biomedical journals worldwide in the year 2007 alone. The vast majority of these publications include results from hypothesis-driven experimentation in overlapping biomedical research domains. Unfortunately, the sheer volume of information being generated by the biomedical research enterprise has made it virtually impossible for investigators to stay aware of the latest findings in their domain of interest, let alone to (...)
  24. Knowledge Attributions and Relevant Epistemic Standards. Dan Zeman - 2010 - In François Récanati, Isidora Stojanovic & Neftalí Villanueva, Context Dependence, Perspective and Relativity. Mouton de Gruyter.
    The paper is concerned with the semantics of knowledge attributions (K-claims, for short) and proposes a position, differing from extant views on the market, on which K-claims are context-sensitive. First I lay down the data a semantic theory for K-claims needs to explain. Next I present and assess three views purporting to give the semantics for K-claims: contextualism, subject-sensitive invariantism and relativism. All three views are found wanting with respect to their accounting for the data. I then propose (...)
    5 citations
  25. The Availability of the Resource Standard and Partnership as One of the Possibilities of Excellence in Palestinian Universities According to the European Model. Suliman A. El Talla, Ahmed M. A. FarajAllah, Samy S. Abu-Naser & Mazen J. Al Shobaki - 2018 - International Journal of Academic Multidisciplinary Research (IJAMR) 2 (11):31-40.
    The study aimed to identify the availability of the resource and partnership standard as one of the possibilities of excellence in Palestinian universities according to the European model. The study used the analytical descriptive method. The study was conducted on the university leadership at Al - Azhar and Islamic Universities, where the study population consisted of (282) individuals. The study sample consisted of (135) individuals, (119) of them responded, and the questionnaire was used in collecting the data. The study (...)
  26. Definiteness in Tunisian Arabizi: Some Data from Statistical Approaches. Elisa Gugliotta, Angelapia Massaro, Giuliano Mion & Marco Dinarelli - 2024 - Romano-Arabica 23:49-76.
    We present a statistical analysis of the realization of definiteness in Tunisian Arabic (TA) texts written in Arabizi, a hybrid system reflecting some features of TA phonetics (assimilation), but also showing orthographic features, as the use of arithmographs. In §1, we give an overview of definiteness in TA from a semantic and syntactic point of view. In §2 we outline a typology of definite articles and show that TA normally marks definiteness with articles or similar devices, but also presents zero-markings (...)
  27. Challenges and recommendations for wearable devices in digital health: Data quality, interoperability, health equity, fairness. Stefano Canali, Viola Schiaffonati & Andrea Aliverti - 2022 - PLOS Digital Health 1 (10):e0000104.
    Wearable devices are increasingly present in the health context, as tools for biomedical research and clinical care. In this context, wearables are considered key tools for a more digital, personalised, preventive medicine. At the same time, wearables have also been associated with issues and risks, such as those connected to privacy and data sharing. Yet, discussions in the literature have mostly focused on either technical or ethical considerations, framing these as largely separate areas of discussion, and the contribution of (...)
    3 citations
  28. Polarization and Belief Dynamics in the Black and White Communities: An Agent-Based Network Model from the Data. Patrick Grim, Stephen B. Thomas, Stephen Fisher, Christopher Reade, Daniel J. Singer, Mary A. Garza, Craig S. Fryer & Jamie Chatman - 2012 - In Christoph Adami, David M. Bryson, Charles Offria & Robert T. Pennock, Artificial Life 13. MIT Press.
    Public health care interventions—regarding vaccination, obesity, and HIV, for example—standardly take the form of information dissemination across a community. But information networks can vary importantly between different ethnic communities, as can levels of trust in information from different sources. We use data from the Greater Pittsburgh Random Household Health Survey to construct models of information networks for White and Black communities--models which reflect the degree of information contact between individuals, with degrees of trust in information from various sources correlated (...)
    1 citation
  29. Methodology for semantic enhancement of intelligence data. Barry Smith, Tatiana Malyuta & William Mandrick - 2013 - CUBRC Report.
    What follows is a contribution to the horizontal integration of warfighter intelligence data as defined in Chairman of the Joint Chiefs of Staff Instruction J2 CJCSI 3340.02AL: Horizontally integrating warfighter intelligence data improves the consumers’ production, analysis and dissemination capabilities. HI requires access (including discovery, search, retrieval, and display) to intelligence data among the warfighters and other producers and consumers via standardized services and architectures. These consumers include, but are not limited to, the combatant commands, Services, (...)
  30. Gestalt Models for Data Decomposition and Functional Architecture in Visual Neuroscience. Carmelo Calì - 2013 - Gestalt Theory 35 (3).
    Attempts to introduce Gestalt theory into the realm of visual neuroscience are discussed on both theoretical and experimental grounds. To define the framework in which these proposals can be defended, this paper outlines the characteristics of a standard model, which qualifies as a received view in the visual neurosciences, and of the research into natural images statistics. The objections to the standard model and the main questions of the natural images research are presented. On these grounds, this paper defends the (...)
    1 citation
  31. Tests and Problems of the Standard Model in Cosmology. Martín López-Corredoira - 2017 - Foundations of Physics 47 (6):711-768.
    The main foundations of the standard ΛCDM model of cosmology are that: the redshifts of the galaxies are due to the expansion of the Universe plus peculiar motions; the cosmic microwave background radiation and its anisotropies derive from the high energy primordial Universe when matter and radiation became decoupled; the abundance pattern of the light elements is explained in terms of primordial nucleosynthesis; and the formation and evolution of galaxies can be explained only in terms of gravitation within an inflation (...)
    2 citations
  32. Improving Generative AI Models for Secure and Private Data Synthesis. Sharma Sidharth - 2022 - Journal of Science Technology and Research (JSTAR) 3 (1):210-215.
    Generative Adversarial Networks (GANs) have demonstrated significant potential in generating synthetic data for various applications, including those involving sensitive information like healthcare and finance. However, two major issues arise when GANs are applied to sensitive datasets: (i) the model may memorize training samples, compromising the privacy of individuals, especially when the data includes personally identifiable information (PII), and (ii) there is a lack of control over the specificity of the generated samples, which limits their utility for tailored use-cases. (...)
  33. Formalising the 'No Information Without Data-Representation' Principle. Patrick Allo - 2008 - In P. Brey, A. Briggle & K. Waelbers, Current Issues in Computing and Philosophy. IOS Press. pp. 79.
    One of the basic principles of the general definition of information is its rejection of dataless information, which is reflected in its endorsement of an ontological neutrality. In general, this principle states that “there can be no information without physical implementation” (Floridi (2005)). Though this is standardly considered a commonsensical assumption, many questions arise with regard to its generalised application. In this paper a combined logic for data and information is elaborated, and specifically used to investigate the consequences of (...)
    2 citations
  34. Biomedical Terminologies and Ontologies: Enabling Biomedical Semantic Interoperability and Standards in Europe. Bernard de Bono, Mathias Brochhausen, Sybo Dijkstra, Dipak Kalra, Stephan Keifer & Barry Smith - 2009 - In European Large-Scale Action on Electronic Health.
    In the management of biomedical data, vocabularies such as ontologies and terminologies (O/Ts) are used for (i) domain knowledge representation and (ii) interoperability. The knowledge representation role supports the automated reasoning on, and analysis of, data annotated with O/Ts. At an interoperability level, the use of a communal vocabulary standard for a particular domain is essential for large data repositories and information management systems to communicate consistently with one other. Consequently, the interoperability benefit of selecting a particular (...)
  35. Moral Implications of Data-Mining, Key-word Searches, and Targeted Electronic Surveillance. Michael Skerker - 2015 - In Bradley J. Strawser, Fritz Allhoff & Adam Henschke, Binary Bullets.
    This chapter addresses the morality of two types of national security electronic surveillance (SIGINT) programs: the analysis of communication “metadata” and dragnet searches for keywords in electronic communication. The chapter develops a standard for assessing coercive government action based on respect for the autonomy of inhabitants of liberal states and argues that both types of SIGINT can potentially meet this standard. That said, the collection of metadata creates opportunities for abuse of power, and so judgments about the trustworthiness and competence (...)
  36. A noncontextualist account of contextualist linguistic data. Mylan Engel - 2005 - Acta Analytica 20 (2):56-79.
    The paper takes as its starting point the observation that people can be led to retract knowledge claims when presented with previously ignored error possibilities, but offers a noncontextualist explanation of the data. Fallibilist epistemologies are committed to the existence of two kinds of Kp-falsifying contingencies: (i) Non-Ignorable contingencies [NI-contingencies] and (ii) Properly-Ignorable contingencies [PI-contingencies]. For S to know that p, S must be in an epistemic position to rule out all NI-contingencies, but she need not be able (...)
    2 citations
  37. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis. Jie Zheng, Marcelline R. Harris, Anna Maria Masci, Lin Yu, Alfred Hero, Barry Smith & Yongqun He - 2016 - Journal of Biomedical Semantics 7 (53).
    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. Terms in OBCS, including ‘data collection’, ‘data transformation in statistics’, ‘data (...)
  38. The norm of assertion: Empirical data.Markus Kneer - 2018 - Cognition 177 (C):165-171.
    Assertions are speech acts by means of which we express beliefs. As such they are at the heart of our linguistic and social practices. Recent research has focused extensively on the question whether the speech act of assertion is governed by norms, and if so, under what conditions it is acceptable to make an assertion. Standard theories propose, for instance, that one should only assert that p if one knows that p (the knowledge account), or that one should only assert (...)
    Bookmark   31 citations  
  39. Philosophical Methodology: From Data to Theory. [REVIEW]Ethan Landes - forthcoming - Philosophical Quarterly.
    It is impossible to study philosophical methodology without being struck by the state of absolute chaos of the field’s methodological practices, methodological norms, and metaphilosophical beliefs. Not only are the methods of formal epistemology nothing like the methods of aesthetics, but even within specific debates and subfields, there are often significant disagreements about standards of proof, to say nothing about disagreements about the ultimate nature of the debate. The question facing metaphilosophers is whether this chaos is a feature or (...)
  40. Enhancing Network Security in Healthcare Institutions: Addressing Connectivity and Data Protection Challenges.Bellamkonda Srikanth - 2019 - International Journal of Innovative Research in Computer and Communication Engineering 7 (2):1365-1375.
    The rapid adoption of digital technologies in healthcare has revolutionized patient care, enabling seamless data sharing, remote consultations, and enhanced medical record management. However, this digital transformation has also introduced significant challenges to network security and data protection. Healthcare institutions face a dual challenge: ensuring uninterrupted connectivity for critical operations and safeguarding sensitive patient information from cyber threats. These challenges are exacerbated by the increased use of interconnected devices, electronic health records (EHRs), and cloud-based solutions, which, while enhancing (...)
  41. Developments and Uses of Generative Artificial Intelligence and Present Experimental Data on the Impact on Productivity Applying Artificial Intelligence that is Generative.Tambi Varun Kumar - 2024 - International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering 12 (10):2382-2388.
    In the context of mid-level professional writing jobs, we examine the productivity effects of a generative artificial intelligence technology, namely the assistive chatbot ChatGPT. In a preregistered online experiment, we assigned occupation-specific, incentive-based writing tasks to 444 college-educated professionals and randomly exposed half of them to ChatGPT. Our results show that ChatGPT considerably increases average productivity: output quality improves by 0.4 standard deviations and task completion time drops by 0.8 standard deviations. By compressing the production distribution, ChatGPT also lessens worker inequality, (...)
    Bookmark   30 citations  
  42. The Instrumentarian Power of Artificial Intelligence in Data-Driven Fascist Regimes.Anaïs Nony - 2024 - la Furia Umana 1 (1):1-16.
    AI-powered technology can both promote accuracy and hide the standards of measurement and circulation of information. It can also produce models that are opaque and hard to access. As such, the new paradigm of AI asks us to ponder the societal values and sets of priorities we want to promote, especially as these technologies are further deployed in times of warfare. The systemic tracking of people’s lives and the opaqueness of the models designate a new paradigm in the formation of (...)
  43. Improving Generative AI Models for Secure and Private Data Synthesis.Sharma Sidharth - 2015 - International Journal of Engineering Innovations and Management Strategies 1 (1):1-4.
    Generative Adversarial Networks (GANs) have demonstrated significant potential in generating synthetic data for various applications, including those involving sensitive information like healthcare and finance. However, two major issues arise when GANs are applied to sensitive datasets: (i) the model may memorize training samples, compromising the privacy of individuals, especially when the data includes personally identifiable information (PII), and (ii) there is a lack of control over the specificity of the generated samples, which limits their utility for tailored use cases. (...)
  44. Peer Review system: A Golden standard for publications process.Shamima Parvin Lasker - 2018 - Bangladesh Journal of Bioethics 9 (1):13-23.
    The peer review process helps in evaluating and validating research that is published in journals. The U.S. Office of Research Integrity reported that data fraudulence was involved in 94% of the misconduct cases found in 228 identified articles between 1994 and 2012. If fraud in published articles is as high as reported, the question arises: were these articles peer reviewed? Another report said that the reviewers failed to detect 16 fabricated articles by Jan Hendrik Schön. (...)
  45. Improving the Quality and Utility of Electronic Health Record Data through Ontologies.Asiyah Yu Lin, Sivaram Arabandi, Thomas Beale, William Duncan, Hicks D., Hogan Amanda, R. William, Mark Jensen, Ross Koppel, Catalina Martínez-Costa, Øystein Nytrø, Jihad S. Obeid, Jose Parente de Oliveira, Alan Ruttenberg, Selja Seppälä, Barry Smith, Dagobert Soergel, Jie Zheng & Stefan Schulz - 2023 - Standards 3 (3):316–340.
    The translational research community, in general, and the Clinical and Translational Science Awards (CTSA) community, in particular, share the vision of repurposing EHRs for research that will improve the quality of clinical practice. Many members of these communities are also aware that electronic health records (EHRs) suffer limitations of data becoming poorly structured, biased, and unusable out of original context. This creates obstacles to the continuity of care, utility, quality improvement, and translational research. Analogous limitations to sharing objective (...) in other areas of the natural sciences have been successfully overcome by developing and using common ontologies. This White Paper presents the authors’ rationale for the use of ontologies with computable semantics for the improvement of clinical data quality and EHR usability formulated for researchers with a stake in clinical and translational science and who are advocates for the use of information technology in medicine but at the same time are concerned by current major shortfalls. This White Paper outlines pitfalls, opportunities, and solutions and recommends increased investment in research and development of ontologies with computable semantics for a new generation of EHRs.
  46. ISARIC-COVID-19 dataset: A Prospective, Standardized, Global Dataset of Patients Hospitalized with COVID-19.Isaric Clinical Characterization Group - 2022 - Scientific Data 9 (1):454.
    The International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC) COVID-19 dataset is one of the largest international databases of prospectively collected clinical data on people hospitalized with COVID-19. This dataset was compiled during the COVID-19 pandemic by a network of hospitals that collect data using the ISARIC-World Health Organization Clinical Characterization Protocol and data tools. The database includes data from more than 705,000 patients, collected in more than 60 countries and 1,500 centres worldwide. Patient (...) are available from acute hospital admissions with COVID-19 and outpatient follow-ups. The data include signs and symptoms, pre-existing comorbidities, vital signs, chronic and acute treatments, complications, dates of hospitalization and discharge, mortality, viral strains, vaccination status, and other data. Here, we present the dataset characteristics, explain its architecture and how to gain access, and provide tools to facilitate its use.
  47. (6 other versions)Ontology (science).Barry Smith - 2001 - In Barry Smith & Christopher Welty, Formal Ontology in Information Systems (FOIS). ACM Press. pp. 21-35.
    Increasingly, in data-intensive areas of the life sciences, experimental results are being described in algorithmically useful ways with the help of ontologies. Such ontologies are authored and maintained by scientists to support the retrieval, integration and analysis of their data. The proposition to be defended here is that ontologies of this type – the Gene Ontology (GO) being the most conspicuous example – are a part of science. Initial evidence for the truth of this proposition (which some will (...)
    Bookmark   9 citations  
  48. Biomedical Ontologies.Barry Smith - 2023 - In Peter L. Elkin, Terminology, Ontology and their Implementations. Cham, Switzerland: Springer Nature. pp. 125-169.
    We begin at the beginning, with an outline of Aristotle’s views on ontology and with a discussion of the influence of these views on Linnaeus. We move from there to consider the data standardization initiatives launched in the 19th century, and then turn to investigate how the idea of computational ontologies developed in the AI and knowledge representation communities in the closing decades of the 20th century. We show how aspects of this idea, particularly those relating to the use (...)
    Bookmark   3 citations  
  49. The ImmPort Antibody Ontology.William Duncan, Travis Allen, Jonathan Bona, Olivia Helfer, Barry Smith, Alan Ruttenberg & Alexander D. Diehl - 2016 - Proceedings of the International Conference on Biological Ontology 1747.
    Monoclonal antibodies are essential biomedical research and clinical reagents that are produced by companies and research laboratories. The NIAID ImmPort (Immunology Database and Analysis Portal) resource provides a long-term, sustainable data warehouse for immunological data generated by NIAID, DAIT and DMID funded investigators for data archiving and re-use. A variety of immunological data is generated using techniques that rely upon monoclonal antibody reagents, including flow cytometry, immunofluorescence, and ELISA. In order to facilitate querying, integration, and reuse (...)
    Bookmark   2 citations  
  50. A new framework for host-pathogen interaction research.Hong Yu, Li Li, Anthony Huffman, John Beverley, Junguk Hur, Eric Merrell, Hsin-hui Huang, Yang Wang, Yingtong Liu, Edison Ong, Liang Cheng, Tao Zeng, Jingsong Zhang, Pengpai Li, Zhiping Liu, Zhigang Wang, Xiangyan Zhang, Xianwei Ye, Samuel K. Handelman, Jonathan Sexton, Kathryn Eaton, Gerry Higgins, Gilbert S. Omenn, Brian Athey, Barry Smith, Luonan Chen & Yongqun He - 2022 - Frontiers in Immunology 13.
    COVID-19 often manifests with different outcomes in different patients, highlighting the complexity of the host-pathogen interactions involved in manifestations of the disease at the molecular and cellular levels. In this paper, we propose a set of postulates and a framework for systematically understanding complex molecular host-pathogen interaction networks. Specifically, we first propose four host-pathogen interaction (HPI) postulates as the basis for understanding molecular and cellular host-pathogen interactions and their relations to disease outcomes. These four postulates cover the evolutionary dispositions involved (...)
1 — 50 / 980