We begin at the beginning, with an outline of Aristotle’s views on ontology and with a discussion of the influence of these views on Linnaeus. We move from there to consider the data standardization initiatives launched in the 19th century, and then turn to investigate how the idea of computational ontologies developed in the AI and knowledge representation communities in the closing decades of the 20th century. We show how aspects of this idea, particularly those relating to the use of the term 'concept' in ontology development, influenced SNOMED CT and other medical terminologies. Against this background we then show how the Foundational Model of Anatomy, the Gene Ontology, Basic Formal Ontology and other OBO Foundry ontologies came into existence and discuss their role in the development of contemporary biomedical informatics.
Biomedical ontologies exist to serve integration of clinical and experimental data, and it is critical to their success that they be put to widespread use in the annotation of data. How, then, can ontologies achieve the sort of user-friendliness, reliability, cost-effectiveness, and breadth of coverage that is necessary to ensure extensive usage? Methods: Our focus here is on two different sets of answers to these questions that have been proposed, on the one hand in medicine, by the SNOMED CT community, and on the other hand in biology, by the OBO Foundry. We address more specifically the issue as to how adherence to certain development principles can advance the usability and effectiveness of an ontology or terminology resource, for example by allowing more accurate maintenance, more reliable application, and more efficient interoperation with other ontologies and information resources. Results: SNOMED CT and the OBO Foundry differ considerably in their general approach. Nevertheless, a general trend towards more formal rigor and cross-domain interoperability can be seen in both and we argue that this trend should be accepted by all similar initiatives in the future. Conclusions: Future efforts in ontology development have to address the need for harmonization and integration of ontologies across disciplinary borders, and for this, coherent formalization of ontologies is a prerequisite.
To enhance the treatment of relations in biomedical ontologies we advance a methodology for providing consistent and unambiguous formal definitions of the relational expressions used in such ontologies in a way designed to assist developers and users in avoiding errors in coding and annotation. The resulting Relation Ontology can promote interoperability of ontologies and support new types of automated reasoning about the spatial and temporal dimensions of biological and medical phenomena.
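To give a flavor of the style of definition at issue (a sketch only; the exact formulations are those given in the Relation Ontology itself), a class-level part_of relation can be defined in terms of the instance-level relations instance_of and part_of:

\[
C \;\mathit{part\_of}\; C_{1} \;=_{\mathrm{def}}\; \forall c, t\,\bigl(\mathit{instance\_of}(c, C, t) \rightarrow \exists c_{1}\,(\mathit{instance\_of}(c_{1}, C_{1}, t) \wedge \mathit{part\_of}(c, c_{1}, t))\bigr)
\]

Read: every instance of C is, at every time at which it exists, part of some instance of C1. Definitions of this kind are what make the class-level relational expressions unambiguous enough to support automated reasoning.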
While representation learning techniques have shown great promise in application to a number of different NLP tasks, they have had little impact on the problem of ontology matching. Unlike past work that has focused on feature engineering, we present a novel representation learning approach that is tailored to the ontology matching task. Our approach is based on embedding ontological terms in a high-dimensional Euclidean space. This embedding is derived on the basis of a novel phrase retrofitting strategy through which semantic similarity information becomes inscribed onto fields of pre-trained word vectors. The resulting framework also incorporates a novel outlier detection mechanism based on a denoising autoencoder that is shown to improve performance. An ontology matching system derived using the proposed framework achieved an F-score of 94% on an alignment scenario involving the Adult Mouse Anatomical Dictionary and the Foundational Model of Anatomy ontology (FMA) as targets. This compares favorably with the best performing systems on the Ontology Alignment Evaluation Initiative anatomy challenge. We performed additional experiments on aligning FMA to NCI Thesaurus and to SNOMED CT based on a reference alignment extracted from the UMLS Metathesaurus. Our system obtained overall F-scores of 93.2% and 89.2% for these experiments, thus achieving state-of-the-art results.
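The core matching step can be sketched as follows. This is a minimal illustration only: the phrase-retrofitting and denoising-autoencoder components described above are omitted, and the term vectors here are random placeholders rather than retrofitted embeddings.

```python
# Minimal sketch of embedding-based ontology matching: terms from two
# ontologies are compared by cosine similarity of their vector
# representations, and mutual best matches above a threshold are kept.
# In a real system the vectors would come from the retrofitting step.
import numpy as np

def cosine_matrix(src: np.ndarray, tgt: np.ndarray) -> np.ndarray:
    src_n = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt_n = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    return src_n @ tgt_n.T

def match_terms(src_terms, src_vecs, tgt_terms, tgt_vecs, threshold=0.8):
    sims = cosine_matrix(src_vecs, tgt_vecs)
    matches = []
    for i, term in enumerate(src_terms):
        j = int(np.argmax(sims[i]))  # best target candidate for this source term
        mutual = int(np.argmax(sims[:, j])) == i  # keep only mutual best matches
        if mutual and sims[i, j] >= threshold:
            matches.append((term, tgt_terms[j], float(sims[i, j])))
    return matches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src_terms = ["lung", "heart", "femur"]
    tgt_terms = ["Heart", "Lung", "Thigh bone"]
    src_vecs = rng.normal(size=(3, 50))   # placeholder embeddings
    tgt_vecs = rng.normal(size=(3, 50))
    print(match_terms(src_terms, src_vecs, tgt_terms, tgt_vecs, threshold=-1.0))
```

With real retrofitted vectors, the threshold and the mutual-best-match filter trade precision against recall before any outlier-detection step is applied.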
The National Center for Biomedical Ontology is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists, funded by the National Institutes of Health (NIH) Roadmap, to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are (1) to help unify the divergent and isolated efforts in ontology development by promoting high-quality, open-source, standards-based tools to create, manage, and use ontologies, (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data, (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs), and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists to work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, hypothesis generation and testing, and understanding of human disease.
The National Center for Biomedical Ontology is now in its seventh year. The goals of this National Center for Biomedical Computing are to: create and maintain a repository of biomedical ontologies and terminologies; build tools and web services to enable the use of ontologies and terminologies in clinical and translational research; educate its trainees and the scientific community broadly about biomedical ontology and ontology-based technology and best practices; and collaborate with a variety of groups who develop and use ontologies and terminologies in biomedicine. The centerpiece of the National Center for Biomedical Ontology is a web-based resource known as BioPortal. BioPortal makes available for research in computationally useful forms more than 270 of the world's biomedical ontologies and terminologies, and supports a wide range of web services that enable investigators to use the ontologies to annotate and retrieve data, to generate value sets and special-purpose lexicons, and to perform advanced analytics on a wide range of biomedical data.
Knowledge-making practices in biology are being strongly affected by the availability of data on an unprecedented scale, the insistence on systemic approaches and growing reliance on bioinformatics and digital infrastructures. What role does theory play within data-intensive science, and what does that tell us about scientific theories in general? To answer these questions, I focus on Open Biomedical Ontologies, digital classification tools that have become crucial to sharing results across research contexts in the biological and biomedical sciences, and argue that they constitute an example of classificatory theory. This form of theorizing emerges from classification practices in conjunction with experimental know-how and expresses the knowledge underpinning the analysis and interpretation of data disseminated online.
An accurate classification of bacteria is essential for the proper identification of patient infections and subsequent treatment decisions. Multi-Locus Sequence Typing (MLST) is a genetic technique for bacterial classification. MLST classifications are used to cluster bacteria into clonal complexes. Importantly, clonal complexes can serve as a biological species concept for bacteria, facilitating an otherwise difficult taxonomic classification. In this paper, we argue for the inclusion of terms relating to clonal complexes in biomedical ontologies.
The meeting focused on uses of ontologies, with a special focus on spatial ontologies, in addressing the ever-increasing needs faced by biology and medicine to cope with ever-expanding quantities of data. To provide effective solutions, computers need to integrate data deriving from myriad heterogeneous sources by bringing the data together within a single framework. The meeting brought together leaders in the field of what are called "top-level ontologies" to address this issue, and to establish strategies among leaders in the field of biomedical ontology for the creation of interoperable biomedical ontologies which will serve the goal of useful data integration.
We present a novel methodology for calculating the improvements obtained in successive versions of biomedical ontologies. The theory takes into account changes both in reality itself and in our understanding of this reality. The successful application of the theory rests on the willingness of ontology authors to document changes they make by following a number of simple rules. The theory provides a pathway by which ontology authoring can become a science rather than an art, following principles analogous to those that have fostered the growth of modern evidence-based medicine. Although in this paper we focus on ontologies, the methodology can be generalized to other sorts of terminology-based artifacts, including Electronic Patient Records.
The Foundational Model of Anatomy (FMA) symbolically represents the structural organization of the human body from the macromolecular to the macroscopic levels, with the goal of providing a robust and consistent scheme for classifying anatomical entities that is designed to serve as a reference ontology in biomedical informatics. Here we articulate the need for formally clarifying the is-a and part-of relations in the FMA and similar ontology and terminology systems. We diagnose certain characteristic errors in the treatment of these relations and show how these errors can be avoided through adoption of the formalism we describe. We then illustrate how a consistently applied formal treatment of taxonomy and partonomy can support the alignment of ontologies.
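As an illustration of the kind of formal clarification at issue (a sketch only; the paper's own axioms are more extensive), the taxonomic is_a relation between classes can be defined via instantiation:

\[
A \;\mathit{is\_a}\; B \;=_{\mathrm{def}}\; \forall x, t\,\bigl(\mathit{instance\_of}(x, A, t) \rightarrow \mathit{instance\_of}(x, B, t)\bigr)
\]

One characteristic pitfall, for instance, is conflating parthood with subsumption; once is_a and part_of are given explicit definitions of this sort, such conflations become detectable and correctable.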
The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource providing details on the people, policies, and issues being addressed in association with OBI.
The integration of biomedical terminologies is indispensable to the process of information integration. When terminologies are linked merely through the alignment of their leaf terms, however, differences in context and ontological structure are ignored. Making use of the SNAP and SPAN ontologies, we show how three reference domain ontologies can be integrated at a higher level, through what we shall call the OBR framework (for: Ontology of Biomedical Reality). OBR is designed to facilitate inference across the boundaries of domain ontologies in anatomy, physiology and pathology.
PURPOSE—A substantial fraction of the observations made by clinicians and entered into patient records are expressed by means of negation or by using terms which contain negative qualifiers (as in “absence of pulse” or “surgical procedure not performed”). This seems at first sight to present problems for ontologies, terminologies and data repositories that adhere to a realist view and thus reject any reference to putative non-existing entities. Basic Formal Ontology (BFO) and Referent Tracking (RT) are examples of such paradigms. The purpose of the research here described was to test a proposal to capture negative findings in electronic health record systems based on BFO and RT. METHODS—We analysed a series of negative findings encountered in 748 sentences taken from 41 patient charts. We classified the phenomena described in terms of the various top-level categories and relations defined in BFO, taking into account the role of negation in the corresponding descriptions. We also studied terms from SNOMED-CT containing one or other form of negation. We then explored ways to represent the described phenomena by means of the types of representational units available to realist ontologies such as BFO. RESULTS—We introduced a new family of ‘lacks’ relations into the OBO Relation Ontology. The relation lacks_part, for example, defined in terms of the positive relation part_of, holds between a particular p and a universal U when p has no instance of U as part. Since p and U both exist, assertions involving ‘lacks_part’ and its cognates meet the requirements of positivity. CONCLUSION—By expanding the OBO Relation Ontology, we were able to accommodate nearly all occurrences of negative findings in the sample studied.
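The definition quoted in the RESULTS section can be rendered formally as follows (time indices are suppressed here for readability):

\[
\mathit{lacks\_part}(p, U) \;=_{\mathrm{def}}\; \neg\exists u\,\bigl(\mathit{instance\_of}(u, U) \wedge \mathit{part\_of}(u, p)\bigr)
\]

Since both the particular p and the universal U exist, the assertion makes no reference to non-existent entities, which is what allows it to satisfy the positivity requirement mentioned above.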
In previous work on biomedical ontologies we showed how the provision of formal definitions for relations such as is_a and part_of can support new types of automated reasoning about biomedical phenomena. We here extend this approach to the relation transformation_of, which is characteristic of pathologies.
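Roughly, and only as a sketch of the intended reading (the paper's own definition should be consulted for the exact formulation), transformation_of links a class to the class from which its instances developed:

\[
C \;\mathit{transformation\_of}\; C_{1} \;=_{\mathrm{def}}\; \forall c, t\,\bigl(\mathit{instance\_of}(c, C, t) \rightarrow \exists t'\,(t' < t \wedge \mathit{instance\_of}(c, C_{1}, t') \wedge \neg\mathit{instance\_of}(c, C, t'))\bigr)
\]

The key point is that the same individual persists through the transformation; only the class it instantiates changes, as when a benign lesion becomes malignant.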
Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges, particularly faced by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data.
Current approaches to formal representation in biomedicine are characterized by their focus on either the static or the dynamic aspects of biological reality. We here outline a theory that combines both perspectives and at the same time tackles the by no means trivial issue of their coherent integration. Our position is that a good ontology must be capable of accounting for reality both synchronically (as it exists at a time) and diachronically (as it unfolds through time), but that these are two quite different tasks, whose simultaneous realization is by no means trivial. The paper is structured as follows. We begin by laying out the methodological and philosophical background of our approach. We then summarize the structure and elements of the Basic Formal Ontology on which it rests, in particular the SNAP ontology of objects and the SPAN ontology of processes. Finally, we apply the general framework to the specific domain of biomedicine.
In the era of “big data,” science is increasingly information driven, and the potential for computers to store, manage, and integrate massive amounts of data has given rise to such new disciplinary fields as biomedical informatics. Applied ontology offers a strategy for the organization of scientific information in computer-tractable form, drawing on concepts not only from computer and information science but also from linguistics, logic, and philosophy. This book provides an introduction to the field of applied ontology that is of particular relevance to biomedicine, covering theoretical components of ontologies, best practices for ontology design, and examples of biomedical ontologies in use. After defining an ontology as a representation of the types of entities in a given domain, the book distinguishes between different kinds of ontologies and taxonomies, and shows how applied ontology draws on more traditional ideas from metaphysics. It presents the core features of the Basic Formal Ontology (BFO), now used by over one hundred ontology projects around the world, and offers examples of domain ontologies that utilize BFO. The book also describes Web Ontology Language (OWL), a common framework for Semantic Web technologies. Throughout, the book provides concrete recommendations for the design and construction of domain ontologies.
In the management of biomedical data, vocabularies such as ontologies and terminologies (O/Ts) are used for (i) domain knowledge representation and (ii) interoperability. The knowledge representation role supports the automated reasoning on, and analysis of, data annotated with O/Ts. At an interoperability level, the use of a communal vocabulary standard for a particular domain is essential for large data repositories and information management systems to communicate consistently with one another. Consequently, the interoperability benefit of selecting a particular O/T as a standard for data exchange purposes is often seen by the end-user as a function of the number of applications using that vocabulary (and, by extension, the size of the user base). Furthermore, the adoption of an O/T as an interoperability standard requires confidence in its stability and guaranteed continuity as a resource.
Successful biomedical data mining and information extraction require a complete picture of biological phenomena such as genes, biological processes, and diseases, as these exist on different levels of granularity. To realize this goal, several freely available heterogeneous databases as well as proprietary structured datasets have to be integrated into a single global customizable scheme. We will present a tool to integrate different biological data sources by mapping them to a proprietary biomedical ontology that has been developed for the purposes of making computers understand medical natural language.
The goal of the OBO (Open Biomedical Ontologies) Foundry initiative is to create and maintain an evolving collection of non-overlapping interoperable ontologies that will offer unambiguous representations of the types of entities in biological and biomedical reality. These ontologies are designed to serve non-redundant annotation of data and scientific text. To achieve these ends, the Foundry imposes strict requirements upon the ontologies eligible for inclusion. While these requirements are not met by most existing biomedical terminologies, the latter may nonetheless support the Foundry’s goal of consistent and non-redundant annotation if appropriate mappings of data annotated with their aid can be achieved. To construct such mappings in reliable fashion, however, it is necessary to analyze terminological resources from an ontologically realistic perspective in such a way as to identify the exact import of the ‘concepts’ and associated terms which they contain. We propose a framework for such analysis that is designed to maximize the degree to which legacy terminologies and the data coded with their aid can be successfully used for information-driven clinical and translational research.
The automatic integration of rapidly expanding information resources in the life sciences is one of the most challenging goals facing biomedical research today. Controlled vocabularies, terminologies, and coding systems play an important role in realizing this goal, by making it possible to draw together information from heterogeneous sources – for example pertaining to genes and proteins, drugs and diseases – secure in the knowledge that the same terms will also represent the same entities on all occasions of use. In the naming of genes, proteins, and other molecular structures, considerable efforts are under way to reduce the effects of the different naming conventions which have been spawned by different groups of researchers. Electronic patient records, too, increasingly involve the use of standardized terminologies, and tremendous efforts are currently being devoted to the creation of terminology resources that can meet the needs of a future era of personalized medicine, in which genomic and clinical data can be aligned in such a way that the corresponding information systems become interoperable.
Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component to any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with The Institute for Formal Ontology and Medical Information Sciences (IFOMIS), are working towards developing and implementing just such a theory, combining the open software architecture of L&C’s LinkSuite™ with the philosophical rigor of IFOMIS’s Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology greatly benefits application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this project we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase® are standardized in a framework of first-order logic. In this paper we describe how this standardization has already led to an improvement in the LinKBase® structure that allows for a greater degree of internal coherence than ever before possible. We then show the use of this philosophical standardization for the purpose of mapping external databases to one another, using LinKBase® as translation hub, with a greater degree of success than possible hitherto. We demonstrate how this offers a genuine advance over other application ontologies that have not submitted themselves to the demands of philosophical scrutiny. LinKBase® is one of the world’s largest applications-oriented medical domain ontologies, and BFO is one of the world’s first philosophically driven reference ontologies. The collaboration of the two thus initiates a new phase in the quest to solve the so-called “Tower of Babel” problem.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
As biological and biomedical research increasingly references the environmental context of the biological entities under study, the need for formalisation and standardisation of environment descriptors is growing. The Environment Ontology (ENVO) is a community-led, open project which seeks to provide an ontology for specifying a wide range of environments relevant to multiple life science disciplines and, through an open participation model, to accommodate the terminological requirements of all those needing to annotate data using ontology classes. This paper summarises ENVO’s motivation, content, structure, adoption, and governance approach.
Biomedical terminologies are focused on what is general, Electronic Health Records (EHRs) on what is particular, and it is commonly assumed that the step from the one to the other is unproblematic. We argue that this is not so, and that, if the EHR of the future is to fulfill its promise, then the foundations of both EHR architectures and biomedical terminologies need to be reconceived. We accordingly describe a new framework for the treatment of both generals and particulars in biomedical information systems that is designed: 1) to provide new opportunities for the sharing and management of data within and between healthcare institutions, 2) to facilitate interoperability among different terminology and record systems, and thereby 3) to allow new kinds of reasoning with biomedical data.
Ontology is a burgeoning field, involving researchers from the computer science, philosophy, data and software engineering, logic, linguistics, and terminology domains. Many ontology-related terms with precise meanings in one of these domains have different meanings in others. Our purpose here is to initiate a path towards disambiguation of such terms. We draw primarily on the literature of biomedical informatics, not least because the problems caused by unclear or ambiguous use of terms have been there most thoroughly addressed. We advance a proposal resting on a distinction of three levels too often run together in biomedical ontology research: 1. the level of reality; 2. the level of cognitive representations of this reality; 3. the level of textual and graphical artifacts. We propose a reference terminology for ontology research and development that is designed to serve as a common hub into which the several competing disciplinary terminologies can be mapped. We then justify our terminological choices through a critical treatment of the ‘concept orientation’ in biomedical terminology research.
The Protein Ontology (PRO) web resource provides an integrative framework for protein-centric exploration and enables specific and precise annotation of proteins and protein complexes based on PRO. Functionalities include: browsing, searching and retrieving terms, displaying selected terms in OBO or OWL format, and supporting URIs. In addition, the PRO website offers multiple ways for the user to request, submit, or modify terms and/or annotation. We will demonstrate the use of these tools for protein research and annotation.
We propose a typology of representational artifacts for health care and life sciences domains and associate this typology with different kinds of formal ontology and logic, drawing conclusions as to the strengths and limitations for ontology in a description logics framework. The four types of domain representation we consider are: (i) lexico-semantic representation, (ii) representation of types of entities, (iii) representations of background knowledge, and (iv) representation of individuals. We advocate a clear distinction of the four kinds of representation in order to provide a more rational basis for using ontologies and related artifacts to advance integration of data and enhance interoperability of associated reasoning systems. We highlight the fact that only a minor portion of scientifically relevant facts in a domain such as biomedicine can be adequately represented by formal ontologies as long as the latter are conceived as representations of entity types. In particular, the attempt to encode default or probabilistic knowledge using ontologies so conceived is prone to produce unintended, erroneous models.
Ontologies describe reality in specific domains in ways that can bridge various disciplines and languages. They allow easier access and integration of information that is collected by different groups. Ontologies are currently used in the biomedical sciences, geography, and law. A Biomedical Ethics Ontology would benefit members of ethics committees who deal with protocols and consent forms spanning numerous fields of inquiry. There already exists the Ontology for Biomedical Investigations (OBI); the proposed BMEO would interoperate with OBI, creating a powerful information tool. We define a domain ontology and begin to construct a BMEO, focused on the process of evaluating human research protocols. Finally, we show how our BMEO can have practical applications for ethics committees. This paper describes ongoing research and a strategy for its broader continuation and cooperation.
BioPortal is a Web portal that provides access to a library of biomedical ontologies and terminologies developed in OWL, RDF(S), OBO format, Protégé frames, and Rich Release Format. BioPortal functionality, driven by a service-oriented architecture, includes the ability to browse, search and visualize ontologies (Figure 1). The Web interface also facilitates community-based participation in the evaluation and evolution of ontology content.
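As an illustration of the kind of programmatic access BioPortal's web services support, the sketch below calls the term-search service. It assumes the REST endpoint at data.bioontology.org, a valid API key supplied via a (hypothetical) BIOPORTAL_API_KEY environment variable, and that the response fields retain their current names; consult the BioPortal documentation for the authoritative interface.

```python
# Minimal sketch of querying the BioPortal term-search web service.
# The endpoint layout and response field names are assumptions based on
# the public REST API at https://data.bioontology.org.
import os
import requests

API_KEY = os.environ.get("BIOPORTAL_API_KEY", "")  # hypothetical env var
BASE_URL = "https://data.bioontology.org"

def search_terms(query, ontologies=None):
    """Search BioPortal for terms matching `query`, optionally restricted
    to a comma-separated list of ontology acronyms (e.g. "FMA,SNOMEDCT")."""
    params = {"q": query}
    if ontologies:
        params["ontologies"] = ontologies
    resp = requests.get(
        f"{BASE_URL}/search",
        params=params,
        headers={"Authorization": f"apikey token={API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("collection", [])

if __name__ == "__main__":
    for hit in search_terms("mitral valve", ontologies="FMA")[:5]:
        print(hit.get("prefLabel"), hit.get("@id"))
```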
The Salivaomics Knowledge Base (SKB) is designed to serve as a computational infrastructure that can permit global exploration and utilization of data and information relevant to salivaomics. SKB is created by aligning (1) the saliva biomarker discovery and validation resources at UCLA with (2) the ontology resources developed by the OBO (Open Biomedical Ontologies) Foundry, including a new Saliva Ontology (SALO). We define the Saliva Ontology (SALO; http://www.skb.ucla.edu/SALO/) as a consensus-based controlled vocabulary of terms and relations dedicated to the salivaomics domain and to saliva-related diagnostics following the principles of the OBO (Open Biomedical Ontologies) Foundry. The Saliva Ontology is an ongoing exploratory initiative. The ontology will be used to facilitate salivaomics data retrieval and integration across multiple fields of research together with data analysis and data mining. The ontology will be tested through its ability to serve the annotation ('tagging') of a representative corpus of salivaomics research literature that is to be incorporated into the SKB. Background: Saliva (oral fluid) is an emerging biofluid for non-invasive diagnostics used in the detection of human diseases. The need to advance saliva research is strongly emphasized by the National Institute of Dental and Craniofacial Research (NIDCR), and is included in the NIDCR's 2004-2009 expert panel long-term research agenda [1]. The ability to monitor health status, disease onset, progression, recurrence and treatment outcome through noninvasive means is highly important to advancing health care management. Saliva is a perfect medium to be explored for personalized individual medicine including diagnostics, offering a non-invasive, easy-to-obtain means for detecting and monitoring diseases. Saliva testing potentially allows the patient to collect their own saliva samples at home, yielding convenience for the patient and savings in health costs, and facilitating multiple sampling. Specimen collection is less objectionable to patients and easier in children and elderly individuals. Due to these advantages, saliva is an attractive medium for diagnostic applications.
The Protein Ontology (PRO) provides a formal, logically-based classification of specific protein classes including structured representations of protein isoforms, variants and modified forms. Initially focused on proteins found in human, mouse and Escherichia coli, PRO now includes representations of protein complexes. The PRO Consortium works in concert with the developers of other biomedical ontologies and protein knowledge bases to provide the ability to formally organize and integrate representations of precise protein forms so as to enhance accessibility to results of protein research. PRO (http://pir.georgetown.edu/pro) is part of the Open Biomedical Ontologies (OBO) Foundry.
Since 2002 we have been testing and refining a methodology for ontology development that is now being used by multiple groups of researchers in different life science domains. Gary Merrill, in a recent paper in this journal, describes some of the reasons why this methodology has been found attractive by researchers in the biological and biomedical sciences. At the same time he assails the methodology on philosophical grounds, focusing specifically on our recommendation that ontologies developed for scientific purposes should be constructed in such a way that their terms are seen as referring to what we call universals or types in reality. As we show, Merrill’s critique is of little relevance to the success of our realist project, since it not only reveals no actual errors in our work but also criticizes views on universals that we do not in fact hold. However, it nonetheless provides us with a valuable opportunity to clarify the realist methodology, and to show how some of its principles are being applied, especially within the framework of the OBO (Open Biomedical Ontologies) Foundry initiative.
Representing the kinetic state of a patient (posture, motion, and activity) during vital sign measurement is an important part of continuous monitoring applications, especially remote monitoring applications. In contextualized vital sign representation, the measurement result is presented in conjunction with salient measurement context metadata. We present an automated annotation system for vital sign measurements that uses ontologies from the Open Biomedical Ontology Foundry (OBO Foundry) to represent the patient’s kinetic state at the time of measurement. The annotation system is applied to data generated by a wearable personal status monitoring (PSM) device. We demonstrate how annotated PSM data can be queried for contextualized vital signs as well as sensor algorithm configuration parameters.
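The following sketch shows, in purely illustrative form, how a query over such annotated measurements might look. The ex: namespace, class and property names, and graph layout are hypothetical placeholders, not the OBO terms or schema actually used by the system described above.

```python
# Illustrative only: querying RDF-annotated vital-sign measurements for
# their associated kinetic-state annotation. All ex: terms below are
# hypothetical placeholders standing in for ontology-derived IRIs.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/psm/")  # hypothetical namespace

g = Graph()
m = EX["measurement1"]
g.add((m, RDF.type, EX.HeartRateMeasurement))
g.add((m, EX.hasValue, Literal(72)))
g.add((m, EX.hasKineticState, EX.Supine))  # measurement context metadata

query = """
PREFIX ex: <http://example.org/psm/>
SELECT ?measurement ?value ?state WHERE {
    ?measurement a ex:HeartRateMeasurement ;
                 ex:hasValue ?value ;
                 ex:hasKineticState ?state .
}
"""
for row in g.query(query):
    print(row.measurement, row.value, row.state)
```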
Medical terminology collects and organizes the many different kinds of terms employed in the biomedical domain both by practitioners and also in the course of biomedical research. In addition to serving as labels for biomedical classes, these names reflect the organizational principles of biomedical vocabularies and ontologies. Some names represent invariant features (classes, universals) of biomedical reality (i.e., they are a matter for ontology). Other names, however, convey also how this reality is perceived, measured, and understood by health professionals (i.e., they belong to the domain of epistemology). We analyze terms from several biomedical vocabularies in order to throw light on the interactions between ontological and epistemological components of these terminologies. We identify four cases: 1) terms containing classification criteria, 2) terms reflecting detectability, modality, uncertainty, and vagueness, 3) terms created in order to obtain a complete partition of a given domain, and 4) terms reflecting mere fiat boundaries. We show that epistemology-loaded terms are pervasive in biomedical vocabularies, that the “classes” they name often do not comply with sound classification principles, and that they are therefore likely to cause problems in the evolution and alignment of terminologies and associated ontologies.
An explicit formal-ontological representation of entities existing at multiple levels of granularity is an urgent requirement for biomedical information processing. We discuss some fundamental principles which can form a basis for such a representation. We also comment on some of the implicit treatments of granularity in currently available ontologies and terminologies (GO, FMA, SNOMED CT).
The Infectious Disease Ontology (IDO) is a suite of interoperable ontology modules that aims to provide coverage of all aspects of the infectious disease domain, including biomedical research, clinical care, and public health. IDO Core is designed to be a disease and pathogen neutral ontology, covering just those types of entities and relations that are relevant to infectious diseases generally. IDO Core is then extended by a collection of ontology modules focusing on specific diseases and pathogens. In this paper we present applications of IDO Core within various areas of infectious disease research, together with an overview of all IDO extension ontologies and the methodology on the basis of which they are built. We also survey recent developments involving IDO, including the creation of IDO Virus; the Coronaviruses Infectious Disease Ontology (CIDO); and an extension of CIDO focused on COVID-19 (IDO-CovID-19). We also discuss how these ontologies might assist in information-driven efforts to deal with the ongoing COVID-19 pandemic, to accelerate data discovery in the early stages of future pandemics, and to promote reproducibility of infectious disease research.
Ontology is the philosophical discipline which aims to understand how things in the world are divided into categories and how these categories are related together. This is exactly what information scientists aim for in creating structured, automated representations, called 'ontologies,' for managing information in fields such as science, government, industry, and healthcare. Currently, these systems are designed in a variety of different ways, so they cannot share data with one another. They are often idiosyncratically structured, accessible only to those who created them, and unable to serve as inputs for automated reasoning. This volume shows, in a nontechnical way and using examples from medicine and biology, how the rigorous application of theories and insights from philosophical ontology can improve the ontologies upon which information management depends.
Biomedical ontologies are emerging as critical tools in genomic and proteomic research where complex data in disparate resources need to be integrated. A number of ontologies exist that describe the properties that can be attributed to proteins; for example, protein functions are described by Gene Ontology, while human diseases are described by Disease Ontology. There is, however, a gap in the current set of ontologies—one that describes the protein entities themselves and their relationships. We have designed a PRotein Ontology (PRO) to facilitate protein annotation and to guide new experiments. The components of PRO extend from the classification of proteins on the basis of evolutionary relationships to the representation of the multiple protein forms of a gene (products generated by genetic variation, alternative splicing, proteolytic cleavage, and other post-translational modification). PRO will allow the specification of relationships between PRO, GO and other OBO Foundry ontologies. Here we describe the initial development of PRO, illustrated using human proteins from the TGF-beta signaling pathway.
A repository of clinically associated Staphylococcus aureus (Sa) isolates is used to semi‐automatically generate a set of application ontologies for specific subfamilies of Sa‐related disease. Each such application ontology is compatible with the Infectious Disease Ontology (IDO) and uses resources from the Open Biomedical Ontology (OBO) Foundry. The set of application ontologies forms a lattice structure beneath the IDO‐Core and IDO‐extension reference ontologies. We show how this lattice can be used to define a strategy for the construction of a new taxonomy of infectious disease incorporating genetic, molecular, and clinical data. We also outline how faceted browsing and query of annotated data is supported using a lattice application ontology.
Ontology is currently perceived as the solution of first resort for all problems related to biomedical terminology, and the use of description logics is seen as a minimal requirement on adequate ontology-based systems. Contrary to common conceptions, however, description logics alone are not able to prevent incorrect representations; this is because they do not come with a theory indicating what is computed by using them, just as classical arithmetic does not tell us anything about the entities that are added or subtracted. In this paper we shall show that ontology is indeed an essential part of any solution to the problems of medical terminology – but only if it is understood in the right sort of way. Ontological engineering, we shall argue, should in every case go hand in hand with a sound ontological theory.
Ontologies are being ever more commonly used in biomedical informatics and we provide a survey of some of these uses, and of the relations between ontologies and other terminology resources. In order for ontologies to become truly useful, two objectives must be met. First, ways must be found for the transparent evaluation of ontologies. Second, existing ontologies need to be harmonised. We argue that one key foundation for both ontology evaluation and harmonisation is the adoption of a realist paradigm in ontology development. For science-based ontologies of the sort which concern us in the eHealth arena, it is reality that provides the common benchmark against which ontologies can be evaluated and aligned within larger frameworks. Given the current multitude of ontologies in the biomedical domain the need for harmonisation is becoming ever more urgent. We describe one example of such harmonisation within the ACGT project, which draws on ontology-based computing as a basis for sharing clinical and laboratory data on cancer research.
We present the details of a methodology for quality assurance in large medical terminologies and describe three algorithms that can help terminology developers and users to identify potential mistakes. The methodology is based in part on linguistic criteria and in part on logical and ontological principles governing sound classifications. We conclude by outlining the results of applying the methodology in the form of a taxonomy of different types of errors and potential errors detected in SNOMED-CT.
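By way of illustration of the logical side of such quality assurance (a generic example, not one of the three algorithms described in the paper), the following sketch flags cycles in an asserted is_a hierarchy, which violate the requirement that subsumption form a strict partial order.

```python
# Generic structural check of the kind a terminology QA pipeline can
# include: detect cycles among asserted is_a edges. Any cycle found
# indicates a classification error, since subsumption must be acyclic.
from collections import defaultdict

def find_is_a_cycles(edges):
    """edges: iterable of (child, parent) is_a assertions."""
    graph = defaultdict(list)
    for child, parent in edges:
        graph[child].append(parent)

    cycles, visiting, done = [], set(), set()

    def visit(node, path):
        if node in visiting:                     # back edge: cycle found
            cycles.append(path[path.index(node):] + [node])
            return
        if node in done:
            return
        visiting.add(node)
        for parent in graph[node]:
            visit(parent, path + [node])
        visiting.discard(node)
        done.add(node)

    for node in list(graph):
        visit(node, [])
    return cycles

if __name__ == "__main__":
    asserted = [("fracture of femur", "fracture"),
                ("fracture", "injury"),
                ("injury", "fracture of femur")]  # erroneous loop
    print(find_is_a_cycles(asserted))
```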
We are developing the Neurological Disease Ontology (ND) to provide a framework to enable representation of aspects of neurological diseases that are relevant to their treatment and study. ND is a representational tool that addresses the need for unambiguous annotation, storage, and retrieval of data associated with the treatment and study of neurological diseases. ND is being developed in compliance with the Open Biomedical Ontology Foundry principles and builds upon the paradigm established by the Ontology for General Medical Science (OGMS) for the representation of entities in the domain of disease and medical practice. Initial applications of ND will include the annotation and analysis of large data sets and patient records for Alzheimer’s disease, multiple sclerosis, and stroke.
The relevance of analytic metaphysics has come under criticism: Ladyman & Ross, for instance, have suggested discontinuing the field. French & McKenzie have argued in defense of analytic metaphysics that it develops tools that could turn out to be useful for philosophy of physics. In this article, we show first that this heuristic defense of metaphysics can be extended to the scientific field of applied ontology, which uses constructs from analytic metaphysics. Second, we elaborate on a parallel by French & McKenzie between mathematics and metaphysics to show that the whole field of analytic metaphysics, being useful not only for philosophy but also for science, should continue to exist as a largely autonomous field.