To enhance the treatment of relations in biomedical ontologies we advance a methodology for providing consistent and unambiguous formal definitions of the relational expressions used in such ontologies in a way designed to assist developers and users in avoiding errors in coding and annotation. The resulting Relation Ontology can promote interoperability of ontologies and support new types of automated reasoning about the spatial and temporal dimensions of biological and medical phenomena.
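To make the definitional style concrete: in the published Relation Ontology pattern, class-level relations are defined via instance-level relations and times. The following is a paraphrase of that pattern (not a quotation), where Cct abbreviates "instance c instantiates class C at time t":

```latex
% Paraphrase of the class-level definitions (not a quotation):
\begin{align*}
C~\mathit{is\_a}~C_1 \;&=_{\mathrm{def}}\; \forall c,t\,\bigl(Cct \rightarrow C_1ct\bigr)\\
C~\mathit{part\_of}~C_1 \;&=_{\mathrm{def}}\; \forall c,t\,\bigl(Cct \rightarrow \exists c_1\,(C_1c_1t \wedge c~\mathit{part\_of}~c_1~\text{at}~t)\bigr)
\end{align*}
```

Definitions of this form make class-level assertions unambiguous by reducing them to quantified statements about instances, which is what enables the automated reasoning the abstract mentions.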
There are three theories in the epistemology of modality that have received sustained attention over the past 20 years: conceivability-theory, counterfactual-theory, and deduction-theory. In this paper we argue that all three face what we call the problem of modal epistemic friction (PMEF). One consequence of the problem is that for any of the three accounts to yield modal knowledge, the account must provide an epistemology of essence. We discuss an attempt to fend off the problem within the context of the internalism versus externalism debate about epistemic justification. We then investigate the effects that the PMEF has on reductive and non-reductive theories of the relation between essence and modality.
Recently, Kit Fine's view that modal truths are true in virtue of, grounded in, or explained by essentialist truths has come under attack. In what follows we offer two responses to the wave of criticism against his view. While the first response is fairly straightforward, the second is based on the distinction between what we call Reductive Finean Essentialism and Non-Reductive Finean Essentialism. Engaging the work of Bob Hale on Non-Reductive Finean Essentialism, we aim to show that the arguments against Fine's view are unconvincing, while acknowledging the presence of a deep standoff between the two views.
The Unified Medical Language System and the Gene Ontology are among the most widely used terminology resources in the biomedical domain. However, when we evaluate them in the light of simple principles for well-constructed ontologies we find a number of characteristic inadequacies. Employing the theory of granular partitions, a new approach to the understanding of ontologies and of the relationships ontologies bear to instances in reality, we provide an application of this theory in relation to an example drawn from the context of the pathophysiology of hypertension. This exercise is designed to demonstrate how, by taking ontological principles into account, we can create more realistic biomedical ontologies which will also bring advantages in terms of the efficiency and robustness of associated software applications.
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification, in terms of algorithmic verifiability and algorithmic computability, admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA, over the structure N of the natural numbers, that are complementary, not contradictory. The first yields the weak, standard interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:

* Hilbert's epsilon-calculus
* Goedel's omega-consistency
* The Law of the Excluded Middle
* Hilbert's omega-Rule
* An Algorithmic omega-Rule
* Gentzen's Rule of Infinite Induction
* Rosser's Rule C
* Markov's Principle
* The Church-Turing Thesis
* Aristotle's particularisation
* Wittgenstein's perspective of constructive mathematics
* An evidence-based perspective of quantification

By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
The National Cancer Institute’s Thesaurus (NCIT) has been created with the goal of providing a controlled vocabulary which can be used by specialists in the various sub-domains of oncology. It is intended to be used for purposes of annotation in ways designed to ensure the integration of data and information deriving from these various sub-domains, and thus to support more powerful cross-domain inferences. In order to evaluate its suitability for this purpose, we examined the NCIT’s treatment of the kinds of entities which are fundamental to an ontology of colon carcinoma. We here describe the problems we uncovered concerning classification, synonymy, relations and definitions, and we draw conclusions for the work needed to establish the NCIT as a reference ontology for the cancer domain in the future.
Formal principles governing best practices in classification and definition have for too long been neglected in the construction of biomedical ontologies, in ways which have important negative consequences for data integration and ontology alignment. We argue that the use of such principles in ontology construction can serve as a valuable tool in error-detection and also in supporting reliable manual curation. We argue also that such principles are a prerequisite for the successful application of advanced data integration techniques such as ontology-based multi-database querying, automated ontology alignment and ontology-based text-mining. These theses are illustrated by means of a case study of the Gene Ontology, a project of increasing importance within the field of biomedical data integration.
We propose a classification of proteins that can serve as a foundation for more refined ontologies in the field of proteomics. Standard data sources classify proteins in terms of just one or two specific aspects. Thus SCOP (Structural Classification of Proteins) is described as classifying proteins on the basis of structural features; Swiss-Prot annotates proteins on the basis of their structure and of parameters like post-translational modifications. Such data sources are connected to each other by pairwise term-to-term mappings. However, there are obstacles which stand in the way of combining them together to form a robust meta-classification of the needed sort. We discuss some formal ontological principles which should be taken into account within the existing data sources in order to make such a meta-classification possible, taking into account also the Gene Ontology (GO) and its application to the annotation of proteins.
An explicit formal-ontological representation of entities existing at multiple levels of granularity is an urgent requirement for biomedical information processing. We discuss some fundamental principles which can form a basis for such a representation. We also comment on some of the implicit treatments of granularity in currently available ontologies and terminologies (GO, FMA, SNOMED CT).
The integration of biomedical terminologies is indispensable to the process of information integration. When terminologies are linked merely through the alignment of their leaf terms, however, differences in context and ontological structure are ignored. Making use of the SNAP and SPAN ontologies, we show how three reference domain ontologies can be integrated at a higher level, through what we shall call the OBR framework (for: Ontology of Biomedical Reality). OBR is designed to facilitate inference across the boundaries of domain ontologies in anatomy, physiology and pathology.
The theory of granular partitions (TGP) is a new approach to the understanding of ontologies and other classificatory systems. The paper explores the use of this new theory in the treatment of task-based clinical guidelines as a means for better understanding the relations between different clinical tasks, both within the framework of a single guideline and between related guidelines. We used as our starting point a DAML+OIL-based ontology for the WHO guideline for hypertension management, comparing this with related guidelines and attempting to show that TGP provides a flexible and highly expressive basis for the manipulation of ontologies of a sort which might be useful in providing more adequate Computer Interpretable Guideline Models (CIGMs) in the future.
The Gene Ontology is an important tool for the representation and processing of information about gene products and functions. It provides controlled vocabularies for the designations of cellular components, molecular functions, and biological processes used in the annotation of genes and gene products. These constitute three separate ontologies of cellular components, molecular functions and biological processes, respectively. The question we address here is: how are the terms in these three separate ontologies related to each other? We use statistical methods and formal ontological principles as a first step towards finding answers to this question.
We provide a methodology for the creation of ontological partitions in biomedicine and we test the methodology via an application to the phenomenon of blood pressure. An ontology of blood pressure must do justice to the complex networks of intersecting pathways in the organism by which blood pressure is regulated. To this end it must deal not only with the anatomical structures and physiological processes involved in such regulation but also with the relations between these at different levels of granularity. For this purpose our ontology offers a variety of distinct partitions (of substances, processes and functions) and integrates these together within a single framework via transitive networks of part-whole and dependence relations among the entities in each of these categories. The paper concludes with a comparison of this methodology with the approaches of GOTM, KEGG, DIP and BIND and provides an outline of how the methodology is currently being applied in the field of biomedical database integration.
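The transitive part-whole networks mentioned here can be pictured with a small sketch: given asserted part_of edges, transitivity licenses inferred parthood. The anatomical entities below are hypothetical examples, and the computation is a toy illustration, not the paper's implementation:

```python
# Toy transitive closure over asserted part_of edges (hypothetical entities;
# a sketch of the kind of transitive network the abstract describes).
from collections import defaultdict

asserted = {
    ("glomerulus", "nephron"),
    ("nephron", "kidney"),
    ("kidney", "urinary system"),
}

def transitive_closure(edges):
    """Return every (part, whole) pair entailed by transitivity of part_of."""
    graph = defaultdict(set)
    for part, whole in edges:
        graph[part].add(whole)
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for part, whole in list(closure):
            for wider in graph.get(whole, ()):
                if (part, wider) not in closure:
                    closure.add((part, wider))
                    changed = True
    return closure

print(sorted(transitive_closure(asserted)))
# ("glomerulus", "kidney") and ("glomerulus", "urinary system") are inferred
# even though never asserted.
```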
Ontological principles are needed in order to bridge the gap between medical and biological information in a robust and computable fashion. This is essential in order to draw inferences across the levels of granularity which span medicine and biology, an example of which is the understanding of the roles of tumor markers in the development and progress of carcinoma. Such information integration is also important for the integration of genomics information with the information contained in electronic patient records in such a way that real-time conclusions can be drawn. In this paper we describe a large multi-granular data source built by using ontological principles and focusing on the case of colon carcinoma.
(Report assembled for the Workshop of the AMIA Working Group on Formal Biomedical Knowledge Representation in connection with the AMIA Symposium, Washington DC, 2005.) Best practices in ontology building for biomedicine have been frequently discussed in recent years. However, there is a range of seemingly disparate views represented by experts in the field. These views not only reflect the different uses to which ontologies are put, but also the experiences and disciplinary backgrounds of these experts themselves. We asked six questions related to biomedical ontologies to what we believe is a representative sample of ontologists in the biomedical field and came to a number of conclusions which we believe can help provide an insight into the practical problems which ontology builders face today.
Evidence-based medicine relies on the execution of clinical practice guidelines and protocols. A great deal of effort has been invested in the development of various tools which automate the representation and execution of the recommendations contained within such guidelines and protocols by creating Computer Interpretable Guideline Models (CIGMs). Context-based task ontologies (CTOs), based on standard terminology systems like UMLS, form one of the core components of such models. We have created DAML+OIL-based CTOs for the tasks mentioned in the WHO guideline for hypertension management, drawing comparisons also with other related guidelines. The advantages of CTOs include: the contextualization of ontologies; the provision of ontologies tailored to specific aspects of the phenomena of interest; the division of the complexity involved in creating ontologies into different levels; and the provision of a methodology by means of which the task recommendations contained within guidelines can be integrated into the clinical practices of a health care set-up.
Clinical guidelines are special types of plans realized by collective agents. We provide an ontological theory of such plans that is designed to support the construction of a framework in which guideline-based information systems can be employed in the management of workflow in health care organizations. The framework we propose allows us to represent in formal terms how clinical guidelines are realized through the actions of individuals organized into teams. We provide various levels of implementation representing different levels of conformity on the part of health care organizations. Implementations built in conformity with our framework are marked by two dimensions of flexibility that are designed to make them more likely to be accepted by health care professionals than standard guideline-based management systems. They do justice to the facts (1) that responsibilities within a health care organization are widely shared, and (2) that health care professionals may on different occasions be non-compliant with guidelines for a variety of well-justified reasons. The advantage of the framework lies in its built-in flexibility, its sensitivity to clinical context, and its ability to use inference tools based on a robust ontology. One disadvantage lies in its complicated implementation.
Outcomes research in healthcare has been a topic much addressed in recent years. Efforts in this direction have been supplemented by work in the areas of guidelines for clinical practice and computer-interpretable workflow and careflow models. In what follows we present the outlines of a framework for understanding the relations between organizations, guidelines, individual patients and patient-related functions. The derived framework provides a means to extract the knowledge contained in the guideline text at different granularities, in ways that can help us to assign tasks within the healthcare organization and to assess clinical performance in realizing the guideline. It does this in a way that preserves the flexibility of the organization in the adoption of the guidelines.
There are a number of existing classifications and staging schemes for carcinomas, one of the most frequently used being the TNM classification. Such classifications represent classes of entities which exist at various anatomical levels of granularity. We argue that in order to apply such representations to Electronic Health Records one needs sound ontologies which take into consideration the diversity of the domains which are involved in clinical bioinformatics. Here we outline a formal theory for addressing these issues in such a way that the resulting ontologies can be used to support inferences relating to entities which exist at different anatomical levels of granularity. Our case study is colon carcinoma, one of the most common carcinomas prevalent within the European population.
The automatic integration of information resources in the life sciences is one of the most challenging goals facing biomedical informatics today. Controlled vocabularies have played an important role in realizing this goal, by making it possible to draw together information from heterogeneous sources secure in the knowledge that the same terms will also represent the same entities on all occasions of use. One of the most impressive achievements in this regard is the Gene Ontology (GO), which is rapidly acquiring the status of a de facto standard in the field of gene and gene product annotations, and whose methodology has been much imitated in attempts to develop controlled vocabularies for shared use in different domains of biology. The GO Consortium has recognized, however, that its controlled vocabulary as currently constituted is marked by several problematic features, features which are characteristic of much recent work in bioinformatics and which are destined to raise increasingly serious obstacles to the automatic integration of biomedical information in the future. Here, we survey some of these problematic features, focusing especially on issues of compositionality and syntactic regimentation.
Recent work on the quality assurance of the Gene Ontology (GO, Gene Ontology Consortium 2004) from the perspective of both linguistic and ontological organization has made it clear that GO lacks the kind of formalism needed to support logic-based reasoning. At the same time it is no less clear that GO has proven itself to be an excellent terminological resource that can serve to combine together a variety of biomedical databases and information systems. Given the strengths of GO, it is worth investigating whether, by overcoming some of its weaknesses from the point of view of formal-ontological principles, we might arrive at an enhanced version of GO which can come even closer to serving the needs of the various communities of biomedical researchers and practitioners. It is accepted that clinical informatics and bioinformatics need to find common ground if the results of data-intensive biomedical research are to be harvested to the full. It is also widely accepted that no single method will be sufficient to create the needed common framework. We believe that the principles-based approach to life-science data integration and knowledge representation must be one of the methods applied. Indeed, in dealing with the ontological representation of carcinomas, and specifically of colon carcinomas, we have established that, had GO (and related biomedical ontologies) followed some of the basic formal-ontological principles we have identified (Smith et al. 2004, Ceusters et al. 2004), the effort required to navigate successfully between clinical and bioinformatics systems would have been reduced. We point here to the sources of ontologically-related errors in GO, and also provide arguments as to why and how such errors need to be resolved.
The International Classification of Functioning, Disability and Health provides a classification of human bodily functions which, while not conforming to many formal ontological principles, offers an insight into which basic functions such a classification should include. Its evaluation is an important first step towards an adequate ontology of this domain. Presented at the 13th Annual North American WHO Collaborating Center Conference on the ICF, 2007.
It is widely understood that protein functions can be exhaustively described in terms of no single parameter, whether this be amino acid sequence or the three-dimensional structure of the underlying protein molecule. This means that a number of different attributes must be used to create an ontology of protein functions. Certainly much of the required information is already stored in databases such as Swiss-Prot, Protein Data Bank, SCOP and MIPS. But the latter have been developed for different purposes and the separate data-structures which they employ are not conducive to the needed data integration. When we attempt to classify the entities in the domain of proteins, we find ourselves faced with a number of cross-cutting principles of classification. Our question here is: how can we bring together these separate taxonomies in order to describe protein functions? Our proposed answer is: via a careful top-level ontological analysis of the relevant principles of classification, combined with a new framework for the simultaneous manipulation of classifications constructed for different purposes.
Human Aspiration is the first chapter of the magnum opus "Life Divine". In this chapter Sri Aurobindo, one of the most prolific modern philosophers of Renaissance India, highlights what Man's eternal aspiration has been: God, Light, Freedom and Eternity. Despite technological and scientific advancements, Man is still thirsty, because he aspires for a Divine Life. The article discusses this human aspiration for eternity in detail.
With the advent of the “Clean India” campaign in India, a renewed focus on cleanliness has started, with a special focus on sanitation. There have been efforts in the past to provide sanitation related services. However, there were several challenges in provisioning. Provision of sanitation is a public health imperative given increased instances of antimicrobial resistance in India. This paper focuses on sanitation provisioning in the city of Mumbai, especially in the slums of Mumbai. The paper compares and contrasts different models of sanitation provision, ranging from supply-led provisioning of sanitation by the Indian government to demand-led provisioning of sanitation through a World Bank funded “Slum Sanitation Program” (SSP). The paper also outlines a comparative perspective on the implementation and usage of toilet blocks. The author presents the theory of social networks and positive peer pressure and argues that these will amplify the effect of other incentives. With the help of an illustration, this paper concludes that sustainable sanitation policy should look at facilitating and creating the infrastructure as a network and not strictly at building toilet blocks.
The hard problem of consciousness arises in most incarnations of present day physicalism. Why should certain physical processes necessarily be accompanied by experience? One possible response is that physicalism itself should be modified in order to accommodate experience: but, modified how? In the present work, we investigate whether an ontology derived from quantum field theory can help resolve the hard problem. We begin with the assumption that experience cannot exist without being accompanied by a subject of experience (SoE). While people well versed in Indian philosophy will not find that statement problematic, it is still controversial in the analytic tradition. Luckily for us, Strawson has elaborately defended the notion of a thin subject—an SoE which exhibits a phenomenal unity with different types of content (sensations, thoughts etc.) occurring during its temporal existence. Next, following Stoljar, we invoke our ignorance of the true physical as the reason for the explanatory gap between present day physical processes (events, properties) and experience. We are therefore permitted to conceive of thin subjects as related to the physical via a new, yet to be elaborated relation. While this is difficult to conceive under most varieties of classical physics, we argue that this may not be the case under certain quantum field theory ontologies. We suggest that the relation binding an SoE to the physical is akin to the relation between a particle and a (quantum) field. In quantum field theory, a particle is conceived as a coherent excitation of a field. Under the right set of circumstances, a particle coalesces out of a field and dissipates. We suggest that an SoE can be conceived as akin to a particle—a selfon—which coalesces out of physical fields, persists for a brief period of time and then dissipates, in a manner similar to the phenomenology of a thin subject. Experiences are physical properties of selfons with the constraint (specified by a similarity metric) that selfons belonging to the same natural kind will have similar experiences. While it is odd at first glance to conceive of subjects of experience as akin to particles, the spatial and temporal unity exhibited by particles as opposed to fields, and the expectation that selfons are new kinds of particles, paves the way for cementing this notion. Next, we detail the various no-go theorems in most versions of quantum field theory and discuss their impact on the existence of selfons. Finally, we argue that the time is ripe for a rejuvenated Indian philosophy to begin tackling the three-way relationship between SoEs (which may become equivalent to jivas in certain Indian frameworks), phenomenal content and the physical world. With analytic philosophy still struggling to come to terms with the complex worlds of quantum field theory, and with the relative inexperience of the western world in arguing the jiva-world relation, there is a clear and present opportunity for Indian philosophy to make a worldcentric contribution to the hard problem of experience.
In this paper, a framework incorporating flexibility as a characteristic is proposed for designing complex, resilient socio-ecological systems. In an interconnected complex system, flexibility allows prompt deployment of resources where they are needed and is crucial for both innovation and robustness. A comparative analysis of flexible manufacturing systems, economics, evolutionary biology, and supply chain management is conducted to identify the most important characteristics of flexibility. Evolutionary biology emphasises overlapping functions and multi-functionality, which allow a system with structurally different elements to perform the same function, enhancing resilience. In economics, marginal cost and marginal expected profit are factors that are considered to be important in incorporating flexibility while making changes to the system. In flexible manufacturing systems, the size of choice sets is important in creating flexibility, as initial actions preserve more options for future actions that will enhance resilience. Given the dynamic nature of flexibility, identifying the characteristics that can lead to flexibility will introduce a crucial dimension to designing resilient and sustainable socio-ecological systems with a long-term perspective in mind.
There is a need for enterprises to incorporate information on the environment into decision making and to take action on ecological restoration. Within academia, a comprehensive understanding of how business can serve sustainability transformation is still lacking, as diverging holistic approaches and reductive approaches cloud academic thinking. The authors take a science-policy interface perspective to cover the role of cognitive proximity, matching and coordination of scientific knowledge from diverse stakeholders for effective policy making and implementation. We show through a literature review that temporal and spatial scales, soil and land degradation, institutions and ecosystems, and the role of human behaviour and narrative are not adequately emphasized in sustainability research. A scale-based picture, focusing on landscapes, institutions and practices, is proposed which can be used to align diverse fields by acting as a “bridge” for improved science-policy interface and decision making, facilitated through cognitive proximity, matching, and coordination. A case study of a business association from South India is used to demonstrate the scale-based approach in practice. A scale-based approach can play a key role in connecting human behaviour, a social science thematic topic, with ecosystems, a natural science thematic topic.
Tumors, abscesses, cysts, scars, fractures are familiar types of what we shall call pathological continuant entities. The instances of such types exist always in or on anatomical structures, which thereby become transformed into pathological anatomical structures of corresponding types: a fractured tibia, a blistered thumb, a carcinomatous colon. In previous work on biomedical ontologies we showed how the provision of formal definitions for relations such as is_a, part_of and transformation_of can facilitate the integration of such ontologies in ways which have the potential to support new kinds of automated reasoning. We here extend this approach to the treatment of pathologies, focusing especially on those pathological continuant entities which arise when organs become affected by carcinomas.
An important part of the Unified Medical Language System (UMLS) is its Semantic Network, consisting of 134 Semantic Types connected to each other by edges formed by one or more of 54 distinct Relation Types. This Network is, however, for many purposes overly complex, and various groups have thus made attempts at simplification. Here we take this work further by simplifying the relations which involve three Semantic Types: Diagnostic Procedure, Laboratory Procedure and Therapeutic or Preventive Procedure. We define operators which can be used to generate terms instantiating types from this selected set when applied to terms designating certain other Semantic Types, including almost all the terms specifying clinical tasks. Usage of such operators thus provides a useful and economical way of specifying clinical tasks. The operators allow us to define a mapping between those types within the UMLS which do not represent clinical tasks and those which do. This mapping then provides a basis for an ontology of clinical tasks that can be used in the formulation of computer-interpretable clinical guideline models.
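A rough sketch of how such term-generating operators might behave (the operator names and generated string forms below are illustrative assumptions, not the paper's actual mapping):

```python
# Hypothetical operators that generate clinical-task terms from terms
# designating other UMLS Semantic Types (illustrative, not the UMLS mapping).
def diagnostic_procedure_for(condition: str) -> str:
    """Map a disorder/finding term to a Diagnostic Procedure term."""
    return f"diagnosis of {condition}"

def laboratory_procedure_for(analyte: str) -> str:
    """Map a substance/finding term to a Laboratory Procedure term."""
    return f"measurement of {analyte}"

def therapeutic_procedure_for(condition: str) -> str:
    """Map a disorder term to a Therapeutic or Preventive Procedure term."""
    return f"treatment of {condition}"

print(diagnostic_procedure_for("hypertension"))     # "diagnosis of hypertension"
print(laboratory_procedure_for("serum potassium"))  # "measurement of serum potassium"
```

The economy the abstract claims comes from the fact that a small, fixed set of such operators can generate a large space of task terms from the existing non-task vocabulary.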
Formalisms such as description logics (DL) are sometimes expected to help terminologies ensure compliance with sound ontological principles. The objective of this paper is to study the degree to which one DL-based biomedical terminology (SNOMED CT) complies with such principles. We defined seven ontological principles (for example: each class must have at least one parent; each class must differ from its parent) and examined the properties of SNOMED CT classes with respect to these principles. Our major results are: 31% of the classes have a single child; 27% have multiple parents; 51% do not exhibit any differentiae between the description of the parent and that of the child. The applications of this study to quality assurance for ontologies are discussed and suggestions are made for dealing with multiple inheritance.
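A minimal sketch of how audits of this kind can be mechanized over a class table (the toy classes and field layout below are assumptions, not the paper's pipeline):

```python
# Toy audit of three of the principle checks over a miniature terminology.
from collections import defaultdict

# class name -> (list of parents, textual description); hypothetical data.
classes = {
    "clinical finding":   ([], "finding"),
    "disorder of colon":  (["clinical finding"], "finding located in colon"),
    "carcinoma of colon": (["disorder of colon"], "finding located in colon"),
}

children = defaultdict(list)
for name, (parents, _) in classes.items():
    for p in parents:
        children[p].append(name)

single_child = [c for c, kids in children.items() if len(kids) == 1]
multi_parent = [c for c, (ps, _) in classes.items() if len(ps) > 1]
no_differentiae = [
    c for c, (ps, desc) in classes.items()
    if any(desc == classes[p][1] for p in ps)  # identical to a parent's description
]

print(single_child)     # classes with exactly one child
print(multi_parent)     # classes with multiple inheritance
print(no_differentiae)  # children indistinguishable from a parent
```

Run over a full terminology release, counting the flagged classes yields exactly the kind of percentages the abstract reports.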
Quality assurance in large terminologies is a difficult issue. We present two algorithms that can help terminology developers and users to identify potential mistakes. We demonstrate the methodology by outlining the different types of mistakes that are found when the algorithms are applied to SNOMED-CT. On the basis of the results, we argue that both formal logical and linguistic tools should be used in the development and quality-assurance process of large terminologies.
We consider the argument that Tarski's classic definitions permit an intelligence, whether human or mechanistic, to admit finitary evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways: (1) in terms of classical algorithmic verifiability; and (2) in terms of finitary algorithmic computability. We then show that the two definitions correspond to two distinctly different assignments of satisfaction and truth to the compound formulas of PA over N: I_PA(N; SV) and I_PA(N; SC). We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both I_PA(N; SV) and I_PA(N; SC). We then show: (a) that if we assume the satisfaction and truth of the compound formulas of PA are always non-finitarily decidable under I_PA(N; SV), then this assignment corresponds to the classical non-finitary putative standard interpretation I_PA(N; S) of PA over the domain N; and (b) that the satisfaction and truth of the compound formulas of PA are always finitarily decidable under the assignment I_PA(N; SC), from which we may finitarily conclude that PA is consistent. We further conclude that the appropriate inference to be drawn from Goedel's 1931 paper on undecidable arithmetical propositions is that we can define PA formulas which, under interpretation, are algorithmically verifiable as always true over N, but not algorithmically computable as always true over N. We conclude from this that Lucas' Goedelian argument is validated if the assignment I_PA(N; SV) can be treated as circumscribing the ambit of human reasoning about `true' arithmetical propositions, and the assignment I_PA(N; SC) as circumscribing the ambit of mechanistic reasoning about `true' arithmetical propositions.
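The two notions at work here can be paraphrased as follows (a hedged restatement of the definitions the abstract appeals to, not a quotation):

```latex
% Paraphrase: let $[F(x)]$ be a PA formula with one free variable.
\begin{itemize}
  \item $[F(x)]$ is \emph{algorithmically verifiable} if, for any given natural
        number $n$, there is an algorithm $\mathrm{AL}_{(F,n)}$ that decides the
        truth/falsity of each formula in the finite sequence
        $[F(1)], \ldots, [F(n)]$.
  \item $[F(x)]$ is \emph{algorithmically computable} if there is a single
        algorithm $\mathrm{AL}_{F}$ that decides the truth/falsity of $[F(n)]$
        for every natural number $n$.
\end{itemize}
% Every algorithmically computable formula is algorithmically verifiable;
% the converse failing is what the abstract's Goedelian conclusion turns on.
```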
Formalisms based on one or other flavor of Description Logic (DL) are sometimes put forward as helping to ensure that terminologies and controlled vocabularies comply with sound ontological principles. The objective of this paper is to study the degree to which one DL-based biomedical terminology (SNOMED CT) does indeed comply with such principles. We defined seven ontological principles (for example: each class must have at least one parent; each class must differ from its parent) and examined the properties of SNOMED CT classes with respect to these principles. Our major results are: 31% of these classes have a single child; 27% have multiple parents; 51% do not exhibit any differentiae between the description of the parent and that of the child. The applications of this study to quality assurance for ontologies are discussed and suggestions are made for dealing with the phenomenon of multiple inheritance. The advantages and limitations of our approach are also discussed.
In previous work on biomedical ontologies we showed how the provision of formal definitions for relations such as is_a and part_of can support new types of automated reasoning about biomedical phenomena. We here extend this approach to the transformation_of relation characteristic of pathologies.
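In the same definitional style as for is_a and part_of, transformation_of can be glossed roughly as follows (our hedged paraphrase of the published pattern, with Cct for "instance c instantiates class C at time t"; the identity of the instance c is preserved across the transformation):

```latex
% Paraphrase (not a quotation): every instance of C was earlier an instance
% of C1, and nothing instantiates both classes at the same time.
\[
C~\mathit{transformation\_of}~C_1 \;=_{\mathrm{def}}\;
\forall c,t\,\bigl(Cct \rightarrow \exists t_1\,(t_1 < t \wedge C_1ct_1)\bigr)
\;\wedge\; \neg\exists c,t\,\bigl(Cct \wedge C_1ct\bigr)
\]
```

On this reading a carcinomatous colon stands in transformation_of to colon: the very same continuant persists while changing the class it instantiates.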
We present the details of a methodology for quality assurance in large medical terminologies and describe three algorithms that can help terminology developers and users to identify potential mistakes. The methodology is based in part on linguistic criteria and in part on logical and ontological principles governing sound classifications. We conclude by outlining the results of applying the methodology in the form of a taxonomy of the different types of errors and potential errors detected in SNOMED-CT.
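As a toy illustration of the linguistic side of such criteria, one simple check flags term pairs that collapse to the same normalized form and therefore merit manual review (the normalization rules and terms below are illustrative assumptions, not the paper's algorithms):

```python
# Toy linguistic QA check: flag term pairs that collapse to the same
# normalized form, i.e. candidate duplicates/synonyms needing manual review.
import re
from itertools import combinations

STOPWORDS = {"of", "the", "a"}

def normalize(term: str) -> str:
    """Lowercase, drop punctuation and stopwords, sort the remaining words."""
    words = [w for w in re.findall(r"[a-z]+", term.lower()) if w not in STOPWORDS]
    return " ".join(sorted(words))

terms = ["Fracture of tibia", "Tibia fracture", "Open fracture of tibia"]

flagged = [(a, b) for a, b in combinations(terms, 2) if normalize(a) == normalize(b)]
print(flagged)  # [('Fracture of tibia', 'Tibia fracture')]
```

A logical/ontological check (for instance, a child class whose description adds nothing to its parent's) would then be layered on top of such linguistic candidates, which is the combination the abstract argues for.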
Conventional wisdom dictates that proofs of mathematical propositions should be treated as necessary, and sufficient, for entailing `significant' mathematical truths only if the proofs are expressed in a (minimally, deemed consistent) formal mathematical theory in terms of:

* Axioms/Axiom schemas
* Rules of Deduction
* Definitions
* Lemmas
* Theorems
* Corollaries

Whilst Andrew Wiles' proof of Fermat's Last Theorem FLT, which appeals essentially to geometrical properties of real and complex numbers, can be treated as meeting these criteria, it nevertheless leaves two questions unanswered:

(i) Why is x^n + y^n = z^n solvable only for n < 3 if x, y, z, n are natural numbers?
(ii) What technique might Fermat have used that led him to, even if only briefly, believe he had `a truly marvellous demonstration' of FLT?

Prevailing post-Wiles wisdom, leaving (i) essentially unaddressed, dismisses Fermat's claim as a conjecture without a plausible proof of FLT.

However, we posit that providing evidence-based answers to both queries is necessary not only for treating FLT as significant, but also for understanding why FLT can be treated as a true arithmetical proposition. We thus argue that proving a theorem formally from explicit, and implicit, premises/axioms using rules of deduction, as currently accepted, is a meaningless game, of little scientific value, in the absence of evidence that has already established, unambiguously, why the premises/axioms and rules of deduction can be treated, and categorically communicated, as pre-formal truths in Marcus Pantsar's sense. Consequently, only evidence-based, pre-formal truth can entail formal provability; and the formal proof of any significant mathematical theorem cannot entail its pre-formal truth as evidence-based. It can only identify the explicit/implicit premises that have been used to evidence the already established pre-formal truth of a mathematical proposition. Hence visualising and understanding the evidence-based, pre-formal truth of a mathematical proposition is the only raison d'etre for subsequently seeking a formal proof of the proposition within a formal mathematical language (whether first-order or second-order set theory, arithmetic, geometry, etc.). By this yardstick Andrew Wiles' proof of FLT fails to meet the required, evidence-based criteria for entailing a true arithmetical proposition.

Moreover, we offer two scenarios as to why/how Fermat could have laconically concluded in his recorded marginal noting that FLT is a true arithmetical proposition, even though he either did not (or could not to his own satisfaction) succeed in cogently evidencing, and recording, why FLT can be treated as an evidence-based, pre-formal arithmetical truth (presumably without appeal to properties of real and complex numbers). It is primarily such a putative, unrecorded, evidence-based reasoning underlying Fermat's laconic assertion which this investigation seeks to reconstruct, and to justify by appeal to a plausible resolution of some philosophical ambiguities concerning the relation between evidence-based, pre-formal truth and formal provability.
The integration of standardized biomedical terminologies into a single, unified knowledge representation system has formed a key area of applied informatics research in recent years. The Unified Medical Language System (UMLS) is the most advanced and most prominent effort in this direction, bringing together within its Metathesaurus a large number of distinct source-terminologies. The UMLS Semantic Network, which is designed to support the integration of these source-terminologies, has proved to be a highly successful combination of formal coherence and broad scope. We argue here, however, that its organization manifests certain structural problems, and we describe revisions which we believe are needed if the network is to be maximally successful in realizing its goals of supporting terminology integration.
Classical interpretations of Goedel's formal reasoning, and of his conclusions, imply that mathematical languages are essentially incomplete, in the sense that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is both non-algorithmic and essentially unverifiable. However, a language of general, scientific, discourse, which intends to mathematically express, and unambiguously communicate, intuitive concepts that correspond to scientific investigations, cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical, Tarskian, truth, and of Goedel's reasoning, under which any formal system of Peano Arithmetic, classically accepted as the foundation of all our mathematical languages, is verifiably complete in the above sense. We show how some paradoxical concepts of quantum mechanics can then be expressed, and interpreted, naturally under a constructive definition of mathematical truth.
Two senses of ‘ontology’ can be distinguished in the current literature. First is the sense favored by information scientists, who view ontologies as software implementations designed to capture in some formal way the consensus conceptualization shared by those working on information systems or databases in a given domain [Gruber 1993]. Second is the sense favored by philosophers, who regard ontologies as theories of different types of entities (objects, processes, relations, functions) [Smith 2003]. Where information systems ontologists seek to maximize reasoning efficiency even at the price of simplifications on the side of representation, philosophical ontologists argue that representational adequacy can bring benefits for the stability and resistance to error of an ontological framework and also for its extendibility in the future. In bioinformatics, however, a third sense of ‘ontology’ has established itself, above all as a result of the successes of the Gene Ontology (hereafter: GO), which is a tool for the representation and processing of information about gene products and their biological functions [Gene Ontology Consortium 2000]. We show how Basic Formal Ontology (BFO) has established itself as an overarching ontology drawing on all three of the strands distinguished above, and describe applications of BFO especially in the treatment of biological granularity.
Andrew Wiles' analytic proof of Fermat's Last Theorem FLT, which appeals to geometrical properties of real and complex numbers, leaves two questions unanswered: (i) What technique might Fermat have used that led him to, even if only briefly, believe he had `a truly marvellous demonstration' of FLT? (ii) Why is x^n+y^n=z^n solvable only for n<3? In this inter-disciplinary perspective, we offer insight into, and answers to, both queries; yielding a pre-formal proof of why FLT can be treated as a true arithmetical proposition (one which, moreover, might not be provable formally in the first-order Peano Arithmetic PA), where we admit only elementary (i.e., number-theoretic) reasoning, without appeal to analytic properties of real and complex numbers. We cogently argue, further, that any formal proof of FLT needs, as is implicitly suggested by Wiles' proof, to appeal essentially to formal geometrical properties of formal arithmetical propositions.
Public trust in research and its output is essential for a healthy modern society. Although the research enterprise is self-correcting, this self-regulation occasionally needs help. Over the years, research institutions, professional societies, and governments have established several protocols, codes of conduct, norms, and principles to enhance that trust in research institutions, funders, producers, publishers, and products.
Although the Four Colour Theorem is passé, we give an elementary pre-formal proof that transparently illustrates why four colours suffice to chromatically differentiate any set of contiguous, simply connected and bounded, planar spaces, by showing that there is no minimal 4-coloured planar map M. We note that such a pre-formal proof of the Four Colour Theorem highlights the significance of differentiating between: (a) Plato's knowledge as justified true belief, which seeks a formal proof in a first-order mathematical language in order to justify a belief as true; and (b) Piccinini's knowledge as factually grounded belief, which seeks a pre-formal proof, in Pantsar's sense, in order to justify the axioms and rules of inference of a first-order mathematical language which can then formally prove the belief as justifiably true under a well-defined interpretation of the language.
We argue the thesis that if (1) a physical process is mathematically representable by a Cauchy sequence; and (2) we accept that there can be no infinite processes, i.e., nothing corresponding to infinite sequences, in natural phenomena; then (a) in the absence of an extraneous, evidence-based, proof of `closure' which determines the behaviour of the physical process in the limit as corresponding to a `Cauchy' limit, (b) the physical process must tend to a discontinuity (singularity) which has not been reflected in the Cauchy sequence that seeks to describe the behaviour of the physical process. We support our thesis by mathematical models of the putative behaviours of (i) a virus cluster; (ii) an elastic string; and (iii) a Universe that recycles from Big Bang to Ultimate Implosion, in which parity and local time reversal violation, and the existence of `dark energy' in a multiverse, need not violate Einstein's equations and quantum theory. We suggest that the barriers to modelling such processes in a mathematical language that seeks unambiguous communication are illusory; they merely reflect an attempt to ask of the language chosen for such representation more than it is designed to deliver.
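For reference, the standard notion on which the thesis leans, with the paper's caveat stated informally (our gloss, not the paper's formalism):

```latex
% A real sequence (a_n) is Cauchy iff:
\[
\forall \varepsilon > 0\ \exists N\ \forall m,n > N:\ \lvert a_m - a_n \rvert < \varepsilon .
\]
% In a complete space every Cauchy sequence converges to some limit L. The
% thesis above cautions that, absent an extraneous, evidence-based proof of
% `closure', one may not assume the physical process so represented actually
% attains, or behaves continuously at, that limit.
```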
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language.

We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic'; whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'.

We then adopt what may be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective.

We then consider the argument that Tarski's classic definitions permit an intelligence, whether human or mechanistic, to admit finitary, evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways.

We show that the two definitions correspond to two distinctly different, not necessarily evidence-based but complementary, assignments of satisfaction and truth to the compound formulas of PA over N.

We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both of the complementary interpretations; and conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.
The Foundational Model of Anatomy (FMA) is a map of the human body. Like maps of other sorts – including the map-like representations we find in familiar anatomical atlases – it is a representation of a certain portion of spatial reality as it exists at a certain (idealized) instant of time. But unlike other maps, the FMA comes in the form of a sophisticated ontology of its object domain, comprising some 1.5 million statements of anatomical relations among some 70,000 anatomical kinds. It is further distinguished from other maps in that it represents not some specific portion of spatial reality (say: Leeds in 1996), but rather the generalized or idealized spatial reality associated with a generalized or idealized human being at some generalized or idealized instant of time. It will be our concern in what follows to outline the approach to ontology that is represented by the FMA and to argue that it can serve as the basis for a new type of anatomical information science. We also draw some implications for our understanding of spatial reasoning and spatial ontologies in general.
In the recent past, Vietnam has dramatically increased its investment relationship and trade with the United States. At the same time, United States foreign direct investment and trade with China has been decreasing. This is even more significant when we are in a period of internal growth within the United States. Using comparative business system analysis theory and a mixed method approach, we conclude that Vietnam is turning into the new China for United States firms due to the fewer differences that exist between their business systems. The Chinese business system has major differences when compared with the economic system of the United States, whereas the Vietnamese system more closely resembles the United States system. We lay out the implications of our arguments for future research, particularly in the area of institutional comparative advantage.
The fourteen papers in this collection offer a variety of original contributions to the epistemology of modality. In seeking to explain how we might account for our knowledge of possibility and necessity, they raise some novel questions, develop some unfamiliar theoretical perspectives, and make some intriguing proposals. Collectively, they advance our understanding of the field. In Part I of this Introduction, I give some general background about the contemporary literature in the area, by sketching a timeline of the main tendencies of the past twenty-five years or so, up to the present debates. Next, I focus on four features that largely characterize the latest literature, and the papers in the present collection in particular: (i) an endorsement of the importance of essentialism; (ii) a shift to a “metaphysics-first” approach to modal epistemology; (iii) a focus on metaphysical modality as opposed to other kinds of modality; and (iv) a preference for non-uniform modal epistemology. In Part II, I present the individual papers in the volume. These are organized around the following four chapters, based on their topic: (A) Skepticism & Deflationism; (B) Essentialism; (C) Non-Essentialist Accounts; (D) Applications.

LIST OF CONTRIBUTORS: Francesco Berto; Stephen Biggs & Jessica Wilson; Justin Clarke-Doane; Philip Goff; Bob Hale; Frank Jackson; Mark Jago; Boris Kment; Antonella Mallozzi; Graham Priest; Gabriel Rabin; Amie Thomasson; Anand Vaidya & Michael Wallner; Jennifer Wang.

The volume is dedicated to the memory of Bob Hale.