  • A Modal Logic for Supervised Learning. Alexandru Baltag, Dazhu Li & Mina Young Pedersen - 2022 - Journal of Logic, Language and Information 31 (2):213-234.
    Formal learning theory formalizes the process of inferring a general result from examples, as in the case of inferring grammars from sentences when learning a language. In this work, we develop a general framework—the supervised learning game—to investigate the interaction between Teacher and Learner. In particular, our proposal highlights several interesting features of the agents: on the one hand, Learner may make mistakes in the learning process, and she may also ignore the potential relation between different hypotheses; on the other (...)
  • Completeness results for memory logics. Carlos Areces, Santiago Figueira & Sergio Mera - 2012 - Annals of Pure and Applied Logic 163 (7):961-972.
  • On the Comparisons of Logics in Terms of Expressive Power. Diego Pinheiro Fernandes - 2023 - Manuscrito 46 (4):2022-0054.
    This paper investigates the question “when is a logic more expressive than another?” In order to approach it, “logic” is understood in the model-theoretic sense and, contrary to other proposals in the literature, it is argued that relative expressiveness between logics is best framed with respect to the notion of expressing properties of models, a notion that can be captured precisely in various ways. It is shown that each precise rendering can give rise to a formal condition for relative expressiveness (...)
  • Expressive Power of “Now” and “Then” Operators. Igor Yanovich - 2015 - Journal of Logic, Language and Information 24 (1):65-93.
    Natural language provides motivation for studying modal backwards-looking operators such as “now”, “then” and “actually” that evaluate their argument formula at some previously considered point instead of the current one. This paper investigates the expressive power over models of both propositional and first-order basic modal language enriched with such operators. Having defined an appropriate notion of bisimulation for first-order modal logic, I show that backwards-looking operators increase its expressive power quite mildly, contrary to beliefs widespread among philosophers of language and (...)
  • (1 other version) Extending ALCQ with Bounded Self-Reference. Daniel Gorín & Lutz Schröder - 1998 - In Marcus Kracht, Maarten de Rijke, Heinrich Wansing & Michael Zakharyaschev (eds.), Advances in Modal Logic. CSLI Publications. pp. 300-316.