Results for 'RAG'

  1. Barah Maha - The Changing Phases of Nature. Devinder Pal Singh - 2020 - The Sikh Review 68 (10):9-15.
    Barah Maha (Twelve Months) is a form of folk poetry that describes the emotions and yearnings of the human heart, expressed in terms of the changing moods of nature over the twelve months of a year. In this form of poetry, the mood of nature in each particular month of the Indian calendar depicts the inner agony of the human heart, which in most cases is that of a lovelorn young woman separated from her spouse or beloved. In other words, (...)
  2. Sideloading: Creating A Model of a Person via LLM with Very Large Prompt. Alexey Turchin & Roman Sitelew - manuscript
    Sideloading is the creation of a digital model of a person during their life via iterative improvements of this model based on the person's feedback. The progress of LLMs with large prompts allows the creation of very large, book-size prompts which describe a personality. We call mind-models created via sideloading "sideloads"; they often look like chatbots, but they are more than that, as they have other output channels, such as internal thought streams and descriptions of actions. By arranging the (...)
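    The abstract describes a sideload as a chatbot driven by a book-size personality prompt, improved iteratively from the person's feedback, with extra output channels for internal thoughts and actions. A minimal sketch of that loop follows, assuming a hypothetical `llm_complete(prompt)` stand-in for any large-context LLM API; the file name, channel tags, and feedback mechanism are illustrative assumptions, not details from the paper:

    ```python
    # Sketch of a sideload-style chat loop. Assumptions: llm_complete is a
    # hypothetical stand-in for any large-context LLM call, and
    # personality.txt holds the book-size prompt describing the person.
    from pathlib import Path

    def llm_complete(prompt: str) -> str:
        """Hypothetical stand-in: plug in a real LLM client here."""
        raise NotImplementedError

    # The "very large prompt": a book-size description of the personality.
    PERSONALITY = Path("personality.txt").read_text(encoding="utf-8")

    # Tagged channels beyond a plain chat reply: internal thoughts and actions.
    TEMPLATE = """{personality}

    You are a sideload of the person described above.
    Answer in three tagged channels:
    [THOUGHT] your internal reasoning (kept private in a real interface)
    [ACTION] any action you would take
    [REPLY] what you say to the user

    Conversation so far:
    {history}
    User: {user_message}
    """

    def sideload_turn(history: list[str], user_message: str) -> str:
        """One conversational turn of the sideload."""
        prompt = TEMPLATE.format(
            personality=PERSONALITY,
            history="\n".join(history),
            user_message=user_message,
        )
        reply = llm_complete(prompt)
        history.append(f"User: {user_message}")
        history.append(f"Sideload: {reply}")
        return reply

    def incorporate_feedback(feedback: str) -> None:
        """Iterative improvement: append the person's corrections to the
        personality prompt, so the model grows with each round of feedback."""
        with open("personality.txt", "a", encoding="utf-8") as f:
            f.write(f"\nCorrection from the original person: {feedback}\n")
    ```

    The design choice here mirrors the abstract's claim that sideloads are "more than chatbots": the extra channels come purely from prompt structure, while the feedback step implements the iterative refinement of the very large prompt.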
  3. Unjustified untrue "beliefs": AI hallucinations and justification logics. Kristina Šekrst - forthcoming - In Kordula Świętorzecka, Filip Grgić & Anna Brozek (eds.), Logic, Knowledge, and Tradition. Essays in Honor of Srecko Kovac.
    In artificial intelligence (AI), responses generated by machine-learning models (most often large language models) may present unfactual information as fact. For example, a chatbot might state that the Mona Lisa was painted in 1815. This phenomenon is called AI hallucination, a term inspired by human psychology, with the key difference that AI hallucinations stem from unjustified beliefs (that is, AI "beliefs") rather than perceptual failures. AI hallucinations may have their source in the data itself, that is, the (...)