  • Probabilistic syntax. Christopher Manning - manuscript
    “Everyone knows that language is variable.” This is the bald sentence with which Sapir (1921:147) begins his chapter on language as an historical product. He goes on to emphasize how two speakers’ usage is bound to differ “in choice of words, in sentence structure, in the relative frequency with which particular forms or combinations of words are used”. I should add that much sociolinguistic and historical linguistic research has shown that the same speaker’s usage is also variable (Labov 1966, Kroch (...)
    8 citations
  • Combining Machine Learning and Semantic Features in the Classification of Corporate Disclosures. Stefan Evert, Philipp Heinrich, Klaus Henselmann, Ulrich Rabenstein, Elisabeth Scherr, Martin Schmitt & Lutz Schröder - 2019 - Journal of Logic, Language and Information 28 (2):309-330.
    We investigate an approach to improving statistical text classification by combining machine learners with an ontology-based identification of domain-specific topic categories. We apply this approach to ad hoc disclosures by public companies. This form of obligatory publicity concerns all information that might affect the stock price; relevant topic categories are governed by stringent regulations. Our goal is to classify disclosures according to their effect on stock prices (negative, neutral, positive). In the study reported here, we combine natural language parsing with (...)
  • Using Shakespeare's Sotto Voce to Determine True Identity From Text. David Kernot, Terry Bossomaier & Roger Bradbury - 2018 - Frontiers in Psychology 9.
  • Feature-rich part-of-speech tagging with a cyclic dependency network. Christopher Manning - manuscript
    We present a new part-of-speech tagger that demonstrates the following ideas: (i) explicit use of both preceding and following tag contexts (...). In a first-order HMM, the current tag t0 is predicted based on the previous tag t−1 (and the current word). The backward interaction between t0 and the next tag t+1 shows up implicitly later, when t+1 is generated in turn. While unidirectional models are therefore able to capture both (...)
    7 citations