Citations
  1. Dan Klein & Christopher D. Manning. A Generative Constituent-Context Model for Improved Grammar Induction.
     We present a generative distributional model for the unsupervised induction of natural language syntax which explicitly models constituent yields and contexts. Parameter search with EM produces higher quality analyses than previously exhibited by unsupervised systems, giving the best published unsupervised parsing results on the ATIS corpus. Experiments on Penn treebank sentences of comparable length show an even higher F1 of 71% on nontrivial brackets. We compare distributionally induced and actual part-of-speech tags as input data, and examine extensions to the basic (...)
  2. Dan Klein & Christopher D. Manning. Accurate Unlexicalized Parsing.
     We demonstrate that an unlexicalized PCFG can parse much more accurately than previously shown, by making use of simple, linguistically motivated state splits, which break down false independence assumptions latent in a vanilla treebank grammar. Indeed, its performance of 86.36% (LP/LR F1) is better than that of early lexicalized PCFG models, and surprisingly close to the current state-of-the-art. This result has potential uses beyond establishing a strong lower bound on the maximum possible accuracy of unlexicalized models: an unlexicalized PCFG is (...)