Berkeley Neural Parser
Venue: Annual Meeting of the Association for Computational Linguistics
Languages: Arabic, Basque, Chinese, English, French, German, Hebrew, Hungarian, Korean, Polish, Swedish
Programming languages: Python
Project website: https://github.com/nikitakit/self-attentive-parser
We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead to improvements to a state-of-the-art discriminative constituency parser. The use of attention makes explicit how information is propagated between different locations in the sentence, a property we use both to analyze our model and to propose potential improvements.
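As an illustration of the mechanism the abstract describes, below is a minimal sketch of single-head scaled dot-product self-attention in NumPy. This is not the authors' implementation (the real parser at the project URL uses multi-head attention within a full encoder); the function and variable names here are illustrative. The returned weight matrix is what makes information flow between sentence positions explicit: row i gives the distribution over positions that position i attends to.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sentence.

    X: (seq_len, d_model) word representations.
    Returns (seq_len, d_k) context vectors and the (seq_len, seq_len)
    attention weights showing how each position attends to every other.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # similarity of each pair of positions
    weights = softmax(scores, axis=-1)        # each row is a distribution over positions
    return weights @ V, weights

# Toy example: a 5-word "sentence" with random embeddings and projections.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8
X = rng.standard_normal((seq_len, d_model))
Wq = rng.standard_normal((d_model, d_k))
Wk = rng.standard_normal((d_model, d_k))
Wv = rng.standard_normal((d_model, d_k))

out, attn = self_attention(X, Wq, Wk, Wv)
# Each row of `attn` sums to 1: every output vector is a weighted average
# over all sentence positions, so the propagation of information is explicit.
```

Unlike an LSTM, where information between distant words must pass through every intermediate hidden state, here any position can attend to any other in a single step, which is why the attention weights can be read off directly for analysis.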