
NLP-progress

Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.

Dependency parsing

Dependency parsing is the task of extracting a dependency parse of a sentence that represents its grammatical structure and defines the relationships between “head” words and the words that modify those heads.

Example:

```
     root
      |
      | +-------dobj---------+
      | |                    |
nsubj | |   +------det-----+ | +-----nmod------+
+--+  | |   |              | | |               |
|  |  | |   |      +-nmod-+| | |      +-case-+ |
+  |  + |   +      +      || + |      +      | |
I  prefer  the  morning   flight  through  Denver
```

Relations among the words are illustrated above the sentence with directed, labeled arcs from heads to dependents (+ indicates the dependent).
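To make the arc notation concrete, an off-the-shelf parser such as spaCy can produce these head/dependent relations for the example sentence. A minimal sketch (assuming the `en_core_web_sm` model is installed; spaCy's label set differs slightly from the Stanford scheme used below):

```python
import spacy

# Assumes the model was installed beforehand with:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("I prefer the morning flight through Denver")

# Every token carries a labeled arc to its syntactic head.
for token in doc:
    print(f"{token.text:>10}  --{token.dep_}-->  {token.head.text}")
```

The root token ("prefer" above) points to itself in spaCy and carries the `ROOT` label.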

Penn Treebank

Models are evaluated on the Stanford Dependency conversion (v3.3.0) of the Penn Treebank with predicted POS tags. Punctuation symbols are excluded from the evaluation. Evaluation metrics are unlabeled attachment score (UAS) and labeled attachment score (LAS); we also report each model's predicted POS tagging accuracy.
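For reference, both metrics reduce to token-level counting: a non-punctuation token scores an unlabeled hit if its predicted head matches the gold head, and a labeled hit if the relation label matches as well. A minimal sketch (the input representation and the PTB punctuation-tag check are assumptions of this sketch, not a fixed API):

```python
# PTB POS tags conventionally treated as punctuation and excluded.
PUNCT_TAGS = {"``", "''", ".", ",", ":"}

def attachment_scores(gold_sents, pred_sents):
    """UAS/LAS over sentences of (head_index, label, pos_tag) triples."""
    total = uas_hits = las_hits = 0
    for gold, pred in zip(gold_sents, pred_sents):
        for (g_head, g_label, pos), (p_head, p_label, _) in zip(gold, pred):
            if pos in PUNCT_TAGS:
                continue  # punctuation is excluded from evaluation
            total += 1
            if p_head == g_head:
                uas_hits += 1
                if p_label == g_label:
                    las_hits += 1
    return uas_hits / total, las_hits / total
```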

| Model | POS | UAS | LAS | Paper / Source | Code |
| ----- | --- | --- | --- | -------------- | ---- |
| Deep Biaffine by Dozat and Manning (2017) | 97.3 | 95.44 | 93.76 | Deep Biaffine Attention for Neural Dependency Parsing | Official |
| jPTDP by Nguyen and Verspoor (2018) | 97.97 | 94.51 | 92.87 | An improved neural network model for joint POS tagging and dependency parsing | Official |
| Andor et al. (2016) | 97.44 | 94.61 | 92.79 | Globally Normalized Transition-Based Neural Networks | |
| Distilled neural FOG by Kuncoro et al. (2016) | 97.3 | 94.26 | 92.06 | Distilling an Ensemble of Greedy Dependency Parsers into One MST Parser | |
| Weiss et al. (2015) | 97.44 | 93.99 | 92.05 | Structured Training for Neural Network Transition-Based Parsing | |
| BIST transition-based parser by Kiperwasser and Goldberg (2016) | 97.3 | 93.9 | 91.9 | Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations | Official |
| Arc-hybrid by Ballesteros et al. (2016) | 97.3 | 93.56 | 91.42 | Training with Exploration Improves a Greedy Stack-LSTM Parser | |
| BIST graph-based parser by Kiperwasser and Goldberg (2016) | 97.3 | 93.1 | 91.0 | Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations | Official |
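The table's top entry scores every candidate head for every dependent with a biaffine function of the two tokens' BiLSTM-plus-MLP representations. A minimal NumPy sketch of that arc scorer (dimensions and random inputs are illustrative; the real model learns U and w and falls back to an MST decoder when the greedy arcs form a cycle):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 7, 4  # sentence length and MLP output size (illustrative values)

# Per-token vectors in their "dependent" and "head" roles; in the paper
# these come from a BiLSTM followed by two separate MLPs.
H_dep = rng.standard_normal((n, d))
H_head = rng.standard_normal((n, d))

U = rng.standard_normal((d, d))  # bilinear weight
w = rng.standard_normal(d)       # head-only bias (a prior over likely heads)

# scores[i, j] = score of token j heading token i:
#   h_dep[i] @ U @ h_head[j] + w @ h_head[j]
scores = H_dep @ U @ H_head.T + H_head @ w

# Greedy decoding: each token picks its highest-scoring head.
pred_heads = scores.argmax(axis=1)
print(pred_heads)
```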

The following results are for reference only:

| Model | POS | UAS | Comment | Paper / Source | Code |
| ----- | --- | --- | ------- | -------------- | ---- |
| Stack-only RNNG by Kuncoro et al. (2017) | | 95.8 | Constituent parser | What Do Recurrent Neural Network Grammars Learn About Syntax? | |
| Semi-supervised LSTM-LM by Choe and Charniak (2016) | | 95.9 | Constituent parser | Parsing as Language Modeling | |
| Deep Biaffine by Dozat and Manning (2017) | 97.3 | 95.66 | Stanford conversion **v3.5.0** | Deep Biaffine Attention for Neural Dependency Parsing | Official |

Unsupervised dependency parsing

Unsupervised dependency parsing is the task of inferring the dependency parse of sentences without any labeled training data.

Penn Treebank

As with supervised parsing, models are evaluated against the Penn Treebank. The most common evaluation setup is to use gold POS tags as input and to evaluate systems using the unlabeled attachment score (also called “directed dependency accuracy”).

| Model | UAS | Paper / Source |
| ----- | --- | -------------- |
| Iterative reranking by Le & Zuidema (2015) | 66.2 | Unsupervised Dependency Parsing - Let’s Use Supervised Parsers |
| Combined System by Spitkovsky et al. (2013) | 64.4 | Breaking Out of Local Optima with Count Transforms and Model Recombination - A Study in Grammar Induction |
| Tree Substitution Grammar DMV by Blunsom & Cohn (2010) | 55.7 | Unsupervised Induction of Tree Substitution Grammars for Dependency Parsing |
| Shared Logistic Normal DMV by Cohen & Smith (2009) | 41.4 | Shared Logistic Normal Distributions for Soft Parameter Tying in Unsupervised Grammar Induction |
| DMV by Klein & Manning (2004) | 35.9 | Corpus-Based Induction of Syntactic Structure - Models of Dependency and Constituency |

Go back to the README