A Novel Neural Network Model for Joint POS Tagging and Graph-based Dependency Parsing
May 16, 2017 · Entered Twilight · Conference on Computational Natural Language Learning
"Last commit was 6.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: License.txt, README.md, decoder.py, jPTDP.py, learner.py, mnnl.py, sample, utils.py, utils
Authors
Dat Quoc Nguyen, Mark Dras, Mark Johnson
arXiv ID
1705.05952
Category
cs.CL: Computation & Language
Citations
38
Venue
Conference on Computational Natural Language Learning
Repository
https://github.com/datquocnguyen/jPTDP
⭐ 156
Last Checked
1 month ago
Abstract
We present a novel neural network model that learns POS tagging and graph-based dependency parsing jointly. Our model uses bidirectional LSTMs to learn feature representations shared for both POS tagging and dependency parsing tasks, thus handling the feature-engineering problem. Our extensive experiments, on 19 languages from the Universal Dependencies project, show that our model outperforms the state-of-the-art neural network-based Stack-propagation model for joint POS tagging and transition-based dependency parsing, resulting in a new state of the art. Our code is open-source and available together with pre-trained models at: https://github.com/datquocnguyen/jPTDP
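The abstract's key idea is a single bidirectional LSTM whose hidden states are shared between a POS-tagging head and a graph-based arc scorer, so both tasks learn from one feature extractor instead of hand-engineered features. A minimal sketch of that architecture is below; it is not the authors' implementation (the linked repository is written in DyNet and uses its own scoring layers), and the PyTorch module, layer sizes, and simple MLP arc scorer here are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class JointTaggerParser(nn.Module):
    """Toy joint model: one shared BiLSTM encoder feeding both a POS-tagging
    head and a graph-based arc scorer (hypothetical sizes, not the paper's)."""

    def __init__(self, vocab_size, n_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared BiLSTM: the same representations serve both tasks.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        # Task head 1: per-token POS tag scores.
        self.tag_head = nn.Linear(2 * hidden_dim, n_tags)
        # Task head 2: a simple MLP scoring every (head, dependent) word pair
        # (a stand-in for the paper's graph-based arc scorer).
        self.arc_mlp = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, word_ids):
        h, _ = self.bilstm(self.embed(word_ids))  # (batch, seq, 2*hidden)
        tag_logits = self.tag_head(h)             # (batch, seq, n_tags)
        b, n, d = h.shape
        # Pairwise concatenation of candidate head/dependent states
        # gives an n x n arc-score matrix per sentence.
        heads = h.unsqueeze(2).expand(b, n, n, d)
        deps = h.unsqueeze(1).expand(b, n, n, d)
        arc_scores = self.arc_mlp(torch.cat([heads, deps], dim=-1)).squeeze(-1)
        return tag_logits, arc_scores

model = JointTaggerParser(vocab_size=50, n_tags=17)
tags, arcs = model(torch.randint(0, 50, (2, 6)))
# shapes: tags (2, 6, 17), arcs (2, 6, 6)
```

At decoding time a model like this would pick the highest-scoring tag per token and feed the arc-score matrix to a maximum-spanning-tree decoder to obtain a well-formed dependency tree; training sums the two tasks' losses over the shared encoder.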
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
R.I.P.
👻
Ghosted
Language Models are Few-Shot Learners
R.I.P.
👻
Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach
R.I.P.
👻
Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
R.I.P.
👻
Ghosted