Universal Dependency Parsing with a General Transition-Based DAG Parser

August 28, 2018 · Entered Twilight · 🏛 Conference on Computational Natural Language Learning

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 6.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .appveyor.yml, .travis.yml, LICENSE.txt, MANIFEST.in, README.md, ablation.txt, ablation_conll2018.sh, activate_conll2018.sh, activate_models_conll2018.sh, ci, count_enhanced_conll2018.sh, docs, eval_ablation_conll2018.sh, eval_conll2018.sh, eval_enhanced_conll2018.sh, experiments, grep_las_conll2018.sh, requirements.txt, rm_all_conll2018_parsed.sh, run_conll2018.sh, run_conll2018_tira.sh, server, setup.py, test_files, tests, train_2.3.sh, train_conll2018.sh, train_conll2018_delex.sh, train_conll2018_multitask.sh, train_conll2018_multitask_all.sh, tupa, ud-2.3.txt, udpipe.py, waiting.txt

Authors: Daniel Hershcovich, Omri Abend, Ari Rappoport
arXiv ID: 1808.09354
Category: cs.CL: Computation & Language
Citations: 10
Venue: Conference on Computational Natural Language Learning
Repository: https://github.com/CoNLL-UD-2018/HUJI ⭐ 1
Last Checked: 1 month ago
Abstract
This paper presents our experiments with applying TUPA to the CoNLL 2018 UD shared task. TUPA is a general neural transition-based DAG parser, which we use to present the first experiments on recovering enhanced dependencies as part of the general parsing task. TUPA was designed for parsing UCCA, a cross-linguistic semantic annotation scheme, exhibiting reentrancy, discontinuity and non-terminal nodes. By converting UD trees and graphs to a UCCA-like DAG format, we train TUPA almost without modification on the UD parsing task. The generic nature of our approach lends itself naturally to multitask learning. Our code is available at https://github.com/CoNLL-UD-2018/HUJI
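The abstract's central idea, converting UD trees and graphs (including enhanced dependencies) into a DAG format, can be illustrated with a minimal sketch. This is a hypothetical example of reading a CoNLL-U sentence and collecting basic (HEAD/DEPREL) and enhanced (DEPS) edges into one edge set; it is not TUPA's actual conversion code, and the sample sentence is invented:

```python
# Minimal sketch: collect basic and enhanced UD edges as one DAG.
# Hypothetical illustration; not the parser's actual converter.

def conllu_to_dag(lines):
    """Return sorted (head, label, child) edges from CoNLL-U lines.

    Basic edges come from the HEAD/DEPREL columns (7 and 8);
    enhanced edges come from the DEPS column (9), which may add
    extra incoming edges to a token, producing reentrancies.
    """
    edges = set()
    for line in lines:
        if not line or line.startswith("#"):
            continue
        cols = line.split("\t")
        tok_id = cols[0]
        if "-" in tok_id or "." in tok_id:
            continue  # skip multiword ranges and empty nodes in this sketch
        child = int(tok_id)
        edges.add((int(cols[6]), cols[7], child))  # basic tree edge
        if cols[8] != "_":
            for dep in cols[8].split("|"):         # enhanced edges
                head, label = dep.split(":", 1)
                edges.add((int(float(head)), label, child))
    return sorted(edges)

# Invented sentence: "the dog barked and ran", where the enhanced
# graph gives "dog" a second nsubj head ("ran") via conjunction.
sent = [
    "1\tthe\tthe\tDET\t_\t_\t2\tdet\t2:det\t_",
    "2\tdog\tdog\tNOUN\t_\t_\t3\tnsubj\t3:nsubj|5:nsubj\t_",
    "3\tbarked\tbark\tVERB\t_\t_\t0\troot\t0:root\t_",
    "4\tand\tand\tCCONJ\t_\t_\t5\tcc\t5:cc\t_",
    "5\tran\trun\tVERB\t_\t_\t3\tconj\t3:conj\t_",
]
dag = conllu_to_dag(sent)
```

Here "dog" ends up with two incoming nsubj edges (from "barked" and from "ran"), which is exactly the kind of reentrancy a tree parser cannot represent but a DAG parser can.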
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt: Computation & Language

🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL ๐Ÿ› NeurIPS ๐Ÿ“š 166.0K cites 8 years ago