Dependency Parsing as Head Selection

June 03, 2016 · Entered Twilight · 🏛 Conference of the European Chapter of the Association for Computational Linguistics

🌅 TWILIGHT: Old Age
Predates the code-sharing era, a pioneer of its time

"Last commit was 8.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, LICENSE, README.md, conllx_scripts, dataiter, dense_parser.lua, experiments, graph_alg, init.lua, layers, model_opts.lua, mst_postprocess.lua, nnets, post_train.lua, train.lua, train_labeled.lua, utils

Authors: Xingxing Zhang, Jianpeng Cheng, Mirella Lapata
arXiv ID: 1606.01280
Category: cs.CL: Computation & Language
Cross-listed: cs.LG
Citations: 95
Venue: Conference of the European Chapter of the Association for Computational Linguistics
Repository: https://github.com/XingxingZhang/dense_parser ⭐ 25
Last Checked: 1 month ago
Abstract
Conventional graph-based dependency parsers guarantee a tree structure both during training and inference. Instead, we formalize dependency parsing as the problem of independently selecting the head of each word in a sentence. Our model, which we call DeNSe (shorthand for Dependency Neural Selection), produces a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network. Without enforcing structural constraints during training, DeNSe generates (at inference time) trees for the overwhelming majority of sentences, while non-tree outputs can be adjusted with a maximum spanning tree algorithm. We evaluate DeNSe on four languages (English, Chinese, Czech, and German) with varying degrees of non-projectivity. Despite the simplicity of the approach, our parsers are on par with the state of the art.
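The abstract's core idea can be sketched in a few lines: pick each word's head independently by argmax over head scores, then check whether the result is a tree. This is a minimal illustration, not the paper's implementation (the repository is Lua/Torch, and the real scores come from a bidirectional RNN); the matrix layout, function names, and indexing here are assumptions made for the example.

```python
import numpy as np

def select_heads(scores):
    """Independently pick the best head for every word.

    scores[d, h] is a (hypothetical) model score for word d (1..n)
    attaching to head h, where h = 0 denotes the artificial ROOT.
    Returns an array heads with heads[d] = chosen head of word d.
    """
    n = scores.shape[0] - 1
    heads = np.zeros(n + 1, dtype=int)
    for d in range(1, n + 1):
        s = scores[d].copy()
        s[d] = -np.inf          # a word cannot be its own head
        heads[d] = int(np.argmax(s))
    return heads

def is_tree(heads):
    """True iff every word reaches ROOT (index 0) without hitting a cycle."""
    n = len(heads) - 1
    for d in range(1, n + 1):
        seen, cur = set(), d
        while cur != 0:
            if cur in seen:     # revisited a node: cycle, not a tree
                return False
            seen.add(cur)
            cur = heads[cur]
    return True
```

When `is_tree` fails, the paper's remedy is to rerun decoding with a maximum spanning tree algorithm (e.g. Chu-Liu/Edmonds) over the same head scores, which the sketch above leaves out.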
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Computation & Language

🌅 🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL ๐Ÿ› NeurIPS ๐Ÿ“š 166.0K cites 8 years ago