A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations
April 26, 2017 · Entered Twilight · Annual Meeting of the Association for Computational Linguistics
"Last commit was 8.0 years ago (β₯5 year threshold)"
Evidence collected by the PWNC Scanner
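The staleness rule behind this evidence is a simple date comparison. A minimal sketch of such a check, assuming a `last_commit` datetime already fetched from the GitHub API; the 5-year constant and the function names are illustrative, not the PWNC Scanner's actual code:

```python
from datetime import datetime, timezone

THRESHOLD_YEARS = 5  # repos with no commits for >= 5 years count as "twilight"

def years_since(last_commit: datetime, now: datetime | None = None) -> float:
    """Age of the most recent commit in fractional years."""
    now = now or datetime.now(timezone.utc)
    return (now - last_commit).days / 365.25

def entered_twilight(last_commit: datetime) -> bool:
    return years_since(last_commit) >= THRESHOLD_YEARS

# Example: a repo last touched in April 2017, checked in 2025, is roughly 8 years stale.
print(entered_twilight(datetime(2017, 4, 26, tzinfo=timezone.utc)))  # True
```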
Repo contents: FarrokhAttentionLayer.py, README.md, acl_poster.pdf, class_dist.py, classify.py, model7301.keras.gz, model7301.map.gz, resources.py, train.py, zh-gw300_intersect.w2v.gz
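The bundled `zh-gw300_intersect.w2v.gz` appears to be pre-trained 300-dimensional Chinese word vectors in word2vec format. A hedged sketch of loading them with gensim and building an embedding matrix for a Keras model; the choice of gensim and the text-format assumption are mine, not confirmed by the repository:

```python
import numpy as np
from gensim.models import KeyedVectors

# Assumption: text-format word2vec vectors; set binary=True if the file is binary.
vectors = KeyedVectors.load_word2vec_format("zh-gw300_intersect.w2v.gz", binary=False)

# Build a word index and an embedding matrix usable as Keras Embedding weights.
word_index = {w: i + 1 for i, w in enumerate(vectors.index_to_key)}  # index 0 reserved for padding
emb_matrix = np.zeros((len(word_index) + 1, vectors.vector_size), dtype="float32")
for word, idx in word_index.items():
    emb_matrix[idx] = vectors[word]

print(emb_matrix.shape)  # (vocab_size + 1, 300)
```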
Authors
Samuel RΓΆnnqvist, Niko Schenk, Christian Chiarcos
arXiv ID
1704.08092
Category
cs.CL: Computation & Language
Cross-listed
cs.AI, cs.LG, cs.NE
Citations
31
Venue
Annual Meeting of the Association for Computational Linguistics
Repository
https://github.com/sronnqvist/discourse-ablstm
⭐ 33
Last Checked
1 month ago
Abstract
We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence.
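The architecture described above (the two arguments joined into one token sequence, a bidirectional LSTM encoder, and an attention layer that pools the hidden states before classification) can be sketched in Keras roughly as follows. This is a minimal illustration, not the repository's `FarrokhAttentionLayer.py` or training code; the layer sizes, the number of relation classes, and the `AttentionPooling` layer are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

class AttentionPooling(layers.Layer):
    """Additive attention over time steps: scores each Bi-LSTM state and
    returns their weighted sum (a generic stand-in for the repo's custom layer)."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W = self.add_weight(shape=(d, d), initializer="glorot_uniform", name="W")
        self.b = self.add_weight(shape=(d,), initializer="zeros", name="b")
        self.u = self.add_weight(shape=(d, 1), initializer="glorot_uniform", name="u")

    def call(self, h):                                    # h: (batch, time, d)
        scores = tf.tanh(tf.tensordot(h, self.W, axes=1) + self.b)
        alpha = tf.nn.softmax(tf.tensordot(scores, self.u, axes=1), axis=1)  # (batch, time, 1)
        return tf.reduce_sum(alpha * h, axis=1)           # weighted sum over time: (batch, d)

VOCAB_SIZE, EMB_DIM, MAX_LEN, NUM_CLASSES = 20000, 300, 256, 10  # illustrative sizes

tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")    # arg1 and arg2 as one padded joint sequence
x = layers.Embedding(VOCAB_SIZE, EMB_DIM)(tokens)         # could be initialized from the zh-gw300 vectors
x = layers.Bidirectional(layers.LSTM(300, return_sequences=True))(x)
x = AttentionPooling()(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(tokens, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

The per-timestep weights `alpha` in the pooling layer are the quantities an attention visualization like the one mentioned in the abstract would plot over the joint argument sequence.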
Similar Papers
In the same crypt – Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
R.I.P. · 👻 Ghosted
Language Models are Few-Shot Learners
R.I.P. · 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach
R.I.P. · 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
R.I.P. · 👻 Ghosted