A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations

April 26, 2017 · Entered Twilight · 🏛 Annual Meeting of the Association for Computational Linguistics

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 8.0 years ago (β‰₯5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: FarrokhAttentionLayer.py, README.md, acl_poster.pdf, class_dist.py, classify.py, model7301.keras.gz, model7301.map.gz, resources.py, train.py, zh-gw300_intersect.w2v.gz
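The listing above names the artifacts but not their formats. As a hedged illustration only, the sketch below assumes zh-gw300_intersect.w2v.gz is a standard word2vec-format embedding file, which gensim can read directly from gzip; the probe word is hypothetical.

```python
# Illustrative sketch (an assumption, not documented in the repo listing) of
# inspecting the bundled embeddings, assuming a standard word2vec text format.
from gensim.models import KeyedVectors

# binary=False assumes a text-format file; use binary=True if it is binary.
vectors = KeyedVectors.load_word2vec_format("zh-gw300_intersect.w2v.gz",
                                            binary=False)
print(vectors.vector_size)                    # presumably 300, per "gw300"
print(vectors.most_similar("因为", topn=5))   # hypothetical probe word ("because")
```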

Authors: Samuel Rönnqvist, Niko Schenk, Christian Chiarcos
arXiv ID: 1704.08092
Category: cs.CL: Computation & Language
Cross-listed: cs.AI, cs.LG, cs.NE
Citations: 31
Venue: Annual Meeting of the Association for Computational Linguistics
Repository: https://github.com/sronnqvist/discourse-ablstm ⭐ 33
Last Checked: 1 month ago
Abstract
We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence.
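The model code itself is not reproduced on this page. As a rough sketch of the architecture the abstract describes (a bidirectional LSTM reading arg1 and arg2 as one joint token sequence, pooled by an attention layer), the following Keras snippet may help; every size and name here (vocabulary, embedding dimension, layer width, class count, AttentionPooling) is an illustrative assumption rather than the authors' configuration.

```python
# Illustrative sketch only: an attention-pooled bidirectional LSTM over the
# joined argument pair. All sizes are assumptions, not reported hyperparameters.
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20000    # assumed vocabulary size
EMBED_DIM = 300       # assumed, to match 300-d Chinese word2vec embeddings
MAX_LEN = 256         # assumed maximum length of the joint arg1+arg2 sequence
NUM_CLASSES = 10      # assumed number of discourse relation senses

class AttentionPooling(layers.Layer):
    """Additive attention that pools LSTM states into one sequence vector."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W = self.add_weight(name="W", shape=(d, d),
                                 initializer="glorot_uniform")
        self.u = self.add_weight(name="u", shape=(d, 1),
                                 initializer="glorot_uniform")

    def call(self, h):                                    # h: (batch, time, d)
        scores = tf.tensordot(tf.tanh(tf.tensordot(h, self.W, axes=1)),
                              self.u, axes=1)             # (batch, time, 1)
        alpha = tf.nn.softmax(scores, axis=1)             # attention weights
        return tf.reduce_sum(alpha * h, axis=1)           # (batch, d)

tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")    # arg1 + arg2, joined
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(tokens)
x = layers.Bidirectional(layers.LSTM(150, return_sequences=True))(x)
x = AttentionPooling()(x)
preds = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = Model(tokens, preds)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Exposing the attention weights (alpha) as a second output would support the kind of attention visualization the abstract mentions; the partial sampling scheme is omitted here because the listing gives no detail on how it works.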
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Computation & Language

🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL πŸ› NeurIPS πŸ“š 166.0K cites 8 years ago