Direct Output Connection for a High-Rank Language Model

August 30, 2018 · Entered Twilight · 🏛 Conference on Empirical Methods in Natural Language Processing

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 7.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: LICENSE, NTT_LICENSE, README.md, cal_ppl.py, data.py, embed_regularize.py, finetune.py, generate.py, get_data.sh, locked_dropout.py, main.py, model.py, utils.py, weight_drop.py

Authors: Sho Takase, Jun Suzuki, Masaaki Nagata
arXiv ID: 1808.10143
Category: cs.CL: Computation & Language
Citations: 37
Venue: Conference on Empirical Methods in Natural Language Processing
Repository: https://github.com/nttcslab-nlp/doc_lm ⭐ 12
Last Checked: 1 month ago
Abstract
This paper proposes a state-of-the-art recurrent neural network (RNN) language model that combines probability distributions computed not only from the final RNN layer but also from middle layers. The proposed method raises the expressive power of the language model, building on the matrix factorization interpretation of language modeling introduced by Yang et al. (2018). It improves on the current state-of-the-art language model and achieves the best scores on the standard Penn Treebank and WikiText-2 benchmarks. Moreover, we show that the proposed method also contributes to two application tasks: machine translation and headline generation. Our code is publicly available at: https://github.com/nttcslab-nlp/doc_lm.
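
For orientation, here is a minimal sketch of the idea the abstract describes, assuming a PyTorch-style stacked LSTM: each layer's hidden state is projected to a softmax over the vocabulary, and the per-layer distributions are mixed with learned weights, following the mixture-of-softmaxes view of Yang et al. (2018). This is not the authors' implementation (see model.py in the repository for that); the class name DirectOutputConnectionSketch, the prior and proj modules, and the single-softmax-per-layer simplification are illustrative assumptions, and regularization and weight tying are omitted.

```python
# Minimal sketch (illustrative, not the authors' code): mix output
# distributions computed from the final AND middle RNN layers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DirectOutputConnectionSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Stacked LSTMs, kept separate so every layer's outputs can be tapped.
        self.rnns = nn.ModuleList(
            [nn.LSTM(emb_dim if i == 0 else hidden_dim, hidden_dim, batch_first=True)
             for i in range(num_layers)]
        )
        # One projection back to the embedding space per layer (hypothetical naming).
        self.proj = nn.ModuleList([nn.Linear(hidden_dim, emb_dim) for _ in range(num_layers)])
        # Mixture weights over layers, computed from the final layer's state.
        self.prior = nn.Linear(hidden_dim, num_layers)
        self.decoder = nn.Linear(emb_dim, vocab_size)  # tied with self.embed in practice

    def forward(self, tokens):
        x = self.embed(tokens)                      # (batch, seq, emb_dim)
        layer_outputs = []
        for rnn in self.rnns:
            x, _ = rnn(x)
            layer_outputs.append(x)                 # keep every layer, not just the last
        # A softmax distribution over the vocabulary from each layer.
        dists = [F.softmax(self.decoder(torch.tanh(p(h))), dim=-1)
                 for p, h in zip(self.proj, layer_outputs)]
        # Mixture weights from the final layer's hidden state.
        weights = F.softmax(self.prior(layer_outputs[-1]), dim=-1)   # (batch, seq, num_layers)
        # Weighted sum of the per-layer distributions.
        probs = sum(w.unsqueeze(-1) * d
                    for w, d in zip(weights.unbind(-1), dists))
        return torch.log(probs + 1e-8)              # log-probabilities over the vocabulary
```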
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt: Computation & Language

🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL ๐Ÿ› NeurIPS ๐Ÿ“š 166.0K cites 8 years ago