Better Long-Range Dependency By Bootstrapping A Mutual Information Regularizer

May 28, 2019 · Entered Twilight · 🏛 International Conference on Artificial Intelligence and Statistics

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 5.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, LICENSE, README.md, checkpoint, config.py, finetune.py, get_data.sh, models, train_lm.py, utils

Authors: Yanshuai Cao, Peng Xu
arXiv ID: 1905.11978
Category: cs.LG: Machine Learning
Cross-listed: cs.CL, stat.ML
Citations: 2
Venue: International Conference on Artificial Intelligence and Statistics
Repository: https://github.com/BorealisAI/BMI (⭐ 5)
Last Checked: 1 month ago
Abstract
In this work, we develop a novel regularizer to improve the learning of long-range dependency in sequence data. Applied to language modelling, our regularizer expresses the inductive bias that sequence variables should have high mutual information, even though the model might not see abundant observations for complex long-range dependency. We show how the 'next sentence prediction (classification)' heuristic can be derived in a principled way from our mutual information estimation framework, and further extended to maximize the mutual information of sequence variables. The proposed approach is not only effective at increasing the mutual information of segments under the learned model but, more importantly, leads to a higher likelihood on holdout data and improved generation quality. Code is released at https://github.com/BorealisAI/BMI.
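To make the abstract's claim concrete, the sketch below illustrates one way a next-segment classifier yields a lower bound on the mutual information between consecutive text segments: a critic scores candidate (segment, next-segment) pairs, and a softmax cross-entropy over in-batch negatives gives the InfoNCE bound. This is a minimal, hypothetical PyTorch illustration; the names (`Critic`, `mi_lower_bound`), the bilinear critic, and the InfoNCE choice are assumptions for exposition and are not taken from the authors' BMI implementation.

```python
# Minimal, hypothetical sketch of a mutual-information regularizer for
# language modelling. All names are illustrative; see the linked BMI repo
# (https://github.com/BorealisAI/BMI) for the authors' actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Critic(nn.Module):
    """Bilinear critic f(x, y) scoring how plausibly segment y follows x."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Parameter(torch.randn(dim, dim) * dim ** -0.5)

    def forward(self, x, y):
        # x, y: (batch, dim) segment encodings -> (batch, batch) pair scores
        return x @ self.W @ y.t()

def mi_lower_bound(scores):
    """InfoNCE lower bound on I(X; Y): diagonal entries score the true
    consecutive pairs; off-diagonal entries act as in-batch negatives,
    turning MI estimation into a 'which segment comes next?' classifier."""
    labels = torch.arange(scores.size(0), device=scores.device)
    return -F.cross_entropy(scores, labels)

# Usage sketch: encodings of segment t and segment t+1 for a batch of texts
# (random tensors stand in for a real sentence encoder here).
batch, dim = 32, 128
critic = Critic(dim)
seg_t, seg_next = torch.randn(batch, dim), torch.randn(batch, dim)
mi_term = mi_lower_bound(critic(seg_t, seg_next))
lm_nll = torch.tensor(0.0)       # placeholder for the usual LM loss
loss = lm_nll - 0.1 * mi_term    # maximize the MI bound as a regularizer
```

Read this way, a 'next sentence prediction' classifier is one instance of MI-bound maximization: the classification cross-entropy over candidate next segments is, up to a constant, the negative of the bound being maximized.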

📜 Similar Papers

In the same crypt – Machine Learning