An Unsupervised Autoregressive Model for Speech Representation Learning
April 05, 2019 · Entered Twilight · Interspeech
"Last commit was 6.0 years ago (โฅ5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: README.md, apc_model.py, datasets.py, load_pretrained_model.py, prepare_data.py, train_apc.py, utils.py
Authors
Yu-An Chung, Wei-Ning Hsu, Hao Tang, James Glass
arXiv ID
1904.03240
Category
cs.CL: Computation & Language
Cross-listed
cs.LG, cs.SD, eess.AS
Citations
425
Venue
Interspeech
Repository
https://github.com/iamyuanchung/Autoregressive-Predictive-Coding
⭐ 189
Last Checked
1 month ago
Abstract
This paper proposes a novel unsupervised autoregressive neural model for learning generic speech representations. In contrast to other speech representation learning methods that aim to remove noise or speaker variabilities, ours is designed to preserve information for a wide range of downstream tasks. In addition, the proposed model does not require any phonetic or word boundary labels, allowing the model to benefit from large quantities of unlabeled data. Speech representations learned by our model significantly improve performance on both phone classification and speaker verification over the surface features and other supervised and unsupervised approaches. Further analysis shows that different levels of speech information are captured by our model at different layers. In particular, the lower layers tend to be more discriminative for speakers, while the upper layers provide more phonetic content.
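The objective the abstract describes is autoregressive predictive coding: given past frames of a surface feature such as a log Mel spectrogram, the model predicts a frame several steps ahead, and its hidden states become the learned representations. Below is a minimal PyTorch sketch of that idea; the class name, layer sizes, and the 3-step prediction shift are illustrative assumptions, not the actual implementation in the repository's apc_model.py.

import torch
import torch.nn as nn

class APCSketch(nn.Module):
    """Illustrative autoregressive predictive coding model (not the repo's API)."""
    def __init__(self, n_mels=80, hidden=512, layers=3):
        super().__init__()
        # Multi-layer unidirectional RNN over surface frames. Per the abstract,
        # lower layers tend to be more speaker-discriminative, upper layers
        # more phonetic.
        self.rnn = nn.LSTM(n_mels, hidden, num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, n_mels)  # regress a future frame

    def forward(self, x, shift=3):
        # Predict the frame `shift` steps ahead from everything seen so far;
        # no phonetic or word boundary labels are needed.
        h, _ = self.rnn(x)
        pred = self.head(h[:, :-shift])  # predictions from past context only
        target = x[:, shift:]            # future surface frames as targets
        return nn.functional.l1_loss(pred, target), h

# Hypothetical usage: train on unlabeled speech, then reuse the hidden
# states `reps` as features for downstream tasks such as phone
# classification or speaker verification.
model = APCSketch()
mels = torch.randn(4, 200, 80)  # dummy batch: 4 utterances, 200 frames each
loss, reps = model(mels)
loss.backward()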
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. · 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. · 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. · 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. · 👻 Ghosted