R.I.P.
👻
Ghosted
Longformer for MS MARCO Document Re-ranking Task
September 20, 2020 · Entered Twilight · Text Retrieval Conference
"Last commit was 5.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, README.md, data, src
Authors
Ivan SekuliΔ, Amir Soleimani, Mohammad Aliannejadi, Fabio Crestani
arXiv ID
2009.09392
Category
cs.IR: Information Retrieval
Citations
13
Venue
Text Retrieval Conference
Repository
https://github.com/isekulic/longformer-marco
⭐ 20
Last Checked
1 month ago
Abstract
Two-step document ranking, where the initial retrieval is done by a classical information retrieval method and followed by a neural re-ranking model, is the new standard. The best performance is achieved by using transformer-based models as re-rankers, e.g., BERT. We employ Longformer, a BERT-like model for long documents, on the MS MARCO document re-ranking task. The complete code used for training the model can be found at: https://github.com/isekulic/longformer-marco
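The abstract describes a two-stage pipeline: a cheap classical retriever narrows the corpus to a candidate pool, then an expensive neural model re-scores only that pool. A minimal sketch of that control flow, with a term-overlap stand-in for the classical retriever and a placeholder stub where a Longformer cross-encoder would score each query-document pair (both scorers are illustrative assumptions, not the paper's actual models):

```python
# Sketch of two-stage ranking: classical retrieval, then neural re-ranking.
# Both scoring functions below are toy stand-ins for illustration only.

def lexical_score(query: str, doc: str) -> float:
    # Stage 1 stand-in: raw term overlap (a real system would use BM25).
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return float(len(q_terms & d_terms))

def rerank_score(query: str, doc: str) -> float:
    # Stage 2 placeholder: a real system would feed the (query, doc) pair
    # to a Longformer cross-encoder and use its relevance logit here.
    return lexical_score(query, doc) / (1 + abs(len(doc.split()) - len(query.split())))

def two_stage_rank(query, docs, k=3, stage1=lexical_score, stage2=rerank_score):
    # Stage 1: keep only the top-k candidates under the cheap scorer.
    pool = sorted(docs, key=lambda d: stage1(query, d), reverse=True)[:k]
    # Stage 2: re-rank just the candidate pool with the expensive scorer.
    return sorted(pool, key=lambda d: stage2(query, d), reverse=True)
```

The key cost saving is that the re-ranker only ever sees `k` documents, regardless of corpus size.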
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Information Retrieval
R.I.P.
👻
Ghosted
LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation
R.I.P.
👻
Ghosted
Graph Convolutional Neural Networks for Web-Scale Recommender Systems
Old Age
Neural Graph Collaborative Filtering
R.I.P.
👻
Ghosted
Self-Attentive Sequential Recommendation
R.I.P.
👻
Ghosted