Universal Graph Transformer Self-Attention Networks

September 26, 2019 · Declared Dead · The Web Conference

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Dai Quoc Nguyen, Tu Dinh Nguyen, Dinh Phung
arXiv ID: 1909.11855
Category: cs.LG (Machine Learning)
Cross-listed: cs.CV, stat.ML
Citations: 85
Venue: The Web Conference
Repository: https://github.com/daiquocnguyen/Graph-Transformer
Last checked: 1 month ago
Abstract
We introduce a transformer-based GNN model, named UGformer, to learn graph representations. In particular, we present two UGformer variants: the first (publicized in September 2019) leverages the transformer on a set of sampled neighbors for each input node, while the second (publicized in May 2021) leverages the transformer on all input nodes. Experimental results demonstrate that the first UGformer variant achieves state-of-the-art accuracies on benchmark datasets for graph classification in both the inductive setting and the unsupervised transductive setting, and the second UGformer variant obtains state-of-the-art accuracies for inductive text classification. The code is available at: https://github.com/daiquocnguyen/Graph-Transformer.
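
Since the repository is unreachable, the sketch below illustrates what the abstract describes for the first UGformer variant: a transformer encoder applied, for each node, to a short sequence made of the node itself plus a set of sampled neighbors. This is a minimal PyTorch sketch under our own assumptions, not the authors' implementation; the class name UGformerLayerSketch, the uniform neighbor sampling, and all layer sizes are illustrative choices.

# Minimal sketch (not the authors' code) of the first UGformer variant:
# per-node self-attention over the node and a set of sampled neighbors.
import torch
import torch.nn as nn


class UGformerLayerSketch(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, num_sampled_neighbors: int = 4):
        super().__init__()
        self.num_sampled_neighbors = num_sampled_neighbors
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=1)

    def forward(self, x: torch.Tensor, adjacency: list) -> torch.Tensor:
        # x: (num_nodes, dim) node features; adjacency[i]: neighbor indices of node i.
        outputs = []
        for node, neighbors in enumerate(adjacency):
            if neighbors:
                # Sample neighbors uniformly with replacement (an assumption).
                pick = torch.randint(len(neighbors), (self.num_sampled_neighbors,))
                idx = torch.tensor(neighbors)[pick]
            else:
                idx = torch.tensor([node])  # isolated node: attend only to itself
            # Sequence = the node itself followed by its sampled neighbors.
            seq = torch.cat([x[node].unsqueeze(0), x[idx]], dim=0).unsqueeze(0)  # (1, 1+k, dim)
            encoded = self.encoder(seq)                                          # (1, 1+k, dim)
            outputs.append(encoded[0, 0])  # keep the representation at the node's position
        return torch.stack(outputs)        # (num_nodes, dim)


if __name__ == "__main__":
    feats = torch.randn(5, 16)
    adj = [[1, 2], [0, 2, 3], [0, 1], [1, 4], [3]]
    layer = UGformerLayerSketch(dim=16)
    print(layer(feats, adj).shape)  # torch.Size([5, 16])

The second variant would instead run the transformer encoder once over all nodes of the input graph; extending the sketch amounts to feeding the full feature matrix to the encoder rather than per-node neighbor sequences.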
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning

Died the same way — 💀 404 Not Found