Learning to Make Predictions on Graphs with Autoencoders

February 23, 2018 · Entered Twilight · 🏛 International Conference on Data Science and Advanced Analytics

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 6.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, LICENSE, README.md, figure1.png, longae, neurips2018-poster.pdf

Authors: Phi Vu Tran
arXiv ID: 1802.08352
Category: cs.LG (Machine Learning)
Cross-listed: cs.AI, stat.ML
Citations: 58
Venue: International Conference on Data Science and Advanced Analytics
Repository: https://github.com/vuptran/graph-representation-learning (⭐ 255)
Last checked: 1 month ago
Abstract
We examine two fundamental tasks associated with graph representation learning: link prediction and semi-supervised node classification. We present a novel autoencoder architecture capable of learning a joint representation of both local graph structure and available node features for the multi-task learning of link prediction and node classification. Our autoencoder architecture is efficiently trained end-to-end in a single learning stage to simultaneously perform link prediction and node classification, whereas previous related methods require multiple training steps that are difficult to optimize. We provide a comprehensive empirical evaluation of our models on nine benchmark graph-structured datasets and demonstrate significant improvement over related methods for graph representation learning. Reference code and data are available at https://github.com/vuptran/graph-representation-learning
Community shame: Not yet rated
