R.I.P.
👻
Ghosted
Subgraph Pooling: Tackling Negative Transfer on Graphs
February 14, 2024 · Entered Twilight · International Joint Conference on Artificial Intelligence
Repo contents: .gitignore, aggr.py, data_utils.py, eval.py, main.py, main_elliptic.py, main_multi.py, model.py, params, pre_data, readme.md, requirements.txt, script, utils.py
Authors
Zehong Wang, Zheyuan Zhang, Chuxu Zhang, Yanfang Ye
arXiv ID
2402.08907
Category
cs.LG: Machine Learning
Cross-listed
cs.AI, cs.SI
Citations
19
Venue
International Joint Conference on Artificial Intelligence
Repository
https://github.com/Zehong-Wang/Subgraph-Pooling
⭐ 3
Last Checked
1 month ago
Abstract
Transfer learning aims to enhance performance on a target task by using knowledge from related tasks. However, when the source and target tasks are not closely aligned, transfer can reduce performance, a phenomenon known as negative transfer. Unlike in image or text data, we find that negative transfer commonly occurs in graph-structured data, even when source and target graphs share semantic similarities. Specifically, we identify that structural differences significantly amplify the dissimilarities in node embeddings across graphs. To mitigate this, we present a new insight in this paper: for semantically similar graphs, although structural differences lead to significant distribution shift in node embeddings, their impact on subgraph embeddings could be marginal. Building on this insight, we introduce Subgraph Pooling (SP), which aggregates nodes sampled from a k-hop neighborhood, and Subgraph Pooling++ (SP++), which samples nodes by random walk, to mitigate the impact of graph structural differences on knowledge transfer. We theoretically analyze the role of SP in reducing graph discrepancy and conduct extensive experiments to evaluate its superiority under various settings. The proposed SP methods are effective yet elegant and can be easily applied on top of any backbone Graph Neural Network (GNN). Our code and data are available at: https://github.com/Zehong-Wang/Subgraph-Pooling.
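The pooling idea in the abstract can be sketched in a few lines: gather a node's k-hop neighborhood (SP) or the nodes visited by random walks (SP++), then aggregate their embeddings, e.g. by a mean. The sketch below is illustrative only and assumes mean aggregation over a plain adjacency-list graph; function names are our own, not the repository's API.

```python
import numpy as np

def k_hop_neighborhood(adj, node, k):
    """Collect nodes within k hops of `node`, including itself.
    `adj` maps each node id to a list of neighbor ids."""
    frontier = {node}
    visited = {node}
    for _ in range(k):
        frontier = {nbr for u in frontier for nbr in adj[u]} - visited
        visited |= frontier
    return sorted(visited)

def subgraph_pooling(embeddings, adj, node, k=2):
    """SP (sketch): replace a node's embedding with the mean of the
    embeddings in its k-hop subgraph."""
    members = k_hop_neighborhood(adj, node, k)
    return embeddings[members].mean(axis=0)

def subgraph_pooling_rw(embeddings, adj, node, walk_len=4, n_walks=10, rng=None):
    """SP++ (sketch): pool over nodes visited by random walks from `node`."""
    rng = rng or np.random.default_rng(0)
    visited = {node}
    for _ in range(n_walks):
        cur = node
        for _ in range(walk_len):
            nbrs = adj[cur]
            if not nbrs:
                break
            cur = nbrs[rng.integers(len(nbrs))]
            visited.add(cur)
    return embeddings[sorted(visited)].mean(axis=0)
```

Because the pooling acts only on already-computed node embeddings, a step like this can sit on top of any GNN backbone's output, which is the plug-and-play property the abstract claims.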
Similar Papers
In the same crypt · Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms