Feature Staleness Aware Incremental Learning for CTR Prediction

April 29, 2025 · Entered Twilight · 🏛 International Joint Conference on Artificial Intelligence

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: .gitignore, README.md, ablation1.jpg, code, data.zip, overview.jpg

Authors: Zhikai Wang, Yanyan Shen, Zibin Zhang, Kangyi Lin
arXiv ID: 2505.02844
Category: cs.IR (Information Retrieval)
Cross-listed: cs.LG
Citations: 2
Venue: International Joint Conference on Artificial Intelligence
Repository: https://github.com/cloudcatcher888/FeSAIL ⭐ 7
Last Checked: 1 month ago
Abstract
Click-Through Rate (CTR) prediction in real-world recommender systems often deals with billions of user interactions every day. To improve training efficiency, it is common to update the CTR prediction model incrementally using the new incremental data and a subset of historical data. However, the feature embeddings of a CTR prediction model often become stale when the corresponding features do not appear in the current incremental data. In the next period, the model degrades on samples containing stale features, which we call the feature staleness problem. To mitigate this problem, we propose a Feature Staleness Aware Incremental Learning method for CTR prediction (FeSAIL), which adaptively replays samples containing stale features. We first introduce a staleness-aware sampling algorithm (SAS) to sample a fixed number of stale samples with high sampling efficiency. We then introduce a staleness-aware regularization mechanism (SAR) for fine-grained control of feature embedding updates. We instantiate FeSAIL with a general deep learning-based CTR prediction model, and experimental results demonstrate that FeSAIL outperforms various state-of-the-art methods on four benchmark datasets.
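The replay idea in the abstract can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the paper's actual SAS algorithm: the function names (`update_staleness`, `sample_stale`) are invented here, and defining a sample's staleness as the maximum number of steps since any of its features was last seen is an assumption about one plausible staleness measure.

```python
def update_staleness(last_seen, step, batch_features):
    # Record the incremental step at which each feature was last observed.
    for f in batch_features:
        last_seen[f] = step

def staleness(sample_features, last_seen, step):
    # Assumed measure: steps since the sample's most-stale feature appeared.
    # Features never seen before default to step 0 (maximally stale).
    return max(step - last_seen.get(f, 0) for f in sample_features)

def sample_stale(history, last_seen, step, k):
    # Replay the k historical samples whose features are most stale,
    # so their embeddings receive gradient updates in the current period.
    return sorted(history, key=lambda s: -staleness(s, last_seen, step))[:k]
```

In this sketch the replayed samples would simply be appended to the incremental batch; SAR would then additionally weight how strongly each feature embedding is allowed to move, which is not modeled here.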
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Information Retrieval