Cauchy-Schwarz Divergence Information Bottleneck for Regression

April 27, 2024 · Declared Dead · 🏛 International Conference on Learning Representations

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Shujian Yu, Xi Yu, Sigurd Løkse, Robert Jenssen, Jose C. Principe
arXiv ID: 2404.17951
Category: cs.LG: Machine Learning
Cross-listed: cs.IT, stat.ML
Citations: 12
Venue: International Conference on Learning Representations
Repository: https://github.com/SJYuCNEL/Cauchy-Schwarz-Information-Bottleneck
Last Checked: 1 month ago
Abstract
The information bottleneck (IB) approach is popular for improving the generalization, robustness, and explainability of deep neural networks. Essentially, it aims to find a minimum sufficient representation $\mathbf{t}$ by striking a trade-off between a compression term $I(\mathbf{x};\mathbf{t})$ and a prediction term $I(y;\mathbf{t})$, where $I(\cdot;\cdot)$ refers to the mutual information (MI). For the IB, MI is most often expressed in terms of the Kullback-Leibler (KL) divergence, which in the regression case corresponds to prediction based on mean squared error (MSE) loss under a Gaussian assumption, with compression approximated by variational inference. In this paper, we study the IB principle for the regression problem and develop a new way to parameterize the IB with deep neural networks by exploiting favorable properties of the Cauchy-Schwarz (CS) divergence. By doing so, we move away from MSE-based regression and ease estimation by avoiding variational approximations or distributional assumptions. We investigate the improved generalization ability of our proposed CS-IB and demonstrate strong adversarial robustness guarantees. We demonstrate its superior performance on six real-world regression tasks over other popular deep IB approaches. We additionally observe that the solutions discovered by CS-IB always achieve the best trade-off between prediction accuracy and compression ratio in the information plane. The code is available at \url{https://github.com/SJYuCNEL/Cauchy-Schwarz-Information-Bottleneck}.
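For readers unfamiliar with the divergence the abstract builds on: the CS divergence between densities $p$ and $q$ is $D_{CS}(p;q) = -\log\big(\langle p,q\rangle^2 / (\langle p,p\rangle\langle q,q\rangle)\big)$, and it admits a closed-form sample estimator with a Gaussian kernel, with no variational bound or distributional assumption needed. The sketch below is a minimal NumPy illustration of that kernel estimator; it is not the authors' implementation, and the kernel width `sigma` is a hypothetical fixed choice rather than a tuned bandwidth.

```python
import numpy as np

def gaussian_gram(a, b, sigma=1.0):
    """Pairwise Gaussian kernel matrix between rows of a and rows of b."""
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return np.exp(-d2 / (2.0 * sigma**2))

def cs_divergence(x, y, sigma=1.0):
    """Empirical Cauchy-Schwarz divergence between samples x ~ p and y ~ q.

    Estimates D_CS(p;q) = -log( <p,q>^2 / (<p,p> <q,q>) ) by replacing each
    inner product of densities with the mean of a kernel Gram matrix
    (a plug-in kernel density estimate). Returns 0 when x and y coincide.
    """
    pq = gaussian_gram(x, y, sigma).mean()  # estimates the cross term <p,q>
    pp = gaussian_gram(x, x, sigma).mean()  # estimates <p,p>
    qq = gaussian_gram(y, y, sigma).mean()  # estimates <q,q>
    return -np.log(pq**2 / (pp * qq))
```

By the Cauchy-Schwarz inequality the population quantity is non-negative and vanishes only when $p = q$, which is why it can stand in for the KL-based MI terms in the IB objective.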
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning

Died the same way — 💀 404 Not Found