Big Learning Expectation Maximization

December 19, 2023 · Entered Twilight · 🏛 AAAI Conference on Artificial Intelligence

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: BL_vs_deepClustering, LICENSE, README.md, dataset, function.py, main_biglearnEM_vs_EM_v1.ipynb, main_realworld_clustering.ipynb, method.py

Authors: Yulai Cong, Sijia Li
arXiv ID: 2312.11926
Category: cs.LG (Machine Learning)
Cross-listed: stat.ME, stat.ML
Citations: 4
Venue: AAAI Conference on Artificial Intelligence
Repository: https://github.com/YulaiCong/Big-Learning-Expectation-Maximization
⭐ 1
Last Checked: 1 month ago
Abstract
Mixture models serve as one fundamental tool with versatile applications. However, their training techniques, like the popular Expectation Maximization (EM) algorithm, are notoriously sensitive to parameter initialization and often suffer from bad local optima that could be arbitrarily worse than the optimal. To address the long-lasting bad-local-optima challenge, we draw inspiration from the recent ground-breaking foundation models and propose to leverage their underlying big learning principle to upgrade the EM. Specifically, we present the Big Learning EM (BigLearn-EM), an EM upgrade that simultaneously performs joint, marginal, and orthogonally transformed marginal matchings between data and model distributions. Through simulated experiments, we empirically show that the BigLearn-EM is capable of delivering the optimal with high probability; comparisons on benchmark clustering datasets further demonstrate its effectiveness and advantages over existing techniques. The code is available at https://github.com/YulaiCong/Big-Learning-Expectation-Maximization.
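The baseline the paper upgrades is the classical EM algorithm for mixture models, which alternates an E-step (posterior responsibilities) with an M-step (closed-form parameter updates). As a reference point, here is a minimal sketch of standard EM for a one-dimensional two-component Gaussian mixture — not the BigLearn-EM itself; the function name, the deterministic min/max initialization, and the toy data are illustrative choices:

```python
import numpy as np

def em_gmm(x, k, n_iter=100):
    """Standard EM for a 1-D Gaussian mixture with k components."""
    n = len(x)
    # Simple deterministic initialization (illustrative, not from the paper):
    # spread the initial means between the data extremes.
    mu = np.linspace(x.min(), x.max(), k)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] ∝ pi_j * N(x_i | mu_j, var_j),
        # computed in log space for numerical stability.
        log_p = (np.log(pi) - 0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Toy data: two well-separated components around 0 and 5.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])
pi, mu, var = em_gmm(x, k=2)
print(sorted(mu.round(2)))
```

On well-separated data like this, EM recovers the component means; the paper's point is that with a bad initialization the same iteration can settle into an arbitrarily poor local optimum, which is what the big-learning upgrade targets.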
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning