Toward Unsupervised Outlier Model Selection

November 03, 2022 · Entered Twilight · 🏛 Industrial Conference on Data Mining

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: LICENSE, README.md, __init__.py, elect_controlled.py, elect_wild.py, initialization_converage.py, requirements.txt, utility.py

Authors: Yue Zhao, Sean Zhang, Leman Akoglu
arXiv ID: 2211.01834
Category: cs.LG (Machine Learning)
Citations: 28
Venue: Industrial Conference on Data Mining
Repository: https://github.com/yzhao062/ELECT ⭐ 11
Last Checked: 1 month ago
Abstract
Today there exists no shortage of outlier detection algorithms in the literature, yet the complementary and critical problem of unsupervised outlier model selection (UOMS) is vastly understudied. In this work we propose ELECT, a new approach to selecting an effective candidate model, i.e. an outlier detection algorithm and its hyperparameter(s), to employ on a new dataset without any labels. At its core, ELECT is based on meta-learning: transferring prior knowledge (e.g. model performance) from historical datasets that are similar to the new one to facilitate UOMS. Uniquely, it employs a dataset similarity measure that is performance-based, which is more direct and goal-driven than other measures used in the past. ELECT adaptively searches for similar historical datasets; as such, it can serve an output on demand, accommodating varying time budgets. Extensive experiments show that ELECT significantly outperforms a wide range of basic UOMS baselines, including no model selection (always using the same popular model such as iForest) as well as more recent selection strategies based on meta-features.
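The core idea in the abstract can be sketched in a few lines: rank historical datasets by how similar their model-performance profiles are to the new dataset, then pick the model that performed best on the most similar ones. This is a minimal, hypothetical illustration, not ELECT's actual algorithm; in particular, the `new_perf_proxy` vector stands in for ELECT's internal (label-free) performance estimation, which is more involved than shown here.

```python
import numpy as np

def select_model(hist_perf, new_perf_proxy, k=2):
    """Toy performance-based model selection.

    hist_perf:      (n_datasets, n_models) matrix of historical model
                    performance (e.g. ROC-AUC) on labeled datasets.
    new_perf_proxy: (n_models,) proxy performance estimates for the new,
                    unlabeled dataset (hypothetical stand-in for ELECT's
                    internal evaluation signal).
    k:              number of most-similar historical datasets to use.

    Returns the index of the selected model.
    """
    # Performance-based similarity: correlate the new dataset's proxy
    # performance vector with each historical dataset's performance row.
    sims = np.array(
        [np.corrcoef(row, new_perf_proxy)[0, 1] for row in hist_perf]
    )
    # Keep the k historical datasets whose model rankings look most alike.
    top_k = np.argsort(sims)[::-1][:k]
    # Choose the model with the best average performance on those datasets.
    avg_perf = hist_perf[top_k].mean(axis=0)
    return int(np.argmax(avg_perf))

# Example: datasets 0 and 1 rank the models like the new dataset does,
# so the model that wins on them (model 0) is selected.
hist_perf = np.array([
    [0.90, 0.50, 0.60],
    [0.85, 0.55, 0.65],
    [0.30, 0.90, 0.40],
])
new_perf_proxy = np.array([0.80, 0.50, 0.60])
chosen = select_model(hist_perf, new_perf_proxy, k=2)
print(chosen)
```

The design point the abstract emphasizes is that similarity is measured in *performance space* rather than via hand-crafted meta-features of the raw data, which ties the similarity notion directly to the selection goal.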
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning