SCARLET-NAS: Bridging the Gap between Stability and Scalability in Weight-sharing Neural Architecture Search

August 16, 2019 · Declared Dead · 🏛 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Xiangxiang Chu, Bo Zhang, Qingyuan Li, Ruijun Xu, Xudong Li
arXiv ID: 1908.06022
Category: cs.LG (Machine Learning)
Cross-listed: cs.CV, stat.ML
Citations: 27
Venue: 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)
Repository: https://github.com/xiaomi-automl/ScarletNAS
Last checked: 1 month ago
Abstract
Discovering powerful yet compact models is an important goal of neural architecture search. Previous two-stage one-shot approaches are limited by search spaces with a fixed depth. It seems straightforward to include an additional skip connection in the search space to make depths variable; however, this introduces a large range of perturbation during supernet training and makes it difficult to obtain a confident ranking of subnetworks. In this paper, we observe that skip connections cause significant feature inconsistency compared with other operations, which potentially degrades supernet performance. Based on this observation, we tackle the problem by imposing an equivariant learnable stabilizer to homogenize these disparities. Experiments show that the proposed stabilizer improves both the supernet's convergence and its ranking performance. With an evolutionary search backend that uses the stabilized supernet as an evaluator, we derive a family of state-of-the-art architectures of several depths, the SCARLET series; in particular, SCARLET-A obtains 76.9% top-1 accuracy on ImageNet. Code is available at https://github.com/xiaomi-automl/ScarletNAS.
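The core idea in the abstract can be sketched in a few lines: instead of letting the "skip" choice of a supernet layer pass features through a bare identity, the skip is routed through a learnable linear channel-mixing map (the paper describes its stabilizer as a linear 1x1 convolution), so the features it emits are distributionally closer to those produced by real candidate operations. The following numpy sketch is a minimal illustration under that assumption; the class and function names (`LearnableStabilizer`, `choice_block`) are hypothetical, not the authors' code.

```python
import numpy as np

class LearnableStabilizer:
    """Hedged sketch of the paper's learnable stabilizer: a linear
    1x1 convolution (no activation) that replaces a plain identity
    skip during supernet training."""

    def __init__(self, channels, rng=None):
        rng = rng or np.random.default_rng(0)
        # One channels x channels weight matrix, applied at every
        # spatial position -- equivalent to a 1x1 convolution.
        self.weight = rng.normal(scale=0.1, size=(channels, channels))

    def __call__(self, x):
        # x has shape (batch, channels, height, width); mix channels
        # at each pixel with the learned matrix.
        return np.einsum("oc,bchw->bohw", self.weight, x)

def choice_block(x, candidate_ops, choice):
    """One supernet layer: select a candidate op by name. The 'skip'
    entry is the stabilizer rather than a bare identity."""
    return candidate_ops[choice](x)

# Toy usage: a 2-op choice block on a small feature map.
x = np.random.default_rng(1).normal(size=(2, 8, 4, 4))
ops = {
    "skip": LearnableStabilizer(8),
    "conv3x3": lambda t: t * 2.0,  # stand-in for a real conv op
}
y = choice_block(x, ops, "skip")   # same shape as x: (2, 8, 4, 4)
```

Note that when the stabilizer's weight is the identity matrix, it reduces exactly to a plain skip connection, so the search space is not changed, only the training-time feature statistics.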
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Machine Learning

Died the same way – 💀 404 Not Found