R.I.P.
Ghosted
FreeREA: Training-Free Evolution-based Architecture Search
June 17, 2022 · Declared Dead · IEEE Workshop/Winter Conference on Applications of Computer Vision
Authors
Niccolò Cavagnero, Luca Robbiano, Barbara Caputo, Giuseppe Averta
arXiv ID
2207.05135
Category
cs.NE: Neural & Evolutionary
Cross-listed
cs.AI,
cs.CV,
cs.LG
Citations
27
Venue
IEEE Workshop/Winter Conference on Applications of Computer Vision
Repository
https://github.com/NiccoloCavagnero/FreeREA
Last Checked
1 month ago
Abstract
In the last decade, most research in Machine Learning has contributed to the improvement of existing models, with the aim of increasing the performance of neural networks on a variety of tasks. However, such advancements often come at the cost of increased model memory and computational requirements. This represents a significant limitation to the deployability of research output in realistic settings, where cost, energy consumption, and framework complexity play a crucial role. To address this issue, the designer should search for models that maximise performance while limiting their footprint. Typical approaches to this goal rely either on manual procedures, which cannot guarantee the optimality of the final design, or on Neural Architecture Search algorithms that automate the process at the expense of extremely high computational time. This paper provides a solution for the fast identification of a neural network that maximises model accuracy while respecting the size and computational constraints typical of tiny devices. Our approach, named FreeREA, is a custom cell-based evolutionary NAS algorithm that exploits an optimised combination of training-free metrics to rank architectures during the search, without any need for model training. Our experiments, carried out on the common benchmarks NAS-Bench-101 and NATS-Bench, demonstrate that i) FreeREA is a fast, efficient, and effective search method for automatic model design; ii) it outperforms State of the Art training-based and training-free techniques on all the datasets and benchmarks considered; and iii) it easily generalises to constrained scenarios, representing a competitive solution for fast Neural Architecture Search in generic constrained applications. The code is available at \url{https://github.com/NiccoloCavagnero/FreeREA}.
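To make the idea in the abstract concrete, here is a minimal, self-contained sketch of an evolutionary search loop (in the style of regularized evolution) whose fitness is a training-free score rather than a trained accuracy. The bit-string encoding, the toy `training_free_score`, and all parameter values are placeholder assumptions for illustration only, not FreeREA's actual metrics or search space:

```python
import random

# Toy stand-in for a cell-based search space: each "architecture" is a
# bit-string of operation choices. Everything below is an illustrative
# sketch, not the authors' implementation.

GENOME_LENGTH = 16

def random_architecture():
    return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def mutate(arch):
    """Return a copy of the architecture with one operation choice flipped."""
    child = arch[:]
    i = random.randrange(len(child))
    child[i] ^= 1
    return child

def training_free_score(arch):
    # Placeholder fitness: FreeREA would combine training-free metrics
    # computed on the instantiated (untrained) network; here we use a
    # toy objective (number of ones) so the loop runs end to end.
    return sum(arch)

def evolve(iterations=500, population_size=25, tournament_size=5, seed=0):
    """Regularized-evolution-style search driven by a training-free fitness."""
    random.seed(seed)
    population = [random_architecture() for _ in range(population_size)]
    history = list(population)
    for _ in range(iterations):
        # Tournament selection: sample a few candidates, keep the best scorer.
        candidates = random.sample(population, tournament_size)
        parent = max(candidates, key=training_free_score)
        child = mutate(parent)
        population.append(child)
        history.append(child)
        population.pop(0)  # remove the oldest individual (ageing)
    return max(history, key=training_free_score)

best = evolve()
print("best toy score:", training_free_score(best))
```

Because no candidate is ever trained, each fitness evaluation is cheap, which is what makes this style of search fast enough for constrained settings.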
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Neural & Evolutionary
Ghosted · Progressive Growing of GANs for Improved Quality, Stability, and Variation
Ghosted · Learning both Weights and Connections for Efficient Neural Networks
Ghosted · LSTM: A Search Space Odyssey
Ghosted · A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
Ghosted · An Introduction to Convolutional Neural Networks
Died the same way – 404 Not Found
404 Not Found · Deep High-Resolution Representation Learning for Visual Recognition
404 Not Found · HuggingFace's Transformers: State-of-the-art Natural Language Processing
404 Not Found · CCNet: Criss-Cross Attention for Semantic Segmentation