Exploiting the Potential of Standard Convolutional Autoencoders for Image Restoration by Evolutionary Search

March 01, 2018 · Entered Twilight · 🏛 International Conference on Machine Learning

🌅 TWILIGHT: Old Age
Predates the code-sharing era - a pioneer of its time

"Last commit was 7.0 years ago (≥5-year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: Denoising, Inpainting, LICENSE, README.md

Authors: Masanori Suganuma, Mete Ozay, Takayuki Okatani
arXiv ID: 1803.00370
Category: cs.NE: Neural & Evolutionary
Citations: 91
Venue: International Conference on Machine Learning
Repository: https://github.com/sg-nm/Evolutionary-Autoencoders ⭐ 71
Last Checked: 1 month ago
Abstract
Researchers have applied deep neural networks to image restoration tasks, in which they proposed various network architectures, loss functions, and training methods. In particular, adversarial training, which is employed in recent studies, seems to be a key ingredient to success. In this paper, we show that simple convolutional autoencoders (CAEs) built upon only standard network components, i.e., convolutional layers and skip connections, can outperform the state-of-the-art methods which employ adversarial training and sophisticated loss functions. The secret is to employ an evolutionary algorithm to automatically search for good architectures. Training optimized CAEs by minimizing the $\ell_2$ loss between reconstructed images and their ground truths using the ADAM optimizer is all we need. Our experimental results show that this approach achieves 27.8 dB peak signal to noise ratio (PSNR) on the CelebA dataset and 40.4 dB on the SVHN dataset, compared to 22.8 dB and 33.0 dB provided by the former state-of-the-art methods, respectively.
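The recipe the abstract describes - evolve an architecture, score each candidate by training it with the $\ell_2$ loss, and keep the best by PSNR - can be illustrated with a toy sketch. Here a (1+λ) evolution strategy searches a single stand-in "architecture" gene (a moving-average filter width) on a synthetic denoising problem; the genome, fitness surrogate, and all names are illustrative stand-ins, not the authors' CAE search code, but the PSNR formula (10·log10(peak²/MSE)) is the standard one the paper reports:

```python
import math
import random

random.seed(0)

def psnr(ref, est, peak=1.0):
    """Peak signal-to-noise ratio: 10 * log10(peak^2 / MSE)."""
    mse = sum((r - e) ** 2 for r, e in zip(ref, est)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(peak * peak / mse)

def moving_average(x, width):
    """Toy 'restoration network': average a window of `width` samples."""
    half = width // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

# Toy restoration task: recover a sine wave from additive Gaussian noise.
clean = [0.5 + 0.5 * math.sin(2 * math.pi * i / 64) for i in range(256)]
noisy = [c + random.gauss(0.0, 0.1) for c in clean]

def fitness(genome):
    """Fitness of a genome (here just the filter width) = PSNR of its output."""
    return psnr(clean, moving_average(noisy, genome))

# (1+lambda) evolution strategy: mutate the parent, keep any better child.
parent, parent_fit = 1, fitness(1)          # width 1 = identity (no restoration)
for generation in range(30):
    children = [max(1, parent + random.choice([-2, -1, 1, 2])) for _ in range(4)]
    for child in children:
        f = fitness(child)
        if f > parent_fit:
            parent, parent_fit = child, f   # elitist selection

print("best width:", parent, "PSNR (dB):", round(parent_fit, 1))
```

In the paper the genome instead encodes convolutional layers and skip connections, and each fitness evaluation is a full CAE training run with Adam, but the outer loop has the same shape: mutate, evaluate by PSNR, keep the elite.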
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt - Neural & Evolutionary

R.I.P. 👻 Ghosted

LSTM: A Search Space Odyssey

Klaus Greff, Rupesh Kumar Srivastava, ... (+3 more)

cs.NE ๐Ÿ› IEEE TNNLS ๐Ÿ“š 6.0K cites 11 years ago