Instance Normalization: The Missing Ingredient for Fast Stylization

July 27, 2016 · Entered Twilight · 🏛 arXiv.org

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 8.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, InstanceNormalization.lua, LICENSE, README.md, data, dataloader.lua, datasets, demo.lua, models, riseml.yml, src, test.lua, train.lua

Authors: Dmitry Ulyanov, Andrea Vedaldi, Victor Lempitsky
arXiv ID: 1607.08022
Category: cs.CV (Computer Vision)
Citations: 4.0K
Venue: arXiv.org
Repository: https://github.com/DmitryUlyanov/texture_nets ⭐ 1226
Last Checked: 1 month ago
Abstract
In this paper we revisit the fast stylization method introduced in Ulyanov et al. (2016). We show how a small change in the stylization architecture results in a significant qualitative improvement in the generated images. The change is limited to swapping batch normalization with instance normalization, and applying the latter at both training and testing time. The resulting method can be used to train high-performance architectures for real-time image generation. The code is available on GitHub at https://github.com/DmitryUlyanov/texture_nets. The full paper can be found at arXiv:1701.02096.
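The change the abstract describes comes down to where the normalization statistics are computed: instance normalization takes the mean and variance per sample and per channel (over spatial positions only), while batch normalization shares them across the whole batch. A minimal NumPy sketch of that difference, assuming NCHW layout (this is an illustration, not the authors' Torch implementation; the function names are mine):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Normalize each (sample, channel) slice independently.

    x: array of shape (N, C, H, W). Statistics are computed over the
    spatial axes (H, W) only, so every instance is normalized on its own.
    """
    mean = x.mean(axis=(2, 3), keepdims=True)   # shape (N, C, 1, 1)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def batch_norm(x, eps=1e-5):
    """Normalize with statistics shared across the batch.

    Statistics are computed over (N, H, W), so all samples in the batch
    influence each other's output — the behavior the paper moves away from.
    """
    mean = x.mean(axis=(0, 2, 3), keepdims=True)  # shape (1, C, 1, 1)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

One practical consequence, consistent with the abstract's point about applying the change at test time too: instance normalization behaves identically for a batch of one and for a large batch, since no statistics cross sample boundaries.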
