PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

November 15, 2017 · Entered Twilight · 🏛 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 7.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, README.md, checkpoints, data, logs, src

Authors: Arun Mallya, Svetlana Lazebnik
arXiv ID: 1711.05769
Category: cs.CV: Computer Vision
Citations: 1.5K
Venue: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
Repository: https://github.com/arunmallya/packnet (⭐ 239)
Last Checked: 1 month ago
Abstract
This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be employed to learn new tasks. By performing iterative pruning and network re-training, we are able to sequentially "pack" multiple tasks into a single network while ensuring minimal drop in performance and minimal storage overhead. Unlike prior work that uses proxy losses to maintain accuracy on older tasks, we always optimize for the task at hand. We perform extensive experiments on a variety of network architectures and large-scale datasets, and observe much better robustness against catastrophic forgetting than prior work. In particular, we are able to add three fine-grained classification tasks to a single ImageNet-trained VGG-16 network and achieve accuracies close to those of separately trained networks for each task. Code available at https://github.com/arunmallya/packnet
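
The loop the abstract describes – train a task, prune low-magnitude weights, retrain the survivors, freeze them, and hand the freed weights to the next task – is compact enough to sketch. Below is a minimal, hypothetical PyTorch illustration for a single linear layer; the class name `PackedLinear`, the `owner` bookkeeping buffer, and `prune_frac` are assumptions made for this sketch, not the authors' released implementation (which lives in the linked repo).

```python
import torch
import torch.nn as nn

class PackedLinear(nn.Module):
    """One linear layer whose weights are partitioned across tasks,
    PackNet-style: each weight is either free or owned by one task."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        # owner[i, j] == k means weight (i, j) is assigned to task k; 0 == free.
        # (Names and bookkeeping here are illustrative, not from the repo.)
        self.register_buffer("owner",
                             torch.zeros(out_features, in_features, dtype=torch.long))
        self.current_task = 0
        # Zero gradients on weights owned by earlier tasks, so training
        # a new task never disturbs old ones.
        self.weight.register_hook(lambda g: g * (self.owner == self.current_task))

    def forward(self, x, task_id):
        # Task k sees weights owned by tasks 1..k; weights added later are
        # masked away, so old-task outputs are reproduced exactly.
        mask = (self.owner > 0) & (self.owner <= task_id)
        return x @ (self.weight * mask).T

    def begin_task(self, task_id):
        # Claim every still-free weight for the new task, then train as usual.
        self.current_task = task_id
        self.owner[self.owner == 0] = task_id

    def prune(self, prune_frac=0.5):
        # After training, release the lowest-magnitude fraction of the
        # current task's weights back to the free pool; the survivors are
        # briefly retrained and then frozen for good.
        mine = self.owner == self.current_task
        mags = self.weight.abs()[mine]
        n_prune = int(prune_frac * mags.numel())
        if n_prune < 1:
            return
        cutoff = mags.kthvalue(n_prune).values
        released = mine & (self.weight.abs() <= cutoff)
        self.owner[released] = 0
        with torch.no_grad():
            self.weight[released] = 0.0
```

Under these assumptions, packing task k amounts to: `begin_task(k)`, fit normally, `prune()`, fine-tune the surviving task-k weights, then move on to `begin_task(k + 1)`; at test time, `forward(x, task_id=k)` recovers task k exactly. The only per-task storage overhead is the `owner` buffer, i.e. one small integer per weight, which matches the paper's claim of minimal storage cost.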

📜 Similar Papers

In the same crypt – Computer Vision