Convolutional Neural Networks Do Work with Pre-Defined Filters

November 27, 2024 · Entered Twilight · 🏛 IEEE International Joint Conference on Neural Networks

💤 TWILIGHT: Eternal Rest
Repo has seen no activity since publication

Repo contents: .gitignore, LICENSE, README.md, example.py, models, requirements.txt

Authors: Christoph Linse, Erhardt Barth, Thomas Martinetz
arXiv ID: 2411.18388
Category: cs.CV (Computer Vision)
Citations: 6
Venue: IEEE International Joint Conference on Neural Networks
Repository: https://github.com/Criscraft/PredefinedFilterNetworks (⭐ 2)
Last checked: 1 month ago
Abstract
We present a novel class of Convolutional Neural Networks called Pre-defined Filter Convolutional Neural Networks (PFCNNs), in which all n×n convolution kernels with n>1 are pre-defined and kept constant during training. PFCNNs rely on a special form of depthwise convolution called a Pre-defined Filter Module (PFM). In the channel-wise convolution part, the 1×n×n kernels are drawn from a fixed pool of only a few (16) different pre-defined kernels. In the 1×1 convolution part, linear combinations of the pre-defined filter outputs are learned. Despite this harsh restriction, complex and discriminative features emerge. These findings provide a novel perspective on how information is processed within deep CNNs. We discuss various properties of PFCNNs and demonstrate their effectiveness on the popular datasets Caltech101, CIFAR10, CUB-200-2011, FGVC-Aircraft, Flowers102, and Stanford Cars. Our implementation of PFCNNs is available on GitHub: https://github.com/Criscraft/PredefinedFilterNetworks
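The abstract describes a two-stage module: a channel-wise convolution whose kernels are frozen and drawn from a small fixed pool, followed by a learned 1×1 combination. A minimal NumPy sketch of that forward pass is below; the function names (`pfm_forward`, `depthwise_conv3x3`), the per-channel kernel `assignment`, and the zero padding are assumptions for illustration, not the authors' actual implementation (which lives in the linked repo).

```python
import numpy as np

def depthwise_conv3x3(x, kernels):
    """Channel-wise 3x3 convolution with zero padding.
    x: (C, H, W), kernels: (C, 3, 3) -- one fixed kernel per channel."""
    C, H, W = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + 3, j:j + 3] * kernels[c])
    return out

def pfm_forward(x, filter_pool, assignment, w1x1):
    """Sketch of a Pre-defined Filter Module forward pass.
    filter_pool: (16, 3, 3) fixed kernels, never updated during training.
    assignment: (C,) index into the pool for each input channel (assumed).
    w1x1: (C_out, C) learned 1x1 weights combining the filter outputs."""
    kernels = filter_pool[assignment]              # frozen, drawn from the pool
    y = depthwise_conv3x3(x, kernels)              # channel-wise part (fixed)
    return np.tensordot(w1x1, y, axes=([1], [0]))  # learned 1x1 combinations
```

During training, only `w1x1` (and any other 1×1 weights in the network) would receive gradient updates; `filter_pool` stays constant, which is the restriction the paper studies.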
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Computer Vision