Batch Layer Normalization: A new normalization layer for CNNs and RNNs
September 19, 2022 · International Conference on Advances in Artificial Intelligence
Repo contents:
- .gitattributes
- CNN_Results.ipynb
- Cifar10 (With 0.2 of the training set and batch size 1).ipynb
- Cifar10 (With 0.2 of the training set and batch size 25).ipynb
- Cifar10 (With the whole training set and batch size 25).ipynb
- IMDB (With 0.2 of the training set and batch size 1).ipynb
- IMDB (With 0.2 of the training set and batch size 25).ipynb
- IMDB (With the whole training set and batch size 25).ipynb
- Images
- LICENSE
- README.md
- RNN_Results.ipynb
- environment.yml
- helpers
- logs