# bioinf-jku/SNNs
Tutorials and implementations for “Self-normalizing networks”
| repo name | bioinf-jku/SNNs |
| --- | --- |
| repo link | https://github.com/bioinf-jku/SNNs |
| language | Jupyter Notebook |
| size (curr.) | 559 kB |
| stars (curr.) | 1452 |
| created | 2017-06-08 |
| license | GNU General Public License v3.0 |
# Self-Normalizing Networks

Tutorials and implementations for “Self-normalizing networks” (SNNs) as suggested by Klambauer et al. ([arXiv pre-print](https://arxiv.org/abs/1706.02515)).
## Versions

- Python 3.5 and TensorFlow 1.1
### Note for TensorFlow 1.4 users

TensorFlow 1.4 already ships the functions `tf.nn.selu` and `tf.contrib.nn.alpha_dropout`, which implement the SELU activation function and the suggested dropout variant.
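As a hedged sketch of how those built-ins might be wired together in TF 1.4 (the layer size and placeholder names below are illustrative, not taken from this repository):

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])   # e.g. flattened MNIST inputs
keep_prob = tf.placeholder(tf.float32)        # set to 1.0 at test time

# SNNs pair SELU with a variance-scaling ("LeCun normal") initializer.
hidden = tf.layers.dense(
    x, 256,
    activation=tf.nn.selu,
    kernel_initializer=tf.variance_scaling_initializer(scale=1.0, mode='fan_in'))

# Alpha dropout preserves the self-normalizing property, unlike standard dropout.
hidden = tf.contrib.nn.alpha_dropout(hidden, keep_prob)
```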
## Tutorials
- Multilayer Perceptron (notebook)
- Convolutional Neural Network on MNIST (notebook)
- Convolutional Neural Network on CIFAR10 (notebook)
## KERAS CNN scripts:
- KERAS: Convolutional Neural Network on MNIST (python script)
- KERAS: Convolutional Neural Network on CIFAR10 (python script)
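For orientation, the standard SNN ingredients in Keras are the `selu` activation, the `lecun_normal` initializer, and the `AlphaDropout` layer. The minimal model below is an illustrative sketch of that recipe, not the exact architecture used in the scripts:

```python
from keras.models import Sequential
from keras.layers import Dense, AlphaDropout

# Illustrative SNN-style classifier (layer sizes are made up for the example).
model = Sequential([
    Dense(512, activation='selu', kernel_initializer='lecun_normal',
          input_shape=(784,)),
    AlphaDropout(0.05),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy',
              metrics=['accuracy'])
```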
## Design novel SELU functions
- How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (notebook); a numeric sketch of the idea follows below
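The underlying idea, reconstructed here numerically under the assumption of a standard-normal input (this is not the notebook's exact code): pick the fixed point you want, here mean 0 and variance 1, and solve the two resulting moment equations for lambda and alpha:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

def selu(z, lam, alpha):
    # SELU with free parameters lambda ("lam") and alpha.
    return lam * z if z > 0 else lam * alpha * (np.exp(z) - 1.0)

def phi(z):
    # Standard normal density.
    return np.exp(-0.5 * z * z) / np.sqrt(2.0 * np.pi)

def residual(params):
    lam, alpha = params
    mean, _ = quad(lambda z: selu(z, lam, alpha) * phi(z), -np.inf, np.inf)
    second, _ = quad(lambda z: selu(z, lam, alpha) ** 2 * phi(z), -np.inf, np.inf)
    # Fixed point (0, 1): zero mean and unit variance after the activation.
    return [mean, second - mean ** 2 - 1.0]

lam, alpha = fsolve(residual, x0=[1.0, 1.5])
print(lam, alpha)   # approx. 1.0507 and 1.6733, the constants from the paper
```

Other choices of fixed point yield other (lambda, alpha) pairs, which is what the notebook explores.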
## Basic Python functions to implement SNNs

are provided as code chunks here: selu.py
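As a hedged NumPy illustration of what such chunks cover (the repository's actual code is TensorFlow-based and may differ in detail): SELU itself, plus the dropout variant that sets dropped units to the negative saturation value −λα and then applies an affine correction to keep mean and variance fixed:

```python
import numpy as np

ALPHA = 1.6732632423543772    # SELU constants from Klambauer et al.
SCALE = 1.0507009873554805    # often written as lambda

def selu(x):
    # np.minimum avoids overflow in exp() for large positive inputs.
    return SCALE * np.where(x > 0.0, x,
                            ALPHA * (np.exp(np.minimum(x, 0.0)) - 1.0))

def dropout_selu(x, rate, training=True, rng=np.random):
    # "Alpha dropout": dropped units go to -SCALE*ALPHA instead of 0,
    # followed by an affine transform restoring zero mean / unit variance.
    if not training or rate == 0.0:
        return x
    q = 1.0 - rate                      # keep probability
    alpha_p = -SCALE * ALPHA            # saturation value for dropped units
    mask = rng.binomial(1, q, size=x.shape).astype(x.dtype)
    y = x * mask + alpha_p * (1.0 - mask)
    a = (q + alpha_p ** 2 * q * (1.0 - q)) ** -0.5
    b = -a * alpha_p * (1.0 - q)
    return a * y + b
```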
## Notebooks and code to produce Figure 1

are provided here: Figure1
## Calculations and numeric checks of the theorems (Mathematica)

are provided as Mathematica notebooks here: