December 11, 2018




Tutorials and implementations for “Self-normalizing networks”

repo name bioinf-jku/SNNs
language Jupyter Notebook
size (curr.) 559 kB
stars (curr.) 1452
created 2017-06-08
license GNU General Public License v3.0

Self-Normalizing Networks

Tutorials and implementations for “Self-normalizing networks” (SNNs) as suggested by Klambauer et al. (arXiv preprint).


  • Python 3.5 and TensorFlow 1.1

Note for TensorFlow 1.4 users

TensorFlow 1.4 already provides the functions “tf.nn.selu” and “tf.contrib.nn.alpha_dropout”, which implement the SELU activation function and the proposed dropout variant.
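For reference, the SELU activation itself takes only a few lines of NumPy. This is an illustrative sketch, not the repo's code; the constants are the commonly cited values from the paper, and TensorFlow 1.4's tf.nn.selu implements the same function:

```python
import numpy as np

# SELU constants from Klambauer et al. (2017), chosen so that
# zero mean / unit variance is a stable fixed point of the activations.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=np.float64)
    return LAMBDA * np.where(x > 0.0, x, ALPHA * np.expm1(x))

print(selu(np.array([-1.0, 0.0, 1.0])))
```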


  • Multilayer Perceptron (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

Keras CNN scripts:

Design novel SELU functions

  • How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (notebook)
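The standard parameters can be sanity-checked numerically. The sketch below (not from the repo) only verifies the (0, 1) fixed point by sampling: feeding standard-normal pre-activations through SELU should leave mean and variance essentially unchanged, whereas the notebook derives alpha and lambda analytically for arbitrary fixed points:

```python
import numpy as np

ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    return LAMBDA * np.where(x > 0.0, x, ALPHA * np.expm1(x))

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)  # pre-activations with mean 0, variance 1
y = selu(z)
print(y.mean(), y.var())  # both should stay close to 0 and 1
```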

Basic Python functions to implement SNNs

are provided as code chunks here:
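As a hedged sketch of the alpha-dropout variant mentioned above: dropped units are set to SELU's negative saturation value alpha' = -lambda * alpha rather than 0, and an affine correction restores zero mean and unit variance. The constants and the affine construction follow the paper; the function name dropout_selu mirrors the repo's naming, but the body here is illustrative:

```python
import numpy as np

ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805
ALPHA_P = -LAMBDA * ALPHA  # SELU's negative saturation value, ~ -1.7581

def dropout_selu(x, rate, rng):
    """Alpha-dropout sketch: dropped units become alpha' (not 0), then an
    affine map a*x + b restores zero mean and unit variance."""
    q = 1.0 - rate                                   # keep probability
    keep = rng.random(x.shape) < q
    a = (q + ALPHA_P ** 2 * q * (1.0 - q)) ** -0.5   # variance correction
    b = -a * (1.0 - q) * ALPHA_P                     # mean correction
    return a * np.where(keep, x, ALPHA_P) + b

rng = np.random.default_rng(1)
z = rng.standard_normal(1_000_000)
y = LAMBDA * np.where(z > 0.0, z, ALPHA * np.expm1(z))  # SELU activations
d = dropout_selu(y, rate=0.1, rng=rng)
print(d.mean(), d.var())  # should remain close to 0 and 1
```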

Notebooks and code to produce Figure 1

are provided here: Figure1

Calculations and numeric checks of the theorems (Mathematica)

are provided as Mathematica notebooks here:

UCI, Tox21 and HTRU2 data sets
