December 24, 2019




RAdam implemented in Keras & TensorFlow

repo name CyberZHG/keras-radam
language Python
size (curr.) 63 kB
stars (curr.) 292
created 2019-08-16
license MIT License

Keras RAdam



Unofficial implementation of RAdam in Keras and TensorFlow.


pip install keras-rectified-adam


import keras
import numpy as np
from keras_radam import RAdam

# Build toy model with RAdam optimizer
model = keras.models.Sequential()
model.add(keras.layers.Dense(input_shape=(17,), units=3))
model.compile(RAdam(), loss='mse')

# Generate toy data
x = np.random.standard_normal((4096 * 30, 17))
w = np.random.standard_normal((17, 3))
y = np.dot(x, w)

# Fit the model to the toy data
model.fit(x, y, epochs=5)

TensorFlow without Keras

from keras_radam.training import RAdamOptimizer


Use Warmup

from keras_radam import RAdam

RAdam(total_steps=10000, warmup_proportion=0.1, min_lr=1e-5)
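With these arguments, the learning rate ramps up over the first `warmup_proportion` of `total_steps` and then decays toward `min_lr`. A standalone sketch of such a linear warmup-then-decay schedule (the function and the exact shape are illustrative assumptions, not the library's internals):

```python
def warmup_schedule(step, total_steps=10000, warmup_proportion=0.1,
                    lr=1e-3, min_lr=1e-5):
    """Illustrative linear warmup followed by linear decay."""
    warmup_steps = int(total_steps * warmup_proportion)
    if step < warmup_steps:
        # Ramp up linearly from 0 to the base learning rate
        return lr * step / warmup_steps
    # Decay linearly from the base learning rate down to min_lr
    decay_steps = total_steps - warmup_steps
    return min_lr + (lr - min_lr) * (1 - (step - warmup_steps) / decay_steps)
```

With the defaults above, the rate peaks at step 1000 and reaches `min_lr` at step 10000.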

Q & A

About Correctness

The optimizer produces similar losses and weights to the official optimizer after 500 steps.

Use tf.keras or tf-2.0

Set the environment variable TF_KERAS=1 to use tensorflow.python.keras.
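The flag is read when the package is imported, so it has to be set before the import runs. A minimal sketch (the commented import is the one from this README; the essential part is the ordering):

```python
import os

# TF_KERAS must be in the environment before keras_radam is imported,
# so set it at the very top of the script (or export it in the shell).
os.environ['TF_KERAS'] = '1'

# from keras_radam import RAdam  # now backed by tensorflow.python.keras
```

Exporting `TF_KERAS=1` in the shell before launching Python achieves the same effect.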

Use theano Backend

Set the environment variable KERAS_BACKEND=theano to enable the Theano backend.
