Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers. Updated Jan 6, 2022 · Python
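Since the listing itself carries no code, here is a minimal NumPy sketch of the decoupled weight decay update that defines AdamW (Loshchilov & Hutter); the function name and hyperparameter defaults are illustrative, not this repo's API:

```python
import numpy as np

def adamw_step(theta, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=1e-4):
    """One AdamW update for a parameter array theta with gradient g.

    m, v are the running first/second moment estimates; t is the
    1-indexed step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Weight decay is applied to the weights directly, NOT folded into the
    # gradient -- this decoupling is what distinguishes AdamW from Adam + L2.
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * theta)
    return theta, m, v
```

For the warm-restart part, note that stock TensorFlow also ships `tf.keras.optimizers.schedules.CosineDecayRestarts`, which implements the SGDR-style cosine schedule with restarts.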
Adam optimizer with learning rate multipliers for TensorFlow 2.0.
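As a rough illustration of the learning-rate-multiplier idea these repos implement, the sketch below scales each variable's gradient by a per-layer factor before a plain SGD step. The `LR_MULTIPLIERS` dict, layer names, and `multiplier_for` helper are hypothetical, not either project's API; for vanilla SGD, scaling a gradient is equivalent to scaling that variable's learning rate, whereas adaptive optimizers such as Adam need the multiplier applied inside the update rule instead.

```python
import tensorflow as tf

# Hypothetical per-layer multipliers, matched against variable names
# (keys and values are illustrative only).
LR_MULTIPLIERS = {"dense_a": 0.1, "dense_b": 1.0}

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", name="dense_a"),
    tf.keras.layers.Dense(10, name="dense_b"),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=1e-2)

def multiplier_for(var):
    # Fall back to 1.0 for variables with no configured multiplier.
    return next((m for key, m in LR_MULTIPLIERS.items() if key in var.name), 1.0)

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x)
        loss = tf.reduce_mean(
            tf.keras.losses.sparse_categorical_crossentropy(
                y, logits, from_logits=True))
    grads = tape.gradient(loss, model.trainable_variables)
    # Scale each gradient by its layer's multiplier; with plain SGD this
    # yields an effective per-layer learning rate of lr * multiplier.
    scaled = [g * multiplier_for(v)
              for g, v in zip(grads, model.trainable_variables)]
    optimizer.apply_gradients(zip(scaled, model.trainable_variables))
    return loss
```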