optimizer & lr scheduler & loss function collections in PyTorch
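As a rough illustration of how an optimizer from such a collection drops into a standard training loop, the sketch below uses MADGRAD. It assumes the `madgrad` PyPI package (`pip install madgrad`); the import and constructor arguments are assumptions for illustration, not part of this listing.

```python
# Minimal sketch: swapping MADGRAD into a standard PyTorch training loop.
# Assumes the `madgrad` package (pip install madgrad); the import and
# constructor signature below are assumptions, not part of this listing.
import torch
from madgrad import MADGRAD

model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss()
optimizer = MADGRAD(model.parameters(), lr=1e-2, momentum=0.9, weight_decay=0.0)

x = torch.randn(32, 10)          # dummy batch
y = torch.randint(0, 2, (32,))   # dummy labels

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()                 # MADGRAD update, same API shape as torch.optim
```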
Implementation of our paper "Wasserstein Adversarial Transformer for Cloud Workload Prediction"
Super-Convergence on CIFAR10
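Super-convergence is usually realized with a one-cycle learning-rate policy. The minimal sketch below uses PyTorch's built-in `torch.optim.lr_scheduler.OneCycleLR`; the stand-in model and all hyperparameters are illustrative assumptions, not values from the repository above.

```python
# Minimal sketch of the one-cycle policy behind super-convergence,
# using PyTorch's built-in OneCycleLR. Hyperparameters are illustrative
# assumptions, not values from the repository above.
import torch

model = torch.nn.Linear(3 * 32 * 32, 10)   # stand-in for a CIFAR10 model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=1.0,           # peak LR; super-convergence uses an aggressively high peak
    epochs=10,
    steps_per_epoch=100,  # len(train_loader) in a real run
)

for epoch in range(10):
    for step in range(100):
        optimizer.zero_grad()
        loss = model(torch.randn(8, 3 * 32 * 32)).sum()  # dummy forward/loss
        loss.backward()
        optimizer.step()
        scheduler.step()  # one-cycle: LR rises to max_lr, then anneals down
```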
Provides a collection of flexible learning rate schedulers for PyTorch, with presets, warmup wrappers, and research-based implementations (a generic warmup pattern is sketched below).
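As a sketch of what a warmup wrapper around an existing scheduler can look like, the snippet below chains a linear warmup into cosine annealing using only stock PyTorch schedulers; it shows the generic pattern, not the API of the repository described above.

```python
# Generic warmup-then-anneal pattern built from stock PyTorch schedulers;
# a sketch of the idea, not the API of the repository described above.
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

warmup = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=0.01, total_iters=5   # ramp LR up over 5 steps
)
cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=95)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, cosine], milestones=[5]
)

for step in range(100):
    optimizer.step()       # parameter update would go here in a real loop
    scheduler.step()       # warmup for 5 steps, then cosine annealing
```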