Contributions to pytorch-optimizer, whether code, documentation, or tests, are always welcome!
```shell
# Clone the repository
git clone https://github.com/kozistr/pytorch_optimizer.git
cd pytorch_optimizer

# Install dependencies using uv (recommended)
uv sync

# Or using pip
pip install -e ".[dev]"
```

```shell
# Format code
make format

# Lint code
make lint

# Full check (lint + type checking)
make check

# Run tests
make test

# Run a specific test
python -m pytest tests/test_optimizers.py::test_name -sv -vv

# Serve documentation locally
make docs
```

- Line length: 119 characters
- Use single quotes for strings (not double quotes)
- Formatter: `black` with `-S -l 119` flags
- Linter: `ruff`
- Docstring style: Google style
Run `make format` and `make check` before submitting a PR.
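For instance, a function following these conventions might look like the sketch below (a hypothetical helper, not from the codebase): single-quoted strings, lines under 119 characters, and a Google-style docstring.

```python
def clamp_step_size(step_size: float, max_step_size: float = 1.0) -> float:
    """Clamp a step size to a maximum value.

    Args:
        step_size: the step size to clamp.
        max_step_size: the upper bound on the returned step size.

    Returns:
        The clamped step size.
    """
    if max_step_size < 0.0:
        raise ValueError(f'max_step_size {max_step_size} must be non-negative')
    return min(step_size, max_step_size)
```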
Reference existing implementations:

- Optimizers: `pytorch_optimizer/optimizer/`
- Loss functions: `pytorch_optimizer/loss/`
- LR schedulers: `pytorch_optimizer/lr_scheduler/`
- Create a new file in the appropriate directory
- For optimizers: inherit from `BaseOptimizer` and implement `init_group()` and `step()`
- Utilize existing `BaseOptimizer` methods instead of reimplementing them: `apply_weight_decay()`, `apply_ams_bound()`, `apply_adam_debias()`, `debias()`, `debias_beta()`, `get_rectify_step_size()`, `apply_cautious()`, `get_adanorm_gradient()`, `validate_learning_rate()`, `validate_betas()`, `validate_range()`
- Register the new class in the corresponding `__init__.py` files
- Run `make format` and `make check`
- Add tests; 100% coverage is required
- For new optimizers: add a minimal training recipe to `tests/constants.py` (see the `OPTIMIZERS` list)
- Add a short description to the latest changelog in `docs/changelogs/`
- Update `README.md`:
  - Update the count of optimizers/loss functions/schedulers
  - Add an entry to the appropriate markdown table
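To illustrate the `init_group()`/`step()` split described above, here is a toy sketch in plain Python. Note the hedges: it uses floats instead of tensors, and the `BaseOptimizer` defined here is a minimal stand-in, not the library's real base class (which provides the validation and helper methods listed above).

```python
class BaseOptimizer:
    """Stand-in for the library's BaseOptimizer; only for illustration."""

    @staticmethod
    def validate_learning_rate(lr: float) -> None:
        if lr < 0.0:
            raise ValueError(f'learning rate {lr} must be non-negative')


class ToySGD(BaseOptimizer):
    """Toy optimizer showing the expected structure: validate, init state, step."""

    def __init__(self, params: list, lr: float = 1e-1) -> None:
        self.validate_learning_rate(lr)  # reuse base-class validation instead of re-implementing it
        self.params = params  # list of floats standing in for parameter tensors
        self.grads = [0.0] * len(params)
        self.lr = lr
        self.state: dict = {}
        self.init_group()

    def init_group(self) -> None:
        # create per-parameter state, as a real optimizer would for momentum buffers etc.
        for i, _ in enumerate(self.params):
            self.state.setdefault(i, {'step': 0})

    def step(self) -> None:
        # plain gradient descent update: p <- p - lr * grad
        for i, grad in enumerate(self.grads):
            self.state[i]['step'] += 1
            self.params[i] -= self.lr * grad
```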
Tests are in the `tests/` directory:

- `test_optimizers.py` - Main optimizer tests
- `test_optimizer_parameters.py` - Parameter validation tests
- `test_optimizer_variants.py` - Variant tests (Cautious, AdamD, etc.)
- `test_loss_functions.py` - Loss function tests
- `test_lr_schedulers.py` - Scheduler tests
100% test coverage is required.
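Optimizer tests in this style of suite typically run a few steps on a small objective and assert that the loss decreases. A self-contained sketch of that pattern is below; the loss function and test name are illustrative, not from the repository, and plain floats are used in place of tensors.

```python
def quadratic_loss(x: float) -> tuple:
    """Toy 1-D loss f(x) = (x - 3)^2 and its gradient."""
    return (x - 3.0) ** 2, 2.0 * (x - 3.0)


def test_toy_optimizer_converges() -> None:
    # common pattern: start from a fixed point, run a few steps, assert the loss dropped
    x, lr = 0.0, 0.1
    initial_loss, _ = quadratic_loss(x)
    for _ in range(50):
        _, grad = quadratic_loss(x)
        x -= lr * grad
    final_loss, _ = quadratic_loss(x)
    assert final_loss < initial_loss * 0.01
```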
If you have any questions about contributing, please ask in Issues, Discussions, or directly in your PR :)
Thank you!