HippoTrainer

Gradient-Based Hyperparameter Optimization for PyTorch 🦛

HippoTrainer is a PyTorch-compatible library for gradient-based hyperparameter optimization, implementing cutting-edge algorithms that leverage automatic differentiation to efficiently tune hyperparameters.
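The core mechanism is to treat continuous hyperparameters as differentiable tensors and backpropagate a validation loss through (part of) the training procedure. As a rough illustration, here is a minimal one-step unrolled hypergradient in plain PyTorch, in the spirit of T1-T2; every name in the snippet (`model`, `log_wd`, `train_loss`, ...) is an illustrative assumption, not HippoTrainer's actual API:

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(10, 1)
log_wd = torch.tensor(-3.0, requires_grad=True)  # hyperparameter: log of weight decay
lr = 0.1

x_tr, y_tr = torch.randn(32, 10), torch.randn(32, 1)
x_va, y_va = torch.randn(32, 10), torch.randn(32, 1)

params = [p.detach().clone().requires_grad_(True) for p in model.parameters()]

def train_loss(ps):
    pred = torch.nn.functional.linear(x_tr, ps[0], ps[1])
    return ((pred - y_tr) ** 2).mean() + log_wd.exp() * sum(p.pow(2).sum() for p in ps)

def val_loss(ps):
    pred = torch.nn.functional.linear(x_va, ps[0], ps[1])
    return ((pred - y_va) ** 2).mean()

# One unrolled SGD step; create_graph=True keeps the dependence on log_wd.
grads = torch.autograd.grad(train_loss(params), params, create_graph=True)
new_params = [p - lr * g for p, g in zip(params, grads)]

# Hypergradient: d L_val(w - lr * dL_train/dw(w, lambda)) / d lambda
hypergrad, = torch.autograd.grad(val_loss(new_params), log_wd)
print(hypergrad)  # feed this to any outer optimizer that updates log_wd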
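```

The resulting scalar can drive an outer update of the hyperparameter alongside the usual training loop.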

📬 Assets

  1. Technical Meeting 1 - Presentation

🚀 Features

  • Algorithm Zoo: T1-T2, IFT, HOAG, DrMAD
  • PyTorch Native: Direct integration with torch.nn.Module
  • Memory Efficient: Checkpointing & implicit differentiation
  • Scalable: From laptop to cluster with PyTorch backend

📜 Algorithms

  • T1-T2 (Paper): One-step unrolled optimization
  • IFT (Paper): Implicit differentiation via a Neumann-series approximation of the inverse Hessian (see the sketch after this list)
  • HOAG (Paper): Implicit differentiation via conjugate gradient
  • DrMAD (Paper): Memory-efficient piecewise-linear backpropagation
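
For the implicit-differentiation methods (IFT, HOAG), the expensive piece is multiplying a validation gradient by the inverse Hessian of the training loss. A rough sketch of the Neumann-series variant follows; the function and variable names are illustrative assumptions rather than HippoTrainer's public interface:

```python
import torch

def ift_neumann_hypergrad(train_loss, val_loss, params, hparams,
                          alpha=0.01, num_terms=20):
    """Hypergradient d L_val / d hparams via the implicit function theorem,
    approximating the inverse training Hessian with a truncated Neumann series."""
    # v1 = dL_val/dw at the (approximately) converged weights
    v1 = torch.autograd.grad(val_loss(params), params)

    # Differentiable dL_train/dw, reused for Hessian- and mixed-vector products
    dtrain_dw = torch.autograd.grad(train_loss(params), params, create_graph=True)

    # Neumann series: v1 @ H^{-1} ~= alpha * sum_j v1 @ (I - alpha * H)^j
    p = [v.detach().clone() for v in v1]
    acc = [v.clone() for v in p]
    for _ in range(num_terms):
        hvp = torch.autograd.grad(dtrain_dw, params, grad_outputs=p, retain_graph=True)
        p = [pi - alpha * h for pi, h in zip(p, hvp)]
        acc = [a + pi for a, pi in zip(acc, p)]
    v2 = [alpha * a for a in acc]

    # Mixed second derivative: v2 @ d^2 L_train / (dw d hparams)
    mixed = torch.autograd.grad(dtrain_dw, hparams, grad_outputs=v2)
    # Minus sign from the implicit function theorem; assumes L_val does not
    # depend on hparams directly.
    return [-m for m in mixed]
```

Only Hessian-vector products are ever formed, so the Hessian is never materialized; this is what makes the implicit-differentiation route memory efficient.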

🤝 Contributors

  • Daniil Dorin (Basic code writing, Final demo, Algorithms)
  • Igor Ignashin (Project wrapping, Documentation writing, Algorithms)
  • Nikita Kiselev (Project planning, Blog post, Algorithms)
  • Andrey Veprikov (Test writing, Documentation writing, Algorithms)
  • We welcome contributions!

📄 License

HippoTrainer is MIT licensed. See LICENSE for details.