HippoTrainer

Gradient-Based Hyperparameter Optimization for PyTorch 🦛

HippoTrainer is a PyTorch-compatible library for gradient-based hyperparameter optimization, implementing cutting-edge algorithms that leverage automatic differentiation to efficiently tune hyperparameters.
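In the usual bilevel view these methods build on, hyperparameter tuning minimizes the validation loss of the inner training solution,

$$
\min_{\lambda} \ \mathcal{L}_{\mathrm{val}}\bigl(w^{*}(\lambda)\bigr)
\quad \text{s.t.} \quad
w^{*}(\lambda) = \arg\min_{w} \ \mathcal{L}_{\mathrm{train}}(w, \lambda),
$$

and the hypergradient $\nabla_{\lambda}\,\mathcal{L}_{\mathrm{val}}$ is computed with automatic differentiation; the algorithms listed below differ in how they approximate the dependence of $w^{*}$ on $\lambda$.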

📬 Assets

  1. Technical Meeting 1 - Presentation

🚀 Features

  • Algorithm Zoo: T1-T2, IFT, HOAG, DrMAD
  • PyTorch Native: Direct integration with torch.nn.Module (see the sketch after this list)
  • Memory Efficient: Checkpointing & implicit differentiation
  • Scalable: From laptop to cluster with PyTorch backend
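As a rough illustration of the "PyTorch Native" point, the snippet below shows the general pattern such a library builds on: hyperparameters are kept as ordinary differentiable tensors next to a torch.nn.Module, so autograd can produce gradients for them just like for model weights. All names here (hyperparams, inner_opt, outer_opt) are illustrative assumptions, not HippoTrainer's actual API.

```python
import torch
import torch.nn as nn

# Illustrative pattern only -- names are assumptions, not HippoTrainer's API.
# Hyperparameters live as ordinary differentiable tensors next to the model,
# so autograd can compute gradients for them just like for model weights.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
hyperparams = {
    "log_weight_decay": torch.tensor(-4.0, requires_grad=True),
    "log_lr": torch.tensor(-2.3, requires_grad=True),
}

# Model weights and hyperparameters are updated by separate optimizers:
# an inner loop for the weights, an outer loop for the hyperparameters.
inner_opt = torch.optim.SGD(model.parameters(), lr=1e-2)
outer_opt = torch.optim.Adam(list(hyperparams.values()), lr=1e-3)
```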

📜 Algorithms

  • T1-T2 (Paper): One-step unrolled optimization (illustrated in the sketch after this list)
  • IFT (Paper): Leveraging Neumann series approximation for implicit differentiation
  • HOAG (Paper): Implicit differentiation via conjugate gradient
  • DrMAD (Paper): Memory-efficient piecewise-linear backpropagation
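To make the T1-T2 idea concrete, here is a minimal plain-PyTorch sketch (illustrative, not the library's implementation): one differentiable training step is taken, the validation loss is evaluated at the updated weights, and autograd backpropagates through that single unrolled step to a regularization hyperparameter.

```python
import torch
import torch.nn.functional as F

# Minimal T1-T2-style sketch in plain PyTorch (not HippoTrainer's code):
# one differentiable training step, then the hypergradient of the
# validation loss with respect to a log L2-regularization strength.
torch.manual_seed(0)
w = torch.randn(10, 1, requires_grad=True)         # model parameters
log_lam = torch.tensor(-2.0, requires_grad=True)   # hyperparameter: log L2 strength

x_tr, y_tr = torch.randn(64, 10), torch.randn(64, 1)  # toy training batch
x_va, y_va = torch.randn(64, 10), torch.randn(64, 1)  # toy validation batch
lr = 0.1

# Inner step, kept differentiable w.r.t. the hyperparameter (create_graph=True).
train_loss = F.mse_loss(x_tr @ w, y_tr) + torch.exp(log_lam) * w.pow(2).sum()
(grad_w,) = torch.autograd.grad(train_loss, w, create_graph=True)
w_next = w - lr * grad_w                           # one unrolled update

# Validation loss at the updated parameters depends on log_lam through w_next.
val_loss = F.mse_loss(x_va @ w_next, y_va)

# Hypergradient: d(val_loss) / d(log_lam) through the unrolled step.
(hypergrad,) = torch.autograd.grad(val_loss, log_lam)
print(float(hypergrad))
```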

🤝 Contributors

  • Daniil Dorin (Basic code writing, Final demo, Algorithms)
  • Igor Ignashin (Project wrapping, Documentation writing, Algorithms)
  • Nikita Kiselev (Project planning, Blog post, Algorithms)
  • Andrey Veprikov (Tests writing, Documentation writing, Algorithms)
  • We welcome contributions!

📄 License

HippoTrainer is MIT licensed. See LICENSE for details.
