Optuna: Hyperparameter optimization framework for machine learning
Define-by-run Python framework for automated hyperparameter tuning.
Optuna is a hyperparameter optimization framework written in Python that automates the process of finding optimal hyperparameter values for machine learning models. It employs a define-by-run programming style where search spaces are constructed dynamically at runtime using standard Python syntax, including conditionals and loops. The framework implements state-of-the-art sampling algorithms and trial pruning strategies to reduce computational overhead. Optuna supports distributed optimization across multiple workers and is commonly used in machine learning pipelines, AutoML systems, and research workflows where hyperparameter tuning is required.

Define-by-run API
Search spaces are constructed dynamically with imperative Python code rather than a static configuration, so conditionals and loops can determine which parameters are sampled at runtime. This provides greater modularity and flexibility than declarative search-space definitions.
Distributed optimization
The framework supports scaling studies across multiple workers with minimal code changes, enabling parallel trial execution on local machines or distributed systems. This architecture allows efficient utilization of computational resources for large-scale optimization tasks.
Algorithm flexibility
Optuna includes multiple sampling strategies such as Tree-structured Parzen Estimator, Gaussian Process-based sampling, and supports multi-objective and constrained optimization. Users can select or customize algorithms based on their optimization problem characteristics.
import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)
print(f"Best value: {study.best_value}")
print(f"Best params: {study.best_params}")

This is the release note of v4.7.0.
- SPEA-II: https://hub.optuna.org/samplers/speaii/
- HypE: https://hub.optuna.org/samplers/hype/
- Introduce stacklevel-aware custom warnings (#6293)
- Cache distributions to skip consistency check (#6301)
- Add warnings when `JournalStorage` lock acquisition is delayed (#6361)
This is the release note of v4.6.0.
- Drop Python 3.8 & Support Python 3.13
- Change `TrialState.__repr__` and `TrialState.__str__` (#6281)
- Drop Python 3.8 (#6302)
- Use iterator for lazy evaluation in journal storage's `read_logs` (#6144)
- Cache pair-wise distances to speed up `GPSampler` (#6244)
This is the release note of v4.5.0.
- Add `ConstrainedLogEHVI` (#6198)
- Add support for constrained multi-objective optimization in `GPSampler` (#6224)
- Support 1D Search Spaces in `CmaEsSampler` (#6228)
- Move `optuna.lightgbmtuner` module
- Fix numerical issue warning on `qehvi_candidates_func`
Related Repositories
Discover similar tools and frameworks used by developers
- CodeFormer: Transformer-based face restoration using vector-quantized codebook lookup.
- Codex CLI: OpenAI's command-line coding assistant that runs locally with ChatGPT integration for terminal use.
- ollama: Go-based CLI for local LLM inference and management.
- Chart-GPT: AI tool that generates charts from natural language text descriptions.
- xformers: Memory-efficient PyTorch components for transformer architectures.