
Optuna: Hyperparameter optimization framework for machine learning

Define-by-run Python framework for automated hyperparameter tuning.

LIVE RANKINGS • 12:29 PM • STEADY: overall #190 · AI & ML #67
Stars: 13.6K (+55 over 7 days) · Forks: 1.3K (+12 over 7 days)

Learn more about Optuna

Optuna is a hyperparameter optimization framework written in Python that automates the search for optimal hyperparameter values in machine learning models. It uses a define-by-run programming style: search spaces are constructed dynamically at runtime with standard Python syntax, including conditionals and loops. The framework implements state-of-the-art sampling algorithms and trial pruning strategies that terminate unpromising trials early to save compute. Optuna supports distributed optimization across multiple workers and is widely used in machine learning pipelines, AutoML systems, and research workflows that require hyperparameter tuning.

1. Define-by-run API

Search spaces are constructed dynamically using imperative Python code rather than static configuration, allowing conditional parameters and loops within the optimization logic. This approach provides modularity and flexibility compared to declarative search space definitions.

2. Distributed optimization

The framework supports scaling studies across multiple workers with minimal code changes, enabling parallel trial execution on local machines or distributed systems. This architecture allows efficient utilization of computational resources for large-scale optimization tasks.

3. Algorithm flexibility

Optuna includes multiple sampling strategies such as Tree-structured Parzen Estimator, Gaussian Process-based sampling, and supports multi-objective and constrained optimization. Users can select or customize algorithms based on their optimization problem characteristics.


import optuna

# Minimize (x - 2)^2 over x in [-10, 10]; the optimum is x = 2.
def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

print(f"Best value: {study.best_value}")
print(f"Best params: {study.best_params}")

v4.7.0

This release introduces stacklevel-aware warnings, performance improvements, and fixes for TPESampler distribution handling.

  • Introduce stacklevel-aware custom warnings
  • Cache distributions to skip consistency check
  • Add warnings when JournalStorage lock acquisition is delayed
  • Add support for local HPI in PED-ANOVA
  • Fix log PDF of discrete trunc log-norm distribution for TPESampler
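To illustrate what "stacklevel-aware" warnings mean in general Python terms (this is a generic sketch, not Optuna's actual implementation): `stacklevel=2` attributes a warning to the caller of the warning helper rather than to the helper itself, so users see their own line in the report.

```python
import warnings

def deprecated_helper():
    # stacklevel=2 makes the warning point at the line that called
    # deprecated_helper(), not at this warnings.warn() line.
    warnings.warn(
        "deprecated_helper is deprecated", DeprecationWarning, stacklevel=2
    )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    deprecated_helper()  # the reported location is this call site
```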
v4.6.0

This release drops Python 3.8 support, adds Python 3.13 support, and includes performance improvements for storage and sampling.

  • Drop Python 3.8 & Support Python 3.13
  • Change TrialState.__repr__ and TrialState.__str__
  • Use iterator for lazy evaluation in journal storage's read_logs
  • Cache pair-wise distances to speed up GPSampler
v4.5.0

This release adds constrained multi-objective optimization support and improves CMA-ES sampler capabilities.

  • Add ConstrainedLogEHVI
  • Add support for constrained multi-objective optimization in GPSampler
  • Support 1D Search Spaces in CmaEsSampler
  • Move optuna.lightgbmtuner module
  • Fix numerical issue warning on qehvi_candidates_func
