
Optuna: Hyperparameter optimization framework for machine learning

Define-by-run Python framework for automated hyperparameter tuning.

Live rankings (as of 07:01 AM, steady):
  • Overall rank: #224
  • AI & ML rank: #76
  • Stars: 13.5K (+30 in the last 7 days)
  • Forks: 1.2K (+3 in the last 7 days)

Learn more about optuna

Optuna is a hyperparameter optimization framework written in Python that automates the process of finding optimal hyperparameter values for machine learning models. It employs a define-by-run programming style where search spaces are constructed dynamically at runtime using standard Python syntax, including conditionals and loops. The framework implements state-of-the-art sampling algorithms and trial pruning strategies to reduce computational overhead. Optuna supports distributed optimization across multiple workers and is commonly used in machine learning pipelines, AutoML systems, and research workflows where hyperparameter tuning is required.

1. Define-by-run API

Search spaces are constructed dynamically using imperative Python code rather than static configuration, allowing conditional parameters and loops within the optimization logic. This approach provides modularity and flexibility compared to declarative search space definitions.

2. Distributed optimization

The framework supports scaling studies across multiple workers with minimal code changes, enabling parallel trial execution on local machines or distributed systems. This architecture allows efficient utilization of computational resources for large-scale optimization tasks.

3. Algorithm flexibility

Optuna includes multiple sampling strategies such as Tree-structured Parzen Estimator, Gaussian Process-based sampling, and supports multi-objective and constrained optimization. Users can select or customize algorithms based on their optimization problem characteristics.


import optuna

def objective(trial):
    # Sample a value for x from the range [-10, 10].
    x = trial.suggest_float('x', -10, 10)
    # Quadratic to minimize; the optimum is at x = 2.
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

print(f"Best value: {study.best_value}")
print(f"Best params: {study.best_params}")

v4.7.0

This is the release note of v4.7.0.

  • SPEA-II: https://hub.optuna.org/samplers/speaii/
  • HypE: https://hub.optuna.org/samplers/hype/
  • Introduce stacklevel-aware custom warnings (#6293)
  • Cache distributions to skip consistency check (#6301)
  • Add warnings when `JournalStorage` lock acquisition is delayed (#6361)
v4.6.0

This is the release note of v4.6.0.

  • Drop Python 3.8 & Support Python 3.13
  • Change `TrialState.__repr__` and `TrialState.__str__` (#6281)
  • Drop Python 3.8 (#6302)
  • Use iterator for lazy evaluation in journal storage’s `read_logs` (#6144)
  • Cache pair-wise distances to speed up `GPSampler` (#6244)
v4.5.0

This is the release note of v4.5.0.

  • Add `ConstrainedLogEHVI` (#6198)
  • Add support for constrained multi-objective optimization in `GPSampler` (#6224)
  • Support 1D Search Spaces in `CmaEsSampler` (#6228)
  • Move `optuna.lightgbmtuner` module
  • Fix numerical issue warning on `qehvi_candidates_func`
