Installation

% pip install optuna

Optuna supports Python 2.7, 3.4, 3.5, 3.6, and 3.7.
We recommend installing Optuna via pip.

Quick Start


import optuna

# 1. Define an objective function to be optimized.
def objective(trial):
    # 2. Suggest values for the hyperparameters using the trial object.
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

# 3. Create a study object and invoke the optimize method.
study = optuna.create_study()
study.optimize(objective, n_trials=100)

print(study.best_params)

Let's try a very simple optimization problem.

  1. Define an objective function to be optimized. In this example, we'll minimize (x - 2)^2.
  2. Suggest hyperparameter values using a trial object. Here, a float value of x is suggested from -10 to 10.
  3. Create a study object and invoke the optimize method. You can then get the best configuration among the 100 trials.
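
Beyond best_params, the study object records the full optimization history. A minimal sketch of inspecting the results, continuing the example above:

print(study.best_value)   # best objective value found, close to 0 here
print(study.best_trial)   # the trial that achieved it
print(len(study.trials))  # number of trials run, 100 here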

Key Features

1. Define-by-Run

Existing frameworks define the search space and the objective function separately. In Optuna, the search space is defined inside the objective function, and all hyperparameters are determined on the fly as the function runs. This define-by-run style makes code written with Optuna more modular and easier to modify.
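
For instance, the search space can branch on an earlier suggestion. Below is a minimal sketch of such a conditional space, assuming scikit-learn is installed; the classifiers and parameter ranges are illustrative:

import sklearn.datasets
import sklearn.ensemble
import sklearn.model_selection
import sklearn.svm

import optuna

def objective(trial):
    x, y = sklearn.datasets.load_iris(return_X_y=True)

    # Which hyperparameters exist depends on the classifier suggested here.
    classifier_name = trial.suggest_categorical('classifier', ['SVC', 'RandomForest'])
    if classifier_name == 'SVC':
        # This part of the search space is only reached when 'SVC' is chosen.
        svc_c = trial.suggest_loguniform('svc_c', 1e-10, 1e10)
        classifier = sklearn.svm.SVC(C=svc_c, gamma='auto')
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        classifier = sklearn.ensemble.RandomForestClassifier(
            max_depth=rf_max_depth, n_estimators=10)

    score = sklearn.model_selection.cross_val_score(classifier, x, y, cv=3)
    return score.mean()

study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=100)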

2. Parallel distributed optimization

Figure: Effect of parallelization sizes of 1, 2, 4, and 8.

Optuna can parallelize your optimization with near-linear scalability. To set up parallelization, users simply launch multiple optimization processes, and Optuna automatically shares trials among them in the background.
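
One way to do this, sketched below with an illustrative study name and SQLite URL, is to point every process at the same storage:

import optuna

def objective(trial):
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

# Every process that passes the same study name and storage URL joins the
# same study; load_if_exists lets later processes attach to the study
# created by the first one instead of failing.
study = optuna.create_study(
    study_name='distributed-example',  # illustrative name
    storage='sqlite:///example.db',    # illustrative shared storage
    load_if_exists=True)
study.optimize(objective, n_trials=100)

Running this script in several terminals at once yields that many workers, each contributing its trials to the shared study.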

3. Pruning of unpromising trials

Figure: Learning curves of pruned and completed trials.

The pruning feature automatically stops unpromising trials at the early stages of training (a.k.a. automated early stopping). Optuna provides interfaces to concisely implement the pruning mechanism in iterative training algorithms.
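
The pattern is to report an intermediate value at each training step and ask the trial whether to continue. The sketch below substitutes a toy loop for real training; it assumes a recent Optuna in which trial.report, trial.should_prune, and optuna.TrialPruned are available:

import optuna

def objective(trial):
    lr = trial.suggest_loguniform('lr', 1e-5, 1e-1)

    error = 1.0
    for step in range(100):
        error *= 1.0 - lr  # toy stand-in for one epoch of training

        # Report the intermediate value; the pruner stops the trial if it
        # compares poorly with other trials at the same step.
        trial.report(error, step)
        if trial.should_prune():
            raise optuna.TrialPruned()

    return error

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=100)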

Performance

Figure: Comparison of Optuna with Hyperopt, which has no pruning mechanism.

Based on a Bayesian optimization algorithm, Optuna accelerates your hyperparameter search. The pruning and parallelization features help you try out a large number of hyperparameter combinations in a short time.
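
As a minimal sketch, both the sampler (TPE, the Bayesian optimization algorithm used by default) and the pruner can be chosen explicitly when creating a study:

import optuna

# TPESampler is the default sampler; MedianPruner stops trials whose
# intermediate values fall below the median of previous trials at the
# same step.
study = optuna.create_study(
    sampler=optuna.samplers.TPESampler(),
    pruner=optuna.pruners.MedianPruner())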

For instance, our benchmark experiment demonstrates the advantage of the pruning feature in comparison with an existing optimization framework.