% pip install optuna
```python
import optuna

def objective(trial):
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)
print(study.best_params)
```
Let's try a very simple optimization problem.

- Define the `objective` function to be optimized. In this example, we'll minimize (x - 2)^2.
- Suggest hyperparameter values using the `trial` object. Here, a float value of `x` is suggested from -10 to 10.
- Create a `study` object and invoke the `optimize` method. Then, you can get the best configuration among 100 trials.
Existing frameworks define the search space and the objective function separately. In Optuna, the search space is defined inside the objective function, and all hyperparameters are determined dynamically at runtime. This feature makes code written with Optuna more modular and easier to modify.
2. Parallel distributed optimization
Effect of parallelization sizes of 1, 2, 4, and 8.
Optuna can parallelize your optimization with near-linear scalability. To set up parallelization, users simply execute multiple optimization processes, and Optuna automatically shares trials among them in the background.
3. Pruning of unpromising trials
Learning curves of pruned and completed trials.
The pruning feature automatically stops unpromising trials in the early stages of training (a.k.a. automated early stopping). Optuna provides interfaces to concisely implement the pruning mechanism in iterative training algorithms.
Comparison of Optuna with Hyperopt, which has no pruning mechanism.
Based on a Bayesian optimization algorithm, Optuna accelerates your hyperparameter search. The pruning and parallelization features help you try out a large number of hyperparameter combinations in a short time.
For instance, our benchmark experiment demonstrates the advantage of the pruning feature in comparison with an existing optimization framework.