Installation
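Optuna is available on PyPI; a typical install (assuming pip and a supported Python) looks like:

```
$ pip install optuna
```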
Quick Start
Let's try a very simple optimization problem.
- Define an `objective` function to be optimized. In this example, we'll minimize `(x - 2)^2`.
- Suggest hyperparameter values using a `trial` object. Here, a float value of `x` is suggested from `-10` to `10`.
- Create a `study` object and invoke the `optimize` method. Then, you can get the best configuration among 100 trials, as in the sketch below.
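Putting the three steps together (a minimal sketch; the exact values printed will vary from run to run):

```python
import optuna

def objective(trial):
    # Step 2: suggest a float value of x in [-10, 10].
    x = trial.suggest_float("x", -10, 10)
    # Step 1: the objective to minimize, (x - 2)^2.
    return (x - 2) ** 2

# Step 3: create a study and run 100 trials.
study = optuna.create_study()
study.optimize(objective, n_trials=100)
print(study.best_params)  # best configuration found, e.g. x close to 2
```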
Key Features
1. Define-by-Run
Existing frameworks define the search space and the objective function separately. In Optuna, the search space is defined inside the objective function, and hyperparameters are constructed on the fly as each trial runs. This define-by-run style makes code written with Optuna more modular and easier to modify.
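For instance, the search space can branch on an earlier suggestion, so different trials may have different parameters. A minimal sketch (the parameter names and the toy objective are illustrative):

```python
import optuna

def objective(trial):
    # The search space is built while the trial runs: the branch taken
    # below decides which hyperparameters exist for this trial.
    use_penalty = trial.suggest_categorical("use_penalty", [True, False])
    x = trial.suggest_float("x", -10, 10)
    value = (x - 2) ** 2
    if use_penalty:
        # This parameter is only suggested on the penalized branch.
        weight = trial.suggest_float("weight", 1e-4, 1.0, log=True)
        value += weight * abs(x)
    return value

study = optuna.create_study()
study.optimize(objective, n_trials=30)
```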
2. Parallel distributed optimization
Figure: Effect of parallelization sizes of 1, 2, 4, and 8.
Optuna can parallelize your optimization with near-linear scalability. To set up parallelization, users simply execute multiple optimization processes, and Optuna automatically shares trials among them in the background.
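For example, launching the same script from several shells makes each process a worker on a shared study; a sketch assuming an illustrative SQLite URL and study name:

```python
import optuna

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

# Run this same script in multiple processes; they coordinate through
# the shared storage. The study name and storage URL are illustrative.
study = optuna.create_study(
    study_name="shared-study",
    storage="sqlite:///example.db",
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)
```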
3. Pruning of unpromising trials
Figure: Learning curves of pruned and completed trials.
The pruning feature automatically stops unpromising trials at an early stage of training (a.k.a. automated early stopping). Optuna provides interfaces to concisely implement this pruning mechanism in iterative training algorithms.
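The pattern is to report an intermediate value at each iteration and raise `optuna.TrialPruned` when the pruner says to stop. A toy sketch, with a synthetic loop standing in for real training:

```python
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    loss = 10.0
    for step in range(100):
        loss *= 1.0 - lr  # stand-in for one training iteration
        # Report the intermediate value; the pruner compares it with
        # other trials at the same step and may stop this trial early.
        trial.report(loss, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return loss

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)
```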
Performance
Figure: Comparison of Optuna with Hyperopt, which has no pruning mechanism.
Based on a Bayesian optimization algorithm, Optuna accelerates your hyperparameter search. The pruning and parallelization features help you evaluate a large number of hyperparameter combinations in a short time.
For instance, our benchmark experiment demonstrates the advantage of the pruning feature in comparison with an existing optimization framework.