Hyper-Parameter Optimization

This example is based on the Optuna quick start example. Optuna is an open-source hyperparameter optimization framework that is easy to use:

import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

study.best_params  # E.g. {'x': 2.002108042}

The example above creates a study object that searches for the best value of the parameter x, i.e. the one that minimizes the objective function (x - 2)^2.
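Besides best_params, the study object also exposes the best objective value and the full record of the best trial through standard Optuna attributes:

print(study.best_value)  # smallest objective value found, e.g. ~4.44e-06 for the best_params above
print(study.best_trial)  # the full record of the trial that produced it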

Parameter Searching with HyperParameter

Parameter searching can be much easier with HyperParameter:

import optuna
import hyperparameter as hp
from hyperparameter import lazy_dispatch

@hp.param
def objective(x = 0.0):
    return (x - 2) ** 2

def wrapper(trial):
    # adapt optuna's trial API to the scope API (see explanation below)
    trial = lazy_dispatch(trial)
    with hp.scope(**{
        "objective.x": trial.suggest_float('objective.x', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective.x': 2.002108042}

We apply the param decorator directly to the objective function so that it accepts parameters from scope. We then define a wrapper function that adapts the scope API to Optuna's trial API (lazy_dispatch wraps the trial object for this purpose) and run the parameter experiment as suggested in Optuna's example.
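To see the effect of scope in isolation, here is a minimal sketch reusing the objective defined above (assuming the decorated function behaves as described): outside any scope it falls back to its default argument, while a scope temporarily overrides the value.

print(objective())  # no scope: uses the default x=0.0 => (0.0 - 2) ** 2 == 4.0

with hp.scope(**{"objective.x": 3.0}):
    print(objective())  # scoped: x is overridden to 3.0 => (3.0 - 2) ** 2 == 1.0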

Put the Best Parameters into Production

To put the best parameters into production, we can pass them directly to scope. This works because each trial parameter was named after its scoped counterpart (e.g. objective.x), so study.best_params can be unpacked straight into scope. This is very convenient when putting an ML model into production.

with hp.scope(**study.best_params):
    print(f"{study.best_params} => {objective()}")
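In practice the tuning job and the serving process are often separate programs. One simple sketch is to persist the parameters as JSON and reload them at serving time (the file name best_params.json is illustrative, and we assume the serving side imports the same decorated objective):

import json

# after the search: persist the tuned parameters
with open("best_params.json", "w") as f:
    json.dump(study.best_params, f)

# in the serving process: reload and apply them via scope
with open("best_params.json") as f:
    params = json.load(f)

with hp.scope(**params):
    result = objective()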

Optimization on Nested Functions

scope and param also support more complex problems in which the objective is composed of nested functions:

@hp.param
def objective_x(x = 0.0):
    return (x - 2) ** 2

@hp.param
def objective_y(y = 0.0):
    return (y - 1) ** 3

def objective():
    return objective_x() * objective_y()

def wrapper(trial):
    # same adaptation as before, now suggesting both nested parameters
    trial = lazy_dispatch(trial)
    with hp.scope(**{
        "objective_x.x": trial.suggest_float('objective_x.x', -10, 10),
        "objective_y.y": trial.suggest_float('objective_y.y', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective_x.x': ..., 'objective_y.y': ...}
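As in the production example above, the best parameters can be replayed directly through scope, since the trial names match the scoped parameter names:

with hp.scope(**study.best_params):
    print(f"{study.best_params} => {objective()}")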