Customizing the Optimization Cycle¶
When you call the optimize() method of the Optimizer class, the steps that make up the optimization cycle are executed repeatedly, and several models are involved at each step.
This section explains these models and steps, and provides a guide to implementing a custom optimization cycle.
Overview of the optimization cycle¶
The diagram below illustrates the processing steps within each optimization cycle and the models considered at each step, as described in the following sections.
Three models considered in the optimization cycle¶
Each optimization cycle repeatedly executes a series of processing steps based on the following three models:
Black-box function (Optimizer.blackbox)
The target objective function to be optimized (e.g., a simulation or experiment), as defined in Defining the Objective Function.
Surrogate model (Optimizer.surrogate_model)
A surrogate model of the black-box function, constructed from the training data.
Amplify model (Optimizer.amplify_model)
A logical-model representation of the surrogate model for use with the Amplify SDK. Mathematically it is identical to the surrogate model, but its input variables are replaced with the decision variables handled by the Amplify SDK.
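To make the relationship between the black-box function and its surrogate concrete, here is a minimal, library-independent sketch in plain Python. The quadratic least-squares fit and all names (blackbox, solve_linear, fit inputs) are illustrative assumptions for this sketch, not Amplify-BBOpt API:

```python
def blackbox(x: float) -> float:
    """Stand-in for an expensive objective (e.g., a simulation or experiment)."""
    return (x - 3.0) ** 2 + 1.0


def solve_linear(A, b):
    """Naive Gaussian elimination with partial pivoting (for the small fit below)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x


# Training data: a few evaluated input/output pairs of the black-box function
xs = [0.0, 2.0, 5.0]
ys = [blackbox(x) for x in xs]

# Surrogate model: fit y = a*x^2 + b*x + c through the training data
a, b, c = solve_linear([[x * x, x, 1.0] for x in xs], ys)

# Minimizing the surrogate (here analytically) suggests the next input to try
x_min = -b / (2 * a)
```

Here the surrogate happens to recover the black-box exactly because both are quadratic; in practice the surrogate only approximates the black-box near the training data, and its minimizer is merely a promising candidate for the next evaluation.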
Processing steps in the optimization cycle¶
Each optimization cycle executed by the optimize() method consists of the following steps.
Step 1: Build the surrogate model function¶
Run the train_surrogate() method to construct the surrogate model function surrogate_model based on the training data.
Step 2: Optimize the surrogate model function¶
Run minimize_surrogate() to obtain input values (“potentially-optimal” input) that minimize the constructed surrogate model function using the Ising machine. During this optimization, any constraints specified when instantiating the Optimizer are also taken into account. The return value is a tuple consisting of a list of solutions (Solution) and an amplify.Result object from the Amplify SDK. The solutions are ordered by quality, with the best solution obtained from the Ising machine stored as the first element of the list. Other elements in the list are used for the deduplication process described below.
Note
If a “potentially-optimal” solution obtained from the Ising machine already exists in the training data (i.e., the black-box function has already been evaluated for that input), re-evaluating the input would be a waste. Moreover, storing duplicate samples in the training dataset is undesirable for constructing the surrogate model.
Amplify-BBOpt therefore employs, by default, the following mechanism to avoid duplicate solutions whenever possible:
(a) Selection of a unique solution (find_unique_solution() method)
The find_unique_solution() method selects the best unique solution, i.e., one not yet present in the training dataset, from the list of solutions (Solution list) returned by minimize_surrogate(). If no such solution is found, the method returns None.
(b) Generation of a fallback solution (fallback_solution() method)
If no unique solution is found by find_unique_solution(), the fallback_solution() method is executed. This method uses the Ising machine to generate a fallback solution that satisfies the user-defined constraints, is both unique and feasible, and lies near the provided reference solution.
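The select-then-fall-back logic can be sketched library-independently as follows. find_unique and make_fallback are illustrative stand-ins for find_unique_solution() and fallback_solution(); the list-based solutions and the simple perturbation fallback are assumptions for this sketch, not how Amplify-BBOpt (which uses the Ising machine for the fallback) actually implements it:

```python
def find_unique(solutions, dataset):
    """Return the best solution not already in the training dataset, else None."""
    seen = {tuple(s) for s in dataset}
    for sol in solutions:  # solutions are ordered best-first
        if tuple(sol) not in seen:
            return sol
    return None


def make_fallback(reference, seen, step=1):
    """Toy fallback: perturb the reference solution until an unseen point is found."""
    for delta in range(1, 100):
        for cand in ([reference[0] + delta * step] + reference[1:],
                     [reference[0] - delta * step] + reference[1:]):
            if tuple(cand) not in seen:
                return cand
    raise RuntimeError("no unseen fallback found")


dataset = [[0, 1], [1, 1]]  # inputs already evaluated by the black-box function

# Case (a): the top solution is a duplicate, so the next-best unique one is chosen
best = find_unique([[0, 1], [2, 3]], dataset)

# Case (b): every candidate is a duplicate, so a nearby unseen point is generated
seen = {tuple(s) for s in dataset}
fallback = make_fallback([0, 1], seen)
```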
Step 3: Update the Dataset¶
Use the evaluate_objective() method to evaluate the black-box function at the new solution (“potentially-optimal” input). If the new solution duplicates an existing sample in the training dataset, the corresponding stored output value is reused. Then, add the input and its corresponding output as a new input–output pair to the training data using add_solution().
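Reusing the stored output for a duplicate input amounts to memoizing the expensive black-box evaluation. A minimal sketch of that behavior (all names here are illustrative, not the Amplify-BBOpt API):

```python
calls = 0


def expensive_blackbox(x):
    """Stand-in for a costly simulation or experiment; counts its invocations."""
    global calls
    calls += 1
    return x * x


dataset = {}  # training data: input -> recorded output


def evaluate(x):
    """Evaluate the black-box only for inputs not already in the dataset."""
    if x not in dataset:
        dataset[x] = expensive_blackbox(x)
    return dataset[x]


evaluate(3)
evaluate(3)  # duplicate input: the stored output is reused, no new evaluation
evaluate(4)
```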
Step 4 (optional): Update the Optimization History¶
At the end of each optimization cycle, you can record the solution information obtained in that cycle as an IterationResult and append it to history. This step does not affect the optimization itself.
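Recording the history amounts to appending one record per cycle. A minimal sketch with a plain dataclass standing in for IterationResult (the field names are assumptions for illustration, not the actual IterationResult attributes):

```python
from dataclasses import dataclass


@dataclass
class CycleRecord:
    """Illustrative stand-in for Amplify-BBOpt's IterationResult."""
    iteration: int
    solution: list
    objective: float


history = []

# At the end of each cycle, append that cycle's solution and objective value
for i, (sol, obj) in enumerate([([0, 1], 2.0), ([1, 1], 1.5)]):
    history.append(CycleRecord(iteration=i, solution=sol, objective=obj))

# The history can then be inspected after the run, e.g., to find the best cycle
best = min(history, key=lambda r: r.objective)
```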
Implementing a custom optimization cycle¶
By combining the Optimizer class methods described above, you can implement your own optimization cycle as an alternative to optimize(). This allows you to flexibly apply and test various black-box optimization strategies. A basic implementation example is shown below:
from amplify_bbopt import Optimizer

# Instantiate the optimizer class
optimizer = Optimizer(
    blackbox_func,
    trainer,
    client,
)

# Generate and add 10 initial training samples
optimizer.add_random_training_data(num_data=10)

# Run 10 optimization cycles
# (a simplified version of optimizer.optimize(10):
#  no uniqueness check, fallback solution, or history update)
num_cycles = 10
for n_iter in range(num_cycles):
    # Step 1: Build the surrogate model function
    optimizer.train_surrogate()

    # Step 2: Optimize the surrogate model function using the Ising machine (minimization)
    solutions, _ = optimizer.minimize_surrogate()
    if len(solutions) == 0:
        raise RuntimeError(f"No feasible solution was found in iteration {n_iter}")

    # Step 3: Update the dataset
    # (always adopt the best solution, solutions[0])
    new_solution = solutions[0]
    new_objective = optimizer.evaluate_objective(new_solution)
    optimizer.add_solution(new_solution, new_objective)