Running the Optimization¶
The Optimizer class optimizes a pre-defined black-box function (defined as in Defining the Objective Function) according to the optimization process flow. Follow the steps below to prepare the components, instantiate the optimizer, generate and add initial training data, and then run the optimization.
Preparation for optimization¶
You can configure the black-box function, surrogate model function, and solver client as follows.
Black-box function¶
You can define a black-box function with the @blackbox decorator as described in “Defining the Objective Function”. Here, we will refer to the black-box function created in this way as my_blackbox_func.
Surrogate model function¶
A surrogate model function class provides a polynomial surrogate of the black-box function and utilities for building that model. As the model function, choose one of the surrogate models supported by Amplify-BBOpt, such as the Factorization Machine (FM) or the Kernel Model.
Below is an example using FM (FMTrainer). Beyond instantiation, you can configure various settings related to the model function and how the model is constructed.
from amplify_bbopt import FMTrainer
my_trainer = FMTrainer()
my_trainer.epochs = 100 # Set the number of epochs to 100
my_trainer.optimizer_params = {"lr": 1} # Set the learning rate to 1
my_trainer.lr_scheduler_class = None # Turn off learning rate scheduling
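As background, a factorization machine models the objective as a quadratic polynomial whose pairwise coefficients are inner products of low-rank factor vectors. The following is a toy numpy sketch of the FM prediction formula only, not FMTrainer's internals; all names (fm_predict, w0, w, V) are illustrative.

```python
import numpy as np

# Toy sketch of what a factorization machine computes (not the library's code):
#   f(x) = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j
# The pairwise term uses rank-k factor vectors v_i instead of free coefficients.

rng = np.random.default_rng(0)
d, k = 4, 2                  # input dimension, factorization rank
w0 = 0.5                     # bias term
w = rng.normal(size=d)       # linear weights
V = rng.normal(size=(d, k))  # factor vectors, one row per input dimension

def fm_predict(x: np.ndarray) -> float:
    # Pairwise interactions weighted by inner products of factor rows
    pairwise = sum(
        (V[i] @ V[j]) * x[i] * x[j]
        for i in range(d) for j in range(i + 1, d)
    )
    return w0 + w @ x + pairwise

x = np.array([1.0, 0.0, 1.0, 1.0])
print(fm_predict(x))
```

The low-rank structure is what keeps the number of trainable parameters linear in the input dimension while still capturing all pairwise interactions.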
Note
When “Factorization Machine” is used as the surrogate model function, the black-box optimization method is called Factorization Machine with Quadratic-optimization Annealing (FMQA). When “Kernel Model” is used, the method is called polynomial-based Kernels with Quadratic-optimization Annealing (Kernel-QA).
FMQA: Kitai, K., Guo, J., Ju, S., Tanaka, S., Tsuda, K., Shiomi, J., Tamura, R.: Designing metamaterials with quantum annealing and factorization machines. Phys. Rev. Res. 2, 013319 (2020).
Kernel-QA: Minamoto, Y., Sakamoto, Y.: A black-box optimization method with polynomial-based kernels and quadratic-optimization annealing. arXiv:2501.04225 (2025).
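The reason an FM surrogate pairs naturally with quadratic-optimization annealing: over binary variables (where x_i² = x_i), a factorization machine is exactly a quadratic polynomial, i.e. a QUBO that an Ising machine can minimize. A toy numerical check of this equivalence, not the library's code (fm, Q, and all parameters are illustrative):

```python
import numpy as np

# Toy check: an FM restricted to binary inputs equals a QUBO.
rng = np.random.default_rng(1)
d, k = 5, 3
w0 = rng.normal()
w = rng.normal(size=d)
V = rng.normal(size=(d, k))

def fm(x):
    pair = sum((V[i] @ V[j]) * x[i] * x[j]
               for i in range(d) for j in range(i + 1, d))
    return w0 + w @ x + pair

# Equivalent upper-triangular QUBO matrix: Q[i, i] = w_i (valid because
# x_i**2 == x_i for binary x_i), Q[i, j] = <v_i, v_j> for i < j;
# the bias w0 is a constant offset that does not affect the minimizer.
Q = np.diag(w.copy())
for i in range(d):
    for j in range(i + 1, d):
        Q[i, j] = V[i] @ V[j]

x = rng.integers(0, 2, size=d).astype(float)
assert abs(fm(x) - (w0 + x @ Q @ x)) < 1e-9
```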
Solver client¶
During the optimization cycle, Amplify-BBOpt optimizes the surrogate model function by running an Ising machine via the Amplify SDK.
To do this, you need to create a solver client – an abstraction for each solver – and specify the endpoint, API token, and execution parameters. Choose one Ising machine (solver client) from the solvers supported by the Amplify SDK, and configure its API token and parameters. Below is an example using the Fixstars Amplify Annealing Engine (Amplify AE):
from datetime import timedelta
from amplify import FixstarsClient
my_client = FixstarsClient()
my_client.token = "ENTER_YOUR_API_TOKEN" # Set your API token
my_client.parameters.timeout = timedelta(seconds=1) # Set annealing time to 1 second
Running optimization with the optimizer class¶
Based on the previously created my_blackbox_func, my_trainer and my_client, you can instantiate the Optimizer class as shown below.
Then, optionally add a small amount of initial training data, and run the optimization cycle using Optimizer.optimize.
from amplify_bbopt import Optimizer
# Instantiate the optimizer class
optimizer = Optimizer(
    blackbox=my_blackbox_func,  # Black-box function defined with @blackbox
    trainer=my_trainer,  # Instance of the surrogate model function class
    client=my_client,  # Solver client for the Ising machine
)
# Generate and add 10 random samples as initial training data
optimizer.add_random_training_data(num_data=10)
# Run 10 optimization cycles
optimizer.optimize(10)
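Conceptually, each call to optimize runs cycles of: train the surrogate on all data so far, minimize the surrogate, evaluate the true black-box function at the proposed point, and add the new pair to the training data. The sketch below shows this process flow with stand-ins only: a toy black-box, a least-squares quadratic surrogate in place of the FM trainer, and brute-force search in place of the Ising machine. It is not Amplify-BBOpt's implementation.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
d = 4
target = np.array([1.0, 0.0, 1.0, 1.0])

def blackbox(x):  # toy objective: squared distance to a hidden target
    return float(np.sum((x - target) ** 2))

candidates = [np.array(b, dtype=float)
              for b in itertools.product([0, 1], repeat=d)]

# Initial training data: random solution-evaluation pairs
X = [candidates[i] for i in rng.choice(len(candidates), size=3, replace=False)]
y = [blackbox(x) for x in X]

def features(x):  # quadratic feature map: (1, x_i, x_i * x_j)
    return np.array([1.0, *x,
                     *[x[i] * x[j] for i in range(d) for j in range(i + 1, d)]])

for cycle in range(10):
    # 1. Train the surrogate on all data collected so far (least squares here)
    A = np.array([features(x) for x in X])
    coef, *_ = np.linalg.lstsq(A, np.array(y), rcond=None)

    def surrogate(x, c=coef):
        return features(x) @ c

    # 2. Minimize the surrogate (the step an Ising machine would perform)
    x_new = min(candidates, key=surrogate)
    # 3. Evaluate the true black-box at the proposed point, add the pair
    X.append(x_new)
    y.append(blackbox(x_new))

best = X[int(np.argmin(y))]
print(best, min(y))
```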
Hint
Using the surrogate_data_transformer argument of Optimizer, you can apply data transformations that improve surrogate model performance. For example, passing ExpScaler automatically applies an exponential transformation to the objective function values. For details, see the “Transforming training data” section of “Improving Surrogate Model Performance”.
from amplify_bbopt import ExpScaler, Optimizer
# Using ExpScaler
optimizer = Optimizer(
    blackbox=my_blackbox_func,
    trainer=my_trainer,
    client=my_client,
    surrogate_data_transformer=ExpScaler(),
)
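For the exact transform ExpScaler applies, see “Improving Surrogate Model Performance”. The toy below only illustrates the general principle behind such target transforms: a monotone (here exponential) rescaling of the objective values preserves the ranking of solutions while changing the value spread the surrogate is trained on. The normalization step is a hypothetical choice for this sketch, not ExpScaler's formula.

```python
import numpy as np

# Toy sketch of a monotone target transform (not ExpScaler's exact formula)
y = np.array([1.0, 1.1, 1.2, 5.0])            # raw objective values
y_norm = (y - y.min()) / (y.max() - y.min())  # hypothetical normalization step
y_scaled = np.exp(y_norm)                     # exponential, monotone increasing

# A monotone transform preserves the ordering of solutions, so the
# location of the best solution is unchanged while the scale the
# surrogate sees is different:
assert list(np.argsort(y)) == list(np.argsort(y_scaled))
print(y_scaled)
```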
Note
Before starting the optimization, you need to add at least one input–output pair as initial training data. As shown in the program above, you can use the add_random_training_data() method of the instantiated Optimizer to add randomly generated solution–evaluation pairs for this purpose.
If you already have existing data or have created training data manually, you can also provide it when instantiating the Optimizer. In that case, pass the training data as a Dataset object to the training_data argument of the constructor.
import numpy as np
from amplify_bbopt import Dataset, Optimizer
# Input data for the black-box function
# (example: a 2D array composed of two input vectors)
dataset_x = np.array([[1, 2, 3], [-1, 0, 1]])
# A 1D vector containing the output values of
# the black-box function corresponding to each input
dataset_y = np.array([1, 5])
# Create an optimizer with a user-defined dataset
optimizer = Optimizer(
    my_blackbox_func,
    my_trainer,
    my_client,
    training_data=Dataset(dataset_x, dataset_y),
)
# Run the optimization cycle
optimizer.optimize(10)
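If you have prior evaluations recorded but no arrays yet, the same input and output arrays can be assembled programmatically. A toy sketch — toy_objective below is a stand-in for your own black-box evaluations, not part of Amplify-BBOpt:

```python
import numpy as np

def toy_objective(x):  # stand-in; in practice these values are already recorded
    return float(np.sum(x ** 2))

inputs = np.array([[1, 2, 3], [-1, 0, 1]])              # previously tried inputs
outputs = np.array([toy_objective(x) for x in inputs])  # their evaluations
print(inputs.shape, outputs)
```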
Retrieving optimization results¶
Newly discovered solutions from the optimization cycles are stored in Optimizer.history. Among them, the best solution – the one that yields the minimum objective function value – and its corresponding objective value can be retrieved as follows:
print(optimizer.best.values) # Best solution (input values)
print(optimizer.best.objective) # Objective function value corresponding to the best solution
To examine the optimization results in more detail, you can also access the history data and other related information. For more details, see Evaluating Results.