Custom Trainer and Model
You can use your own custom classes for the model and model training, although the default trainer and model classes are sufficient for most black-box optimization problems.
Here, we explain how to do so, using the FMQAOptimizer as an example. The code below instantiates the optimizer with a black-box objective function and initial training data, which are described in "2. Black-Box Function" and "4. Initial Training Data", respectively.
from datetime import timedelta

from amplify import FixstarsClient
from amplify_bbopt import (
    DatasetGenerator,
    FMQAOptimizer,
    RealVariable,
    blackbox,
)
from utils.pseudo_simulators import (
    pseudo_wing_simulator as wing_simulator,
)


@blackbox
def objective_lift_drag(
    wing_width: float = RealVariable(bounds=(1, 20), nbins=100),
    wing_height: float = RealVariable(bounds=(1, 5), nbins=20),
    wing_angle: float = RealVariable(bounds=(0, 45), nbins=20),
) -> float:
    """This black-box function executes wing_simulator() and returns
    the negative lift-drag ratio for a given wing's width, height, and angle.
    """
    lift, drag = wing_simulator(wing_width, wing_height, wing_angle)
    return -lift / drag  # value to minimize
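Each RealVariable is discretized into nbins candidate values over its bounds. The sketch below illustrates what such a discretization could look like for wing_width, assuming a uniform grid over the bounds; this is an illustration of the idea, not a statement of how amplify-bbopt discretizes internally.

```python
# Hypothetical uniform discretization of wing_width:
# bounds=(1, 20), nbins=100, as declared above.
lo, hi, nbins = 1.0, 20.0, 100

step = (hi - lo) / (nbins - 1)
grid = [lo + i * step for i in range(nbins)]

print(len(grid))  # 100 candidate values from 1.0 to 20.0
```

The optimizer then searches over these discrete candidates rather than the continuous interval.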
# Generate the initial training data set
data = DatasetGenerator(objective=objective_lift_drag).generate(num_samples=3)

# Set up the solver client
client = FixstarsClient()
client.parameters.timeout = timedelta(milliseconds=2000)  # 2 seconds
# client.token = "xxxxxxxxxxx"  # Enter your Amplify AE API token.

# Instantiate the FMQA optimizer
optimizer = FMQAOptimizer(
    data=data, objective=objective_lift_drag, client=client
)
# Display the overall black-box optimization setting
print(optimizer)
amplify-bbopt | 2024/10/04 05:12:58 | INFO | ----------------------------------------
amplify-bbopt | 2024/10/04 05:12:58 | INFO | #0/3 initial data for objective_lift_drag
amplify-bbopt | 2024/10/04 05:12:58 | INFO | ----------------------------------------
amplify-bbopt | 2024/10/04 05:12:58 | INFO | #1/3 initial data for objective_lift_drag
amplify-bbopt | 2024/10/04 05:12:58 | INFO | ----------------------------------------
amplify-bbopt | 2024/10/04 05:12:58 | INFO | #2/3 initial data for objective_lift_drag
num variables: 3
num elemental variables: 3
num amplify variables: 137
optimizer client: FixstarsClient
objective weight: 1.0
--------------------
trainer class: TorchFMTrainer
model class: TorchFM
model params: {d: 137, k: 10}
batch size: 8
epochs: 2000
loss class: MSELoss
optimizer class: AdamW
optimizer params: {'lr': 0.5}
lr_sche class: StepLR
lr_sche params: {'step_size': 100, 'gamma': 0.8}
data split ratio (train): 0.8
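The printed num amplify variables: 137 is consistent with an encoding that spends nbins - 1 binary (Amplify) variables per real variable. This is an inference from the numbers above, not a statement of the library's internals:

```python
# nbins for wing_width, wing_height, wing_angle, as declared above.
nbins = [100, 20, 20]

# Assuming each real variable with n bins is encoded with n - 1
# binary variables, the totals match the printed summary.
num_elemental = len(nbins)               # 3
num_amplify = sum(n - 1 for n in nbins)  # 137

print(num_elemental, num_amplify)  # 3 137
```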
Using custom model class
You can create your own surrogate model by, for example, inheriting an appropriate model class such as QUBOConvertibleBase, TorchFM, or ModelKernel, depending on the trainer class (see below) it will be used with. To tell the optimizer which model class to use, call the set_model_class method of the trainer class as follows.
from amplify_bbopt import TorchFM


class MyModelClass(TorchFM):
    """My custom surrogate model."""

    def __init__(self, d: int, k: int, **kwargs):
        super().__init__(d, k)

    def to_qubo(self, x):
        return XXXX
trainer = optimizer.trainer
trainer.set_model_class(MyModelClass)
print(trainer)
--------------------
trainer class: TorchFMTrainer
model class: MyModelClass
model params: {d: 137, k: 10}
batch size: 8
epochs: 2000
loss class: MSELoss
optimizer class: AdamW
optimizer params: {'lr': 0.5}
lr_sche class: StepLR
lr_sche params: {'step_size': 100, 'gamma': 0.8}
data split ratio (train): 0.8
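For background, the model params: {d: 137, k: 10} above presumably refer to the input dimension d (matching the 137 Amplify variables) and the factorization rank k of the factorization machine underlying TorchFM. A minimal NumPy sketch of the standard second-order factorization machine, not the library's actual implementation, looks like this:

```python
import numpy as np


def fm_predict(x, w0, w, V):
    """Second-order factorization machine on a binary input vector x.

    y = w0 + w.x + sum_{i<j} <V[i], V[j]> x_i x_j, where the pairwise
    term is computed with the O(d*k) identity
    0.5 * sum_f ((x.V)_f^2 - sum_i x_i^2 V[i, f]^2).
    """
    xv = x @ V  # shape (k,)
    pairwise = 0.5 * float(np.sum(xv**2 - (x**2) @ (V**2)))
    return float(w0 + w @ x) + pairwise


# Small illustrative dimensions (the optimizer above uses d=137, k=10).
d, k = 6, 3
rng = np.random.default_rng(0)
w0, w, V = 0.1, rng.normal(size=d), rng.normal(size=(d, k))
x = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0])

# The fast form agrees with the naive pairwise double sum.
naive = w0 + w @ x + sum(
    float(V[i] @ V[j]) * x[i] * x[j]
    for i in range(d)
    for j in range(i + 1, d)
)
print(abs(fm_predict(x, w0, w, V) - naive) < 1e-9)  # True
```

Because the pairwise term is quadratic in the binary inputs, such a model can be converted to a QUBO, which is what to_qubo is expected to produce.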
Trainer class
If your model class needs to be trained with a class other than the optimizer's default trainer class (TorchFMTrainer for the FMQAOptimizer above), you can also create your own trainer. To do so, you must inherit the trainer base class TrainerBase (or an existing trainer class such as ModelKernelTrainer). Then, you can specify the new trainer class when instantiating the optimizer class.
from amplify_bbopt import TrainerBase


class MyTrainerClass(TrainerBase):
    def __init__(self, model_class=MyModelClass):
        # ....
        pass

    def train(self, x_values, y_values, logger):
        # ....
        pass

    def __str__(self):
        return f"trainer class: {self.__class__.__name__}"

    # ...
# ...
optimizer = FMQAOptimizer(
    data=data,
    client=client,
    objective=objective_lift_drag,
    trainer_class=MyTrainerClass,  # type: ignore
)
trainer = optimizer.trainer
print(optimizer)

# trainer.set_model_params(...)  # if you implement set_model_params() in MyTrainerClass
# trainer.set_train_params(...)  # if you implement set_train_params() in MyTrainerClass
# trainer.set_model_class(...)  # if you implement set_model_class() in MyTrainerClass
num variables: 3
num elemental variables: 3
num amplify variables: 137
optimizer client: FixstarsClient
objective weight: 1.0
trainer class: MyTrainerClass