TorchFMTrainer

class TorchFMTrainer

Bases: TrainerBase

A trainer class for a Factorization Machine (FM) model implemented in PyTorch.

Methods

__init__

Initialize the model trainer.

init_seed

Initialize the random number generators with a seed.

set_model_class

Set a model class.

set_model_params

Set the model parameters used to initialize the model class (TorchFM by default).

set_train_params

Set machine learning parameters.

train

Train an FM model (a TorchFM instance or an instance of an equivalent model class inheriting from TorchFM).

Attributes

batch_size

The batch size used for training.

epochs

The number of epochs for training.

loss_class

A loss function class for model training.

lr_sche_class

A learning rate scheduler class.

lr_sche_params

Learning rate scheduler parameters.

model_class

The model class used by this trainer.

model_params

Model parameters the model class is initialized with.

optimizer_class

An optimizer class for model training.

optimizer_params

Optimizer parameters for model training.

__init__(model_class: type[TorchFM] = TorchFM) → None

Initialize the model trainer.

By default, TorchFM is used as the model class, and the default training parameters are set by calling TorchFMTrainer.set_train_params.

Parameters:

model_class (Type[TorchFM]) – A surrogate model class. The constructor also sets the default training parameters via TorchFMTrainer.set_train_params. Defaults to TorchFM.
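
A minimal sketch of constructing the trainer, assuming TorchFMTrainer and TorchFM are importable from the top-level amplify_bbopt package (adjust the import path if needed):

```python
# A minimal sketch; assumes TorchFMTrainer and TorchFM are exposed by the
# top-level amplify_bbopt package (adjust the import path if needed).
from amplify_bbopt import TorchFM, TorchFMTrainer

# Constructing the trainer also sets the default training parameters
# via set_train_params().
trainer = TorchFMTrainer(model_class=TorchFM)

print(trainer)  # __str__ returns human-readable training information
```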

__str__() → str

Return human-readable training information.

Returns:

Training information.

Return type:

str

init_seed(seed: int) → None

Initialize the random number generators with a seed.

Parameters:

seed (int) – A seed.
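
A short sketch, reusing the trainer instance from the constructor example; the seed value is illustrative:

```python
# Fix the seed of the random number generators used in training so that
# repeated runs are reproducible (the value 0 is arbitrary).
trainer.init_seed(seed=0)
```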

set_model_class(model_class: type[TorchFM]) → None

Set a model class.

Parameters:

model_class (Type[TorchFM]) – An FM model class.

set_model_params(**model_params: dict) → None

Set the model parameters used to initialize the model class (TorchFM by default).

The following parameters can be set for the default model class (this overrides the model parameters the optimizer has set based on observations of the objective function); a usage sketch follows the list:

  • d (int): The size of the FM input (= the number of Amplify SDK variables fed to the model).

  • k (int): The FM hyperparameter (the number of latent factors). In FMQAOptimizer, defaults to 10.
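
A minimal sketch, assuming a trainer instance as above and a problem with 10 decision variables; the values of d and k are illustrative:

```python
# d must match the number of Amplify SDK variables fed to the model;
# k is the FM hyperparameter (10 is the FMQAOptimizer default).
trainer.set_model_params(d=10, k=10)
```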

set_train_params(
    batch_size: int = 8,
    epochs: int = 2000,
    loss_class: type[torch.nn.modules.loss._Loss] = MSELoss,
    optimizer_class: type[Optimizer] = AdamW,
    optimizer_params: dict[str, Any] | None = None,
    lr_sche_class: type[lr_scheduler._LRScheduler] | None = StepLR,
    lr_sche_params: dict[str, Any] | None = None,
    data_split_ratio_train: float = 0.8,
    num_threads: int | None = None,
) → None

Set machine learning parameters. A usage sketch follows the parameter list.

Parameters:
  • batch_size (int, optional) – A batch size. Defaults to 8.

  • epochs (int, optional) – A number of epochs. Defaults to 2000.

  • loss_class (Type[torch.nn.modules.loss._Loss], optional) – A loss function class. Defaults to nn.MSELoss.

  • optimizer_class (Type[torch.optim.Optimizer], optional) – An optimizer class. Defaults to torch.optim.AdamW.

  • optimizer_params (Dict, optional) – Optimizer parameters. Defaults to {"lr": 0.5}.

  • lr_sche_class (Type[lr_scheduler._LRScheduler] | None, optional) – A learning rate scheduler class. Defaults to lr_scheduler.StepLR.

  • lr_sche_params (Dict, optional) – Learning rate scheduler parameters. Defaults to {"step_size": 100, "gamma": 0.8}.

  • data_split_ratio_train (float, optional) – The training dataset is split into training and validation subsets; data_split_ratio_train is the fraction of the samples used for training. Note that when this is set to 0 or 1, training and validation use the same dataset (no split). Defaults to 0.8.

  • num_threads (int | None, optional) – The number of threads used for intra-op parallel processing in PyTorch. If None, all available threads are used. Defaults to None.
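
A minimal sketch overriding some of the defaults above; all values are illustrative, not tuning recommendations:

```python
from torch import nn
from torch.optim import AdamW, lr_scheduler

trainer.set_train_params(
    batch_size=16,                      # default: 8
    epochs=1000,                        # default: 2000
    loss_class=nn.MSELoss,              # loss function class (not an instance)
    optimizer_class=AdamW,              # optimizer class (not an instance)
    optimizer_params={"lr": 0.1},       # default: {"lr": 0.5}
    lr_sche_class=lr_scheduler.StepLR,  # None disables the scheduler
    lr_sche_params={"step_size": 100, "gamma": 0.8},
    data_split_ratio_train=0.8,         # 80% training / 20% validation
    num_threads=None,                   # use all available threads
)
```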

train(x_values: list[list[bool | int | float]], y_values: list[int | float], logger: Logger | None = None) → TorchFM

Train an FM model (a TorchFM instance or an instance of an equivalent model class inheriting from TorchFM).

For adjustable training parameters, see TorchFMTrainer.set_train_params.

Parameters:
  • x_values (list[list[bool | int | float]]) – A list of the input value vectors in the training data.

  • y_values (list[int | float]) – A list of the corresponding output values in the training data.

  • logger (Logger | None) – A logger. Defaults to None.

Returns:

A trained FM model.

Return type:

TorchFM
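
A minimal end-to-end sketch with toy data; in a typical FMQA workflow the optimizer assembles x_values and y_values from the observed samples, so calling train directly like this is mainly for illustration:

```python
# Toy training data: 8 samples of 3 binary inputs and a scalar output.
x_values = [
    [0, 1, 1], [1, 0, 0], [1, 1, 0], [0, 0, 1],
    [1, 0, 1], [0, 1, 0], [1, 1, 1], [0, 0, 0],
]
y_values = [1.2, 0.3, 0.8, 0.5, 1.0, 0.4, 1.5, 0.1]

# The input size d must match the length of each input vector.
trainer.set_model_params(d=3, k=2)

model = trainer.train(x_values, y_values)  # returns a trained TorchFM instance
```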

property batch_size: int

The batch size used for training.

property epochs: int

The number of epochs for training.

property loss_class: type[_Loss]

A loss function class for model training.

property lr_sche_class: type[_LRScheduler] | None

A learning rate scheduler class. If no scheduler is used, returns None.

property lr_sche_params: dict[str, Any] | None

Learning rate scheduler parameters. If no scheduler is used, returns None.

property model_class: type[TorchFM]

The model class used by this trainer.

property model_params: dict[str, Any] | None

Model parameters the model class is initialized with.

property optimizer_class: type[Optimizer]

An optimizer class for model training.

property optimizer_params: dict[str, Any]

Optimizer parameters for model training.