Optimizer#

This module implements a custom Python class to manage PyTorch optimizers and learning rate schedulers in TopoBenchmarkX.

Abstract base class for optimizer managers.

class topobenchmarkx.optimizer.base.AbstractOptimizer[source]#

Abstract base class for optimizer managers.

abstract configure_optimizer(model_parameters: dict)[source]#

Configure the optimizer and scheduler.

Act as a wrapper.

Parameters:
model_parameters : dict

The model parameters.
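To illustrate the contract this base class defines, here is a minimal sketch. The `AbstractOptimizer` stand-in mirrors the documented interface using plain `abc`; the `ConstantLROptimizer` subclass and its return structure are hypothetical, not part of TopoBenchmarkX:

```python
from abc import ABC, abstractmethod


class AbstractOptimizer(ABC):
    """Stand-in for topobenchmarkx.optimizer.base.AbstractOptimizer."""

    @abstractmethod
    def configure_optimizer(self, model_parameters: dict):
        """Configure the optimizer and scheduler."""


class ConstantLROptimizer(AbstractOptimizer):
    """Hypothetical concrete manager: a fixed learning rate, no scheduler."""

    def __init__(self, lr: float = 1e-3):
        self.lr = lr

    def configure_optimizer(self, model_parameters):
        # A concrete manager returns the configuration dict describing
        # the optimizer (and, optionally, a scheduler).
        return {"optimizer": {"params": model_parameters, "lr": self.lr}}
```

Because `configure_optimizer` is abstract, `AbstractOptimizer` itself cannot be instantiated; every subclass must provide its own implementation.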

Optimizer class responsible for managing both the optimizer and the scheduler.

class topobenchmarkx.optimizer.optimizer.TBXOptimizer(optimizer_id, parameters, scheduler=None)[source]#

Optimizer class that manages both optimizer and scheduler, fully compatible with torch.optim classes.

Parameters:
optimizer_id : str

Name of the torch optimizer class to be used.

parameters : dict

Parameters to be passed to the optimizer.

scheduler : dict, optional

Scheduler id and parameters to be used. Default is None.

configure_optimizer(model_parameters) → dict[str, Any][source]#

Configure the optimizer and scheduler.

Act as a wrapper, providing the Lightning Trainer with the required configuration dict when it calls TBXModel’s configure_optimizers() method.

Parameters:
model_parameters : dict

The model parameters.

Returns:
dict

The optimizer and scheduler configuration.
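The wiring described above can be sketched as follows. This is an illustrative re-implementation, not TBXOptimizer itself: small registry dicts stand in for `torch.optim` and `torch.optim.lr_scheduler` so the sketch runs without PyTorch, and the scheduler-dict keys (`"scheduler_id"`, `"parameters"`) are assumptions about its layout:

```python
class TBXOptimizerSketch:
    """Illustrative sketch of TBXOptimizer's optimizer/scheduler wiring."""

    # Stand-ins for torch.optim / torch.optim.lr_scheduler lookups; the
    # real class resolves optimizer_id against the torch classes instead.
    OPTIMIZERS = {
        "SGD": lambda params, **kw: {"kind": "SGD", "params": params, **kw},
    }
    SCHEDULERS = {
        "StepLR": lambda opt, **kw: {"kind": "StepLR", "optimizer": opt, **kw},
    }

    def __init__(self, optimizer_id, parameters, scheduler=None):
        self.optimizer_id = optimizer_id
        self.parameters = parameters
        self.scheduler = scheduler

    def configure_optimizer(self, model_parameters) -> dict:
        # Instantiate the named optimizer with the stored parameters.
        optimizer = self.OPTIMIZERS[self.optimizer_id](
            model_parameters, **self.parameters
        )
        config = {"optimizer": optimizer}
        if self.scheduler is not None:
            # Assumed layout of the scheduler dict: an id plus its kwargs.
            sched_id = self.scheduler["scheduler_id"]
            sched_params = self.scheduler.get("parameters", {})
            config["lr_scheduler"] = self.SCHEDULERS[sched_id](
                optimizer, **sched_params
            )
        return config
```

Usage mirrors the documented signature: construct the manager once, then let the Lightning hook call `configure_optimizer(model_parameters)` to obtain the `{"optimizer": ..., "lr_scheduler": ...}` dict (the scheduler entry is omitted when no scheduler was given):

```python
manager = TBXOptimizerSketch(
    "SGD",
    {"lr": 0.01},
    scheduler={"scheduler_id": "StepLR", "parameters": {"step_size": 10}},
)
config = manager.configure_optimizer(["param_tensor_placeholder"])
```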