Optimizers

Optimizer is an interface that enables the use of declearn's optimizers for Federated Learning inside Fed-BioMed.

Classes

BaseOptimizer

BaseOptimizer(model, optimizer)

Bases: Generic[OT]

Abstract base class for Optimizer and Model wrappers.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model` | `Model` | Model to train, interfaced via a framework-specific `Model`. | *required* |
| `optimizer` | `OT` | Optimizer that will be used for optimizing the model. | *required* |

Raises:

| Type | Description |
| --- | --- |
| `FedbiomedOptimizerError` | Raised if `model` is not an instance of `_model_cls` (which may be a subset of the generic `Model` type). |

Source code in fedbiomed/common/optimizers/generic_optimizers.py
def __init__(self, model: Model, optimizer: OT):
    """Constuctor of the optimizer wrapper that sets a reference to model and optimizer.

    Args:
        model: model to train, interfaced via a framework-specific Model.
        optimizer: optimizer that will be used for optimizing the model.

    Raises:
        FedbiomedOptimizerError:
            Raised if model is not an instance of `_model_cls` (which may
            be a subset of the generic Model type).
    """
    if not isinstance(model, self._model_cls):
        raise FedbiomedOptimizerError(
            f"{ErrorNumbers.FB626.value}, in `model` argument, expected an instance "
            f"of {self._model_cls} but got an object of type {type(model)}."
        )
    self._model: Model = model
    self.optimizer: OT = optimizer
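For orientation, here is a minimal sketch (not part of the library) of how a concrete subclass is expected to pin `_model_cls`, which the constructor above uses for type-checking; the subclass name is hypothetical and the import paths are assumed.

```python
from fedbiomed.common.models import Model
from fedbiomed.common.optimizers.generic_optimizers import BaseOptimizer

class MyOptimizerWrapper(BaseOptimizer):
    # Concrete wrappers narrow `_model_cls` to a framework-specific class
    # (e.g. TorchModel); the generic Model is used here for illustration.
    _model_cls = Model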


DeclearnOptimizer

DeclearnOptimizer(model, optimizer)

Bases: BaseOptimizer

BaseOptimizer subclass for using a declearn-backed Optimizer.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model` | `Model` | Model wrapper around the actual model. | *required* |
| `optimizer` | `Union[Optimizer, declearn.optimizer.Optimizer]` | A declearn optimizer, or a Fed-BioMed `Optimizer` (which wraps a declearn optimizer). | *required* |

Raises:

| Type | Description |
| --- | --- |
| `FedbiomedOptimizerError` | Raised if `optimizer` is neither a declearn optimizer nor a Fed-BioMed `Optimizer`. |
Source code in fedbiomed/common/optimizers/generic_optimizers.py
def __init__(self, model: Model, optimizer: Union[FedOptimizer, declearn.optimizer.Optimizer]):
    """Constructor of Optimizer wrapper for declearn's optimizers

    Args:
        model: Model that wraps the actual model
        optimizer: declearn optimizer,
            or fedbiomed optimizer (that wraps declearn optimizer)
    """
    logger.debug("Using declearn optimizer")
    if isinstance(optimizer, declearn.optimizer.Optimizer):
        # convert declearn optimizer into a fedbiomed optimizer wrapper
        optimizer = FedOptimizer.from_declearn_optimizer(optimizer)
    elif not isinstance(optimizer, FedOptimizer):
        raise FedbiomedOptimizerError(
            f"{ErrorNumbers.FB626.value}: expected a declearn optimizer,"
            f" but got an object with type {type(optimizer)}."
        )
    super().__init__(model, optimizer)
    self.optimizer.init_round()
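A minimal usage sketch showing both construction paths accepted by the constructor above. The `fedbiomed.common.models` import path and the `TorchModel` wrapping of a bare `torch.nn.Module` are assumptions; any Fed-BioMed `Model` wrapper would do.

```python
import declearn.optimizer
import torch
from fedbiomed.common.models import TorchModel
from fedbiomed.common.optimizers.optimizer import Optimizer as FedOptimizer
from fedbiomed.common.optimizers.generic_optimizers import DeclearnOptimizer

model = TorchModel(torch.nn.Linear(4, 1))  # any Fed-BioMed Model wrapper works

# Path 1: a raw declearn optimizer, converted internally through
# FedOptimizer.from_declearn_optimizer by the constructor above.
wrapper = DeclearnOptimizer(model, declearn.optimizer.Optimizer(lrate=0.01))

# Path 2: a Fed-BioMed Optimizer passed directly.
wrapper = DeclearnOptimizer(model, FedOptimizer(lr=0.01))
```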


NativeSkLearnOptimizer

NativeSkLearnOptimizer(model, optimizer=None)

Bases: BaseOptimizer

Optimizer wrapper for scikit-learn native models.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model` | `SkLearnModel` | `SkLearnModel` wrapper that builds a scikit-learn model. | *required* |
| `optimizer` | `Optional[None]` | Unused. Defaults to `None`. | `None` |
Source code in fedbiomed/common/optimizers/generic_optimizers.py
def __init__(self, model: SkLearnModel, optimizer: Optional[None] = None):
    """Constructor of the Optimizer wrapper for scikit-learn native models.

    Args:
        model: SkLearnModel model that builds a scikit-learn model.
        optimizer: unused. Defaults to None.
    """

    if optimizer is not None:
        logger.info(f"Passed Optimizer {optimizer} won't be used (using only native scikit learn optimization)")
    super().__init__(model, None)
    logger.debug("Using native Sklearn Optimizer")
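A minimal usage sketch, assuming (per Fed-BioMed's model wrappers) that `SkLearnModel` is built from a scikit-learn estimator class and lives under `fedbiomed.common.models`:

```python
from sklearn.linear_model import SGDClassifier
from fedbiomed.common.models import SkLearnModel
from fedbiomed.common.optimizers.generic_optimizers import NativeSkLearnOptimizer

model = SkLearnModel(SGDClassifier)
# No optimizer argument: the model's native scikit-learn optimization is used.
optim = NativeSkLearnOptimizer(model)
```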

NativeTorchOptimizer

NativeTorchOptimizer(model, optimizer)

Bases: BaseOptimizer

Optimizer wrapper for pytorch native optimizers and models.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model` | `TorchModel` | Fed-BioMed model wrapper that wraps the PyTorch model. | *required* |
| `optimizer` | `Optimizer` | Native PyTorch optimizer (inheriting from `torch.optim.Optimizer`). | *required* |

Raises:

| Type | Description |
| --- | --- |
| `FedbiomedOptimizerError` | Raised if `optimizer` is not a native PyTorch optimizer, i.e. a `torch.optim.Optimizer` object. |

Source code in fedbiomed/common/optimizers/generic_optimizers.py
def __init__(self, model: TorchModel, optimizer: torch.optim.Optimizer):
    """Constructor of the optimizer wrapper

    Args:
        model: fedbiomed model wrapper that wraps the pytorch model
        optimizer: pytorch native optimizer (inheriting from `torch.optim.Optimizer`)

    Raises:
        FedbiomedOptimizerError: raised if optimizer is not a pytorch native optimizer, i.e. a `torch.optim.Optimizer`
            object.
    """
    if not isinstance(optimizer, torch.optim.Optimizer):
        raise FedbiomedOptimizerError(f"{ErrorNumbers.FB626.value} Expected a native pytorch `torch.optim` "
                                      f"optimizer, but got {type(optimizer)}")
    super().__init__(model, optimizer)
    logger.debug("using native torch optimizer")
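A minimal usage sketch; the `fedbiomed.common.models` import path and the `TorchModel` constructor taking a `torch.nn.Module` are assumptions, while `torch.optim.SGD` is standard PyTorch:

```python
import torch
from fedbiomed.common.models import TorchModel
from fedbiomed.common.optimizers.generic_optimizers import NativeTorchOptimizer

net = torch.nn.Linear(10, 2)
model = TorchModel(net)
# The wrapper requires a genuine torch.optim.Optimizer, as checked above.
optim = NativeTorchOptimizer(model, torch.optim.SGD(net.parameters(), lr=0.01))
```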

Optimizer

Optimizer(lr, decay=0.0, modules=None, regularizers=None)

Optimizer class with a declearn-backed modular SGD-core algorithm.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `lr` | `float` | Base learning rate (i.e. step size) applied to gradient-based updates upon applying them to a model's weights. | *required* |
| `decay` | `float` | Optional weight decay parameter, used to parameterize a decoupled weight decay regularization term (see [1]) added to the updates right before the learning rate is applied and model weights are effectively updated. | `0.0` |
| `modules` | `Optional[Sequence[Union[OptiModule, str, Tuple[str, Dict[str, Any]]]]]` | Optional list of plug-in modules implementing the alteration of gradients into model weight updates. Modules are applied to gradients following this list's ordering. See `declearn.optimizer.modules.OptiModule` for details, and the Note below for the "specs" format. | `None` |
| `regularizers` | `Optional[Sequence[Union[Regularizer, str, Tuple[str, Dict[str, Any]]]]]` | Optional list of plug-in loss regularizers. Regularizers are applied to gradients following this list's order, prior to any other alteration (see `modules` above). See `declearn.optimizer.regularizers.Regularizer` for details, and the Note below for the "specs" format. | `None` |

Note

`Regularizer` and `OptiModule` plug-ins to be used by this optimizer, specified via the `regularizers` and `modules` parameters, may be passed as ready-for-use instances, or be instantiated from specs: either a single string (the `name` attribute of the class to build) or a tuple grouping this name with a config dict (to specify some hyper-parameters). Both spec formats are illustrated in the sketch after the source code below.

References

[1] Loshchilov & Hutter, 2019. Decoupled Weight Decay Regularization. https://arxiv.org/abs/1711.05101

Source code in fedbiomed/common/optimizers/optimizer.py
def __init__(
    self,
    lr: float,
    decay: float = 0.0,
    modules: Optional[
        Sequence[Union[OptiModule, str, Tuple[str, Dict[str, Any]]]]
    ] = None,
    regularizers: Optional[
        Sequence[Union[Regularizer, str, Tuple[str, Dict[str, Any]]]]
    ] = None,
) -> None:
    """Instantiate the declearn-issued gradient-descent optimizer.

    Args:
        lr: Base learning rate (i.e. step size) applied to gradients-based
            updates upon applying them to a model's weights.
        decay: Optional weight decay parameter, used to parameterize a
            decoupled weight decay regularization term (see [1]) added to
            the updates right before the learning rate is applied and model
            weights are effectively updated.
        modules: Optional list of plug-in modules implementing gradients'
            alteration into model weights' updates. Modules will be applied
            to gradients following this list's ordering.
            See `declearn.optimizer.modules.OptiModule` for details.
            See Notes section below for details on the "specs" format.
        regularizers: Optional list of plug-in loss regularizers.
            Regularizers will be applied to gradients following this list's
            order, prior to any other alteration (see `modules` above).
            See `declearn.optimizer.regularizers.Regularizer` for details.
            See Notes section below for details on the "specs" format.

    !!! info "Note"
        `Regularizer` and `OptiModule` to be used by this optimizer,
        specified using the `regularizers` and `modules` parameters,
        may be passed as ready-for-use instances, or be instantiated
        from specs, consisting either of a single string (the `name`
        attribute of the class to build) or a tuple grouping this
        name and a config dict (to specify some hyper-parameters).

    !!! info "References"
        [1] Loshchilov & Hutter, 2019.
            Decoupled Weight Decay Regularization.
            https://arxiv.org/abs/1711.05101
    """
    try:
        self._optimizer = DeclearnOptimizer(
            lrate=lr,
            w_decay=decay,
            modules=modules,
            regularizers=regularizers,
        )
    except (KeyError, TypeError) as exc:
        raise FedbiomedOptimizerError(
            f"{ErrorNumbers.FB621.value}: declearn Optimizer instantiation"
            f" raised the following exception: {repr(exc)}"
        ) from exc
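A usage sketch illustrating the "specs" formats described in the Note above. The plug-in names `"adam"` and `"lasso"` and the `alpha` hyper-parameter are assumed to match declearn's standard `OptiModule` and `Regularizer` registries; check your declearn version for the available names.

```python
from fedbiomed.common.optimizers.optimizer import Optimizer

optim = Optimizer(
    lr=0.01,
    decay=0.001,
    modules=["adam"],                         # name-only spec
    regularizers=[("lasso", {"alpha": 0.1})]  # (name, config-dict) spec
)
```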

SklearnOptimizerProcessing

SklearnOptimizerProcessing(model, disable_internal_optimizer)

Context manager for scikit-learn models that checks whether model parameters have been changed while the scikit-learn internal optimizer is disabled, i.e. after the disable_internal_optimizer method has been called.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model` | `SkLearnModel` | A `SkLearnModel` that wraps a scikit-learn model. | *required* |
| `disable_internal_optimizer` | `bool` | Whether to disable the scikit-learn model's internal optimizer (`True`) in order to apply a declearn one, or to keep it (`False`). | *required* |
Source code in fedbiomed/common/optimizers/generic_optimizers.py
def __init__(
    self,
    model: SkLearnModel,
    disable_internal_optimizer: bool
) -> None:
    """Constructor of the object. Sets internal variables

    Args:
        model: a SkLearnModel that wraps a scikit-learn model
        disable_internal_optimizer: whether to disable scikit-learn model internal optimizer (True) in order
            to apply declearn one or to keep it (False)
    """
    self._model = model
    self._disable_internal_optimizer = disable_internal_optimizer
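A minimal usage sketch; the `SkLearnModel` construction from an estimator class and its import path are assumptions, and the body of the `with` block stands in for training steps driven by an external (declearn) optimizer:

```python
from sklearn.linear_model import SGDRegressor
from fedbiomed.common.models import SkLearnModel
from fedbiomed.common.optimizers.generic_optimizers import SklearnOptimizerProcessing

model = SkLearnModel(SGDRegressor)
with SklearnOptimizerProcessing(model, disable_internal_optimizer=True):
    # While the context is active, the scikit-learn internal optimizer is
    # disabled so that weight updates can come from a declearn optimizer.
    ...
```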