flwr (Python API reference)#

client#

Flower client.

Client#

class flwr.client.Client[source]#

Abstract base class for Flower clients.

evaluate(ins: EvaluateIns) EvaluateRes[source]#

Evaluate the provided parameters using the locally held dataset.

Parameters:

ins (EvaluateIns) – The evaluation instructions containing (global) model parameters received from the server and a dictionary of configuration values used to customize the local evaluation process.

Returns:

The evaluation result containing the loss on the local dataset and other details such as the number of local data examples used for evaluation.

Return type:

EvaluateRes

fit(ins: FitIns) FitRes[source]#

Refine the provided parameters using the locally held dataset.

Parameters:

ins (FitIns) – The training instructions containing (global) model parameters received from the server and a dictionary of configuration values used to customize the local training process.

Returns:

The training result containing updated parameters and other details such as the number of local training examples used for training.

Return type:

FitRes

get_parameters(ins: GetParametersIns) GetParametersRes[source]#

Return the current local model parameters.

Parameters:

ins (GetParametersIns) – The get parameters instructions received from the server containing a dictionary of configuration values.

Returns:

The current local model parameters.

Return type:

GetParametersRes

get_properties(ins: GetPropertiesIns) GetPropertiesRes[source]#

Return the client's set of properties.

Parameters:

ins (GetPropertiesIns) – The get properties instructions received from the server containing a dictionary of configuration values.

Returns:

The current client properties.

Return type:

GetPropertiesRes

start_client#

flwr.client.start_client(*, server_address: str, client_fn: Optional[Callable[[str], Union[Client, NumPyClient]]] = None, client: Optional[Union[Client, NumPyClient]] = None, grpc_max_message_length: int = 536870912, root_certificates: Optional[Union[bytes, str]] = None, transport: Optional[str] = None) None[source]#

Start a Flower client node which connects to a Flower server.

Parameters:
  • server_address (str) – The IPv4 or IPv6 address of the server. If the Flower server runs on the same machine on port 8080, then server_address would be "[::]:8080".

  • client_fn (Optional[ClientFn]) – A callable that instantiates a Client. (default: None)

  • client (Optional[flwr.client.Client]) – An implementation of the abstract base class flwr.client.Client (default: None)

  • grpc_max_message_length (int (default: 536_870_912, this equals 512MB)) – The maximum length of gRPC messages that can be exchanged with the Flower server. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower server needs to be started with the same value (see flwr.server.start_server), otherwise it will not know about the increased limit and block larger messages.

  • root_certificates (Optional[Union[bytes, str]] (default: None)) – The PEM-encoded root certificates as a byte string or a path string. If provided, a secure connection using the certificates will be established to an SSL-enabled Flower server.

  • transport (Optional[str] (default: None)) – Configure the transport layer. Allowed values: 'grpc-bidi' (gRPC, bidirectional streaming), 'grpc-rere' (gRPC, request-response, experimental), 'rest' (HTTP, experimental).

Examples

Starting a gRPC client with an insecure server connection:

>>> def client_fn(cid: str):
>>>     return FlowerClient()
>>>
>>> start_client(
>>>     server_address="localhost:8080",
>>>     client_fn=client_fn,
>>> )

Starting an SSL-enabled gRPC client:

>>> from pathlib import Path
>>> def client_fn(cid: str):
>>>     return FlowerClient()
>>>
>>> start_client(
>>>     server_address="localhost:8080",
>>>     client_fn=client_fn,
>>>     root_certificates=Path("/crts/root.pem").read_bytes(),
>>> )

NumPyClient#

class flwr.client.NumPyClient[source]#

Abstract base class for Flower clients using NumPy.

evaluate(parameters: List[ndarray[Any, dtype[Any]]], config: Dict[str, Union[bool, bytes, float, int, str]]) Tuple[float, int, Dict[str, Union[bool, bytes, float, int, str]]][source]#

Evaluate the provided parameters using the locally held dataset.

Parameters:
  • parameters (NDArrays) – The current (global) model parameters.

  • config (Dict[str, Scalar]) – Configuration parameters which allow the server to influence evaluation on the client. It can be used to communicate arbitrary values from the server to the client, for example, to influence the number of examples used for evaluation.

Returns:

  • loss (float) – The evaluation loss of the model on the local dataset.

  • num_examples (int) – The number of examples used for evaluation.

  • metrics (Dict[str, Scalar]) – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary values back to the server.

Warning

The previous return type format (int, float, float) and the extended format (int, float, float, Dict[str, Scalar]) have been deprecated and removed since Flower 0.19.

fit(parameters: List[ndarray[Any, dtype[Any]]], config: Dict[str, Union[bool, bytes, float, int, str]]) Tuple[List[ndarray[Any, dtype[Any]]], int, Dict[str, Union[bool, bytes, float, int, str]]][source]#

Train the provided parameters using the locally held dataset.

Parameters:
  • parameters (NDArrays) – The current (global) model parameters.

  • config (Dict[str, Scalar]) – Configuration parameters which allow the server to influence training on the client. It can be used to communicate arbitrary values from the server to the client, for example, to set the number of (local) training epochs.

Returns:

  • parameters (NDArrays) – The locally updated model parameters.

  • num_examples (int) – The number of examples used for training.

  • metrics (Dict[str, Scalar]) – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary values back to the server.

get_parameters(config: Dict[str, Union[bool, bytes, float, int, str]]) List[ndarray[Any, dtype[Any]]][source]#

Return the current local model parameters.

Parameters:

config (Config) – Configuration parameters requested by the server. This can be used to tell the client which parameters are needed along with some Scalar attributes.

Returns:

parameters – The local model parameters as a list of NumPy ndarrays.

Return type:

NDArrays

get_properties(config: Dict[str, Union[bool, bytes, float, int, str]]) Dict[str, Union[bool, bytes, float, int, str]][source]#

Return a client’s set of properties.

Parameters:

config (Config) – Configuration parameters requested by the server. This can be used to tell the client which properties are needed along with some Scalar attributes.

Returns:

properties – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary property values back to the server.

Return type:

Dict[str, Scalar]
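The methods above exchange plain Python and NumPy values. The following sketch mirrors the NumPyClient interface to show the expected return shapes; it is written as a plain class (not subclassing flwr.client.NumPyClient) so it can be read without the library, and the "model" and metric values are purely illustrative:

```python
import numpy as np

# Hypothetical sketch mirroring the NumPyClient interface. In a real project
# this class would subclass flwr.client.NumPyClient; the training and
# evaluation logic below is a toy stand-in.
class SketchClient:
    def __init__(self) -> None:
        # A toy "model": a single weight vector.
        self.weights = [np.zeros(4)]

    def get_parameters(self, config):
        # NDArrays: a list of NumPy ndarrays.
        return self.weights

    def fit(self, parameters, config):
        # Receive global parameters, "train" locally (here: add 1.0),
        # then return (updated parameters, num_examples, metrics).
        self.weights = [w + 1.0 for w in parameters]
        return self.weights, 10, {"train_loss": 0.5}

    def evaluate(self, parameters, config):
        # Return (loss, num_examples, metrics).
        loss = float(np.sum(parameters[0] ** 2))
        return loss, 5, {"accuracy": 0.9}

client = SketchClient()
updated, num_examples, metrics = client.fit([np.zeros(4)], {})
```

Note the three-element tuples: fit returns (NDArrays, int, dict) and evaluate returns (float, int, dict), matching the signatures documented above.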

start_numpy_client#

flwr.client.start_numpy_client(*, server_address: str, client_fn: Optional[Callable[[str], NumPyClient]] = None, client: Optional[NumPyClient] = None, grpc_max_message_length: int = 536870912, root_certificates: Optional[bytes] = None, transport: Optional[str] = None) None[source]#

Start a Flower NumPyClient which connects to a gRPC server.

Parameters:
  • server_address (str) – The IPv4 or IPv6 address of the server. If the Flower server runs on the same machine on port 8080, then server_address would be "[::]:8080".

  • client_fn (Optional[Callable[[str], NumPyClient]]) – A callable that instantiates a NumPyClient. (default: None)

  • client (Optional[flwr.client.NumPyClient]) – An implementation of the abstract base class flwr.client.NumPyClient.

  • grpc_max_message_length (int (default: 536_870_912, this equals 512MB)) – The maximum length of gRPC messages that can be exchanged with the Flower server. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower server needs to be started with the same value (see flwr.server.start_server), otherwise it will not know about the increased limit and block larger messages.

  • root_certificates (bytes (default: None)) – The PEM-encoded root certificates as a byte string or a path string. If provided, a secure connection using the certificates will be established to an SSL-enabled Flower server.

  • transport (Optional[str] (default: None)) – Configure the transport layer. Allowed values: 'grpc-bidi' (gRPC, bidirectional streaming), 'grpc-rere' (gRPC, request-response, experimental), 'rest' (HTTP, experimental).

Examples

Starting a client with an insecure server connection:

>>> def client_fn(cid: str):
>>>     return FlowerClient()
>>>
>>> start_numpy_client(
>>>     server_address="localhost:8080",
>>>     client_fn=client_fn,
>>> )

Starting an SSL-enabled gRPC client:

>>> from pathlib import Path
>>> def client_fn(cid: str):
>>>     return FlowerClient()
>>>
>>> start_numpy_client(
>>>     server_address="localhost:8080",
>>>     client_fn=client_fn,
>>>     root_certificates=Path("/crts/root.pem").read_bytes(),
>>> )

start_simulation#

flwr.simulation.start_simulation(*, client_fn: ~typing.Callable[[str], ~typing.Union[~flwr.client.client.Client, ~flwr.client.numpy_client.NumPyClient]], num_clients: ~typing.Optional[int] = None, clients_ids: ~typing.Optional[~typing.List[str]] = None, client_resources: ~typing.Optional[~typing.Dict[str, float]] = None, server: ~typing.Optional[~flwr.server.server.Server] = None, config: ~typing.Optional[~flwr.server.app.ServerConfig] = None, strategy: ~typing.Optional[~flwr.server.strategy.strategy.Strategy] = None, client_manager: ~typing.Optional[~flwr.server.client_manager.ClientManager] = None, ray_init_args: ~typing.Optional[~typing.Dict[str, ~typing.Any]] = None, keep_initialised: ~typing.Optional[bool] = False, actor_type: ~typing.Type[~flwr.simulation.ray_transport.ray_actor.VirtualClientEngineActor] = <flwr.simulation.ray_transport.ray_actor.ActorClass(DefaultActor) object>, actor_kwargs: ~typing.Optional[~typing.Dict[str, ~typing.Any]] = None, actor_scheduling: ~typing.Union[str, ~ray.util.scheduling_strategies.NodeAffinitySchedulingStrategy] = 'DEFAULT') History[source]#

Start a Ray-based Flower simulation server.

Parameters:
  • client_fn (ClientFn) – A function creating client instances. The function must take a single str argument called cid. It should return a single client instance of type ClientLike. Note that the created client instances are ephemeral and will often be destroyed after a single method invocation. Since client instances are not long-lived, they should not attempt to carry state over method invocations. Any state required by the instance (model, dataset, hyperparameters, …) should be (re-)created in either the call to client_fn or the call to any of the client methods (e.g., load evaluation data in the evaluate method itself).

  • num_clients (Optional[int]) – The total number of clients in this simulation. This must be set if clients_ids is not set and vice-versa.

  • clients_ids (Optional[List[str]]) – List of client IDs, one for each client. This is only required if num_clients is not set. Setting both num_clients and clients_ids with len(clients_ids) not equal to num_clients generates an error.

  • client_resources (Optional[Dict[str, float]] (default: {"num_cpus": 1, "num_gpus": 0.0})) – CPU and GPU resources for a single client. Supported keys are num_cpus and num_gpus. To understand the GPU utilization caused by num_gpus, as well as using custom resources, please consult the Ray documentation.

  • server (Optional[flwr.server.Server] (default: None)) – An implementation of the abstract base class flwr.server.Server. If no instance is provided, then start_simulation will create one.

  • config (Optional[ServerConfig] (default: None)) – Currently supported values are num_rounds (int, default: 1) and round_timeout in seconds (float, default: None).

  • strategy (Optional[flwr.server.Strategy] (default: None)) – An implementation of the abstract base class flwr.server.Strategy. If no strategy is provided, then start_simulation will use flwr.server.strategy.FedAvg.

  • client_manager (Optional[flwr.server.ClientManager] (default: None)) – An implementation of the abstract base class flwr.server.ClientManager. If no implementation is provided, then start_simulation will use flwr.server.client_manager.SimpleClientManager.

  • ray_init_args (Optional[Dict[str, Any]] (default: None)) –

    Optional dictionary containing arguments for the call to ray.init. If ray_init_args is None (the default), Ray will be initialized with the following default args:

    { "ignore_reinit_error": True, "include_dashboard": False }

    An empty dictionary can be used (ray_init_args={}) to prevent any arguments from being passed to ray.init.

  • keep_initialised (Optional[bool] (default: False)) – Set to True to prevent ray.shutdown() in case ray.is_initialized()=True.

  • actor_type (VirtualClientEngineActor (default: DefaultActor)) – Optionally specify the type of actor to use. The actor object, which persists throughout the simulation, will be the process in charge of running the clients’ jobs (i.e. their fit() method).

  • actor_kwargs (Optional[Dict[str, Any]] (default: None)) – If you created your own Actor class, you can use this dictionary to pass input arguments to it.

  • actor_scheduling (Optional[Union[str, NodeAffinitySchedulingStrategy]] (default: "DEFAULT")) – Optional string ("DEFAULT" or "SPREAD") for the VCE to choose on which node the actor is placed. Advanced users who need more control can use lower-level scheduling strategies to pin actors to specific compute nodes (e.g., via NodeAffinitySchedulingStrategy). Please note this is an advanced feature. For all details, please refer to the Ray documentation: https://docs.ray.io/en/latest/ray-core/scheduling/index.html

Returns:

hist – Object containing metrics from training.

Return type:

flwr.server.history.History
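A typical launch might look like the sketch below. It assumes a hypothetical FlowerClient class defined elsewhere and requires flwr with the simulation extra (which pulls in Ray); the launch is wrapped in a function so the structure can be shown without actually starting Ray:

```python
# Hypothetical start_simulation launch script (sketch; requires
# `flwr[simulation]`). `FlowerClient` is an assumed NumPyClient subclass
# defined elsewhere.
client_resources = {"num_cpus": 1, "num_gpus": 0.0}  # the documented default

def run() -> None:
    import flwr as fl

    def client_fn(cid: str):
        # Clients are ephemeral: recreate all state (model, data, ...) here.
        return FlowerClient()  # hypothetical client class

    fl.simulation.start_simulation(
        client_fn=client_fn,
        num_clients=10,
        client_resources=client_resources,
        config=fl.server.ServerConfig(num_rounds=3),
        strategy=fl.server.strategy.FedAvg(),
    )
```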

server#

Flower server.

server.start_server#

flwr.server.start_server(*, server_address: str = '[::]:8080', server: Optional[Server] = None, config: Optional[ServerConfig] = None, strategy: Optional[Strategy] = None, client_manager: Optional[ClientManager] = None, grpc_max_message_length: int = 536870912, certificates: Optional[Tuple[bytes, bytes, bytes]] = None) History[source]#

Start a Flower server using the gRPC transport layer.

Parameters:
  • server_address (Optional[str]) – The IPv4 or IPv6 address of the server. Defaults to "[::]:8080".

  • server (Optional[flwr.server.Server] (default: None)) – A server implementation, either flwr.server.Server or a subclass thereof. If no instance is provided, then start_server will create one.

  • config (Optional[ServerConfig] (default: None)) – Currently supported values are num_rounds (int, default: 1) and round_timeout in seconds (float, default: None).

  • strategy (Optional[flwr.server.Strategy] (default: None)) – An implementation of the abstract base class flwr.server.strategy.Strategy. If no strategy is provided, then start_server will use flwr.server.strategy.FedAvg.

  • client_manager (Optional[flwr.server.ClientManager] (default: None)) – An implementation of the abstract base class flwr.server.ClientManager. If no implementation is provided, then start_server will use flwr.server.client_manager.SimpleClientManager.

  • grpc_max_message_length (int (default: 536_870_912, this equals 512MB)) – The maximum length of gRPC messages that can be exchanged with the Flower clients. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower clients need to be started with the same value (see flwr.client.start_client), otherwise clients will not know about the increased limit and block larger messages.

  • certificates (Tuple[bytes, bytes, bytes] (default: None)) –

    Tuple containing root certificate, server certificate, and private key to start a secure SSL-enabled server. The tuple is expected to have three byte strings in the following order:

    • CA certificate.

    • server certificate.

    • server private key.

Returns:

hist – Object containing training and evaluation metrics.

Return type:

flwr.server.history.History

Examples

Starting an insecure server:

>>> start_server()

Starting an SSL-enabled server:

>>> start_server(
>>>     certificates=(
>>>         Path("/crts/root.pem").read_bytes(),
>>>         Path("/crts/localhost.crt").read_bytes(),
>>>         Path("/crts/localhost.key").read_bytes()
>>>     )
>>> )

server.strategy#

Contains the strategy abstraction and different implementations.

server.strategy.Strategy#

class flwr.server.strategy.Strategy[source]#

Abstract base class for server strategy implementations.

abstract aggregate_evaluate(server_round: int, results: List[Tuple[ClientProxy, EvaluateRes]], failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation results.

Parameters:
  • server_round (int) – The current round of federated learning.

  • results (List[Tuple[ClientProxy, EvaluateRes]]) – Successful updates from the previously selected and configured clients. Each pair of (ClientProxy, EvaluateRes) constitutes a successful update from one of the previously selected clients. Note that not all previously selected clients are necessarily included in this list: a client might drop out and not submit a result. For each client that did not submit an update, there should be an Exception in failures.

  • failures (List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) – Exceptions that occurred while the server was waiting for client updates.

Returns:

aggregation_result – The aggregated evaluation result. Aggregation typically uses some variant of a weighted average.

Return type:

Tuple[Optional[float], Dict[str, Scalar]]

abstract aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate training results.

Parameters:
  • server_round (int) – The current round of federated learning.

  • results (List[Tuple[ClientProxy, FitRes]]) – Successful updates from the previously selected and configured clients. Each pair of (ClientProxy, FitRes) constitutes a successful update from one of the previously selected clients. Note that not all previously selected clients are necessarily included in this list: a client might drop out and not submit a result. For each client that did not submit an update, there should be an Exception in failures.

  • failures (List[Union[Tuple[ClientProxy, FitRes], BaseException]]) – Exceptions that occurred while the server was waiting for client updates.

Returns:

parameters – If parameters are returned, then the server will treat these as the new global model parameters (i.e., it will replace the previous parameters with the ones returned from this method). If None is returned (e.g., because there were only failures and no viable results), then the server will not update the previous model parameters, the updates received in this round are discarded, and the global model parameters remain the same.

Return type:

Tuple[Optional[Parameters], Dict[str, Scalar]]

abstract configure_evaluate(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, EvaluateIns]][source]#

Configure the next round of evaluation.

Parameters:
  • server_round (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns:

evaluate_configuration – A list of tuples. Each tuple in the list identifies a ClientProxy and the EvaluateIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated evaluation.

Return type:

List[Tuple[ClientProxy, EvaluateIns]]

abstract configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training.

Parameters:
  • server_round (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns:

fit_configuration – A list of tuples. Each tuple in the list identifies a ClientProxy and the FitIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated learning.

Return type:

List[Tuple[ClientProxy, FitIns]]

abstract evaluate(server_round: int, parameters: Parameters) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate the current model parameters.

This function can be used to perform centralized (i.e., server-side) evaluation of model parameters.

Parameters:
  • server_round (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

Returns:

evaluation_result – The evaluation result, usually a Tuple containing loss and a dictionary containing task-specific metrics (e.g., accuracy).

Return type:

Optional[Tuple[float, Dict[str, Scalar]]]

abstract initialize_parameters(client_manager: ClientManager) Optional[Parameters][source]#

Initialize the (global) model parameters.

Parameters:

client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns:

parameters – If parameters are returned, then the server will treat these as the initial global model parameters.

Return type:

Optional[Parameters]

server.strategy.FedAvg#

class flwr.server.strategy.FedAvg(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable FedAvg strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None[source]#

Federated Averaging strategy.

Implementation based on https://arxiv.org/abs/1602.05629

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

aggregate_evaluate(server_round: int, results: List[Tuple[ClientProxy, EvaluateRes]], failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation losses using weighted average.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.

configure_evaluate(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, EvaluateIns]][source]#

Configure the next round of evaluation.

configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training.

evaluate(server_round: int, parameters: Parameters) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate model parameters using an evaluation function.

initialize_parameters(client_manager: ClientManager) Optional[Parameters][source]#

Initialize global model parameters.

num_evaluation_clients(num_available_clients: int) Tuple[int, int][source]#

Use a fraction of available clients for evaluation.

num_fit_clients(num_available_clients: int) Tuple[int, int][source]#

Return the sample size and the required number of available clients.
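Based on the parameter descriptions above, the sampling rule combines fraction_fit with the min_fit_clients floor. A minimal sketch of that rule (assuming it matches the documented behavior; names follow the docs):

```python
# Sketch of the num_fit_clients sampling rule described above: sample
# fraction_fit of the available clients, but never fewer than
# min_fit_clients, and require at least min_available_clients to be connected.
def num_fit_clients(num_available: int,
                    fraction_fit: float = 1.0,
                    min_fit_clients: int = 2,
                    min_available_clients: int = 2):
    sample_size = int(num_available * fraction_fit)
    return max(sample_size, min_fit_clients), min_available_clients
```

For example, with 100 available clients and fraction_fit=0.1, ten clients are sampled; with only 5 available, the min_fit_clients floor of 2 applies.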

server.strategy.FedAvgM#

class flwr.server.strategy.FedAvgM(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, server_learning_rate: float = 1.0, server_momentum: float = 0.0)[source]#

Configurable FedAvg with Momentum strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, server_learning_rate: float = 1.0, server_momentum: float = 0.0) None[source]#

Federated Averaging with Momentum strategy.

Implementation based on https://arxiv.org/pdf/1909.06335.pdf

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • server_learning_rate (float) – Server-side learning rate used in server-side optimization. Defaults to 1.0.

  • server_momentum (float) – Server-side momentum factor used for FedAvgM. Defaults to 0.0.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.

initialize_parameters(client_manager: ClientManager) Optional[Parameters][source]#

Initialize global model parameters.
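The server-side update behind FedAvgM can be sketched as follows: the aggregated client update is treated as a pseudo-gradient, accumulated in a momentum buffer, and applied with the server learning rate. This is an illustrative reconstruction from the parameter descriptions above, not the library's implementation; with server_momentum=0.0 and server_learning_rate=1.0 it reduces to plain FedAvg:

```python
import numpy as np

# Sketch of a FedAvgM-style server update (illustrative; variable names
# are hypothetical). The pseudo-gradient is the difference between the
# current global weights and the aggregated client weights.
def fedavgm_step(global_w, aggregated_w, momentum_buf,
                 server_learning_rate=1.0, server_momentum=0.0):
    pseudo_gradient = global_w - aggregated_w
    momentum_buf = server_momentum * momentum_buf + pseudo_gradient
    new_global = global_w - server_learning_rate * momentum_buf
    return new_global, momentum_buf

w = np.array([1.0, 2.0])
agg = np.array([0.5, 1.5])
new_w, buf = fedavgm_step(w, agg, np.zeros(2))
```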

server.strategy.FedMedian#

class flwr.server.strategy.FedMedian(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable FedMedian strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None#

Federated Median (FedMedian) strategy.

Implementation based on https://arxiv.org/abs/1803.01498

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using median.
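Coordinate-wise median aggregation, which this method applies to the client parameter updates, is robust to a minority of outlying (or Byzantine) clients. A minimal plain-Python sketch of the idea, not the library's internal implementation:

```python
from statistics import median

def aggregate_median(client_updates):
    """Coordinate-wise median over a list of flat parameter vectors."""
    return [median(coord) for coord in zip(*client_updates)]

updates = [
    [0.1, 1.0],
    [0.2, 1.1],
    [9.9, 1.2],  # outlier in the first coordinate barely moves the median
]
result = aggregate_median(updates)  # [0.2, 1.1]
```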

server.strategy.QFedAvg#

class flwr.server.strategy.QFedAvg(*, q_param: float = 0.2, qffl_learning_rate: float = 0.1, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 1, min_evaluate_clients: int = 1, min_available_clients: int = 1, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable QFedAvg strategy implementation.

__init__(*, q_param: float = 0.2, qffl_learning_rate: float = 0.1, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 1, min_evaluate_clients: int = 1, min_available_clients: int = 1, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None[source]#

Fair Resource Allocation in Federated Learning (q-FedAvg) strategy.

Implementation based on https://arxiv.org/abs/1905.10497

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 1.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 1.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 1.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

aggregate_evaluate(server_round: int, results: List[Tuple[ClientProxy, EvaluateRes]], failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation losses using weighted average.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.

configure_evaluate(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, EvaluateIns]][source]#

Configure the next round of evaluation.

configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training.

num_evaluation_clients(num_available_clients: int) Tuple[int, int][source]#

Use a fraction of available clients for evaluation.

num_fit_clients(num_available_clients: int) Tuple[int, int][source]#

Return the sample size and the required number of available clients.
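The idea behind q-FedAvg is that aggregation is biased toward clients with higher loss: each client's contribution is scaled roughly in proportion to its loss raised to the power q, so q = 0 recovers uniform FedAvg-style weighting and larger q pushes the solution toward a fairer loss distribution. A simplified sketch of that reweighting (an illustration of the idea only; the actual strategy additionally applies a learning-rate-based normalization):

```python
def qffl_weights(client_losses, q):
    """Relative aggregation weights proportional to loss ** q."""
    raw = [loss ** q for loss in client_losses]
    total = sum(raw)
    return [r / total for r in raw]

uniform = qffl_weights([1.0, 4.0], q=0)  # [0.5, 0.5], like FedAvg
fair = qffl_weights([1.0, 4.0], q=1)     # [0.2, 0.8], high-loss client dominates
```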

server.strategy.FaultTolerantFedAvg#

class flwr.server.strategy.FaultTolerantFedAvg(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 1, min_evaluate_clients: int = 1, min_available_clients: int = 1, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, min_completion_rate_fit: float = 0.5, min_completion_rate_evaluate: float = 0.5, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable fault-tolerant FedAvg strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 1, min_evaluate_clients: int = 1, min_available_clients: int = 1, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, min_completion_rate_fit: float = 0.5, min_completion_rate_evaluate: float = 0.5, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None[source]#

Fault-tolerant Federated Averaging strategy.

Implementation based on https://arxiv.org/abs/1602.05629

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 1.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 1.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 1.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • min_completion_rate_fit (float, optional) – Minimum fraction of clients that must return successful training results for the round to be aggregated. Defaults to 0.5.

  • min_completion_rate_evaluate (float, optional) – Minimum fraction of clients that must return successful evaluation results for the round to be aggregated. Defaults to 0.5.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

aggregate_evaluate(server_round: int, results: List[Tuple[ClientProxy, EvaluateRes]], failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation losses using weighted average.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
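The fault-tolerance logic can be sketched as a completion-rate check: a round is aggregated only if the fraction of clients that returned results meets the configured threshold. A simplified illustration, not the library's internal code:

```python
def should_aggregate(num_results, num_failures, min_completion_rate=0.5):
    """Accept a round only if enough clients completed successfully."""
    total = num_results + num_failures
    if total == 0:
        return False
    return num_results / total >= min_completion_rate

ok = should_aggregate(3, 1)       # True:  3/4 = 0.75 >= 0.5
rejected = should_aggregate(1, 3) # False: 1/4 = 0.25 <  0.5
```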

server.strategy.FedOpt#

class flwr.server.strategy.FedOpt(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.1, eta_l: float = 0.1, beta_1: float = 0.0, beta_2: float = 0.0, tau: float = 1e-09)[source]#

Configurable FedOpt strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.1, eta_l: float = 0.1, beta_1: float = 0.0, beta_2: float = 0.0, tau: float = 1e-09) None[source]#

Federated Optimization (FedOpt) strategy interface.

Implementation based on https://arxiv.org/abs/2003.00295v5

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • eta (float, optional) – Server-side learning rate. Defaults to 1e-1.

  • eta_l (float, optional) – Client-side learning rate. Defaults to 1e-1.

  • beta_1 (float, optional) – Momentum parameter. Defaults to 0.0.

  • beta_2 (float, optional) – Second moment parameter. Defaults to 0.0.

  • tau (float, optional) – Controls the algorithm’s degree of adaptability. Defaults to 1e-9.
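The concrete variants built on this interface (FedAdagrad, FedAdam, FedYogi) share the server update shape from Reddi et al. and differ only in how the second moment v_t accumulates. A scalar sketch of that shared step, with the variant supplied as a function; this is a simplified illustration, not the library's code, and delta stands for the aggregated pseudo-gradient:

```python
import math

def fedopt_step(x, delta, m, v, update_v, eta=0.1, beta_1=0.0, tau=1e-9):
    """One shared FedOpt server step; `update_v` selects the variant."""
    m = beta_1 * m + (1 - beta_1) * delta   # first moment
    v = update_v(v, delta)                  # Adagrad / Adam / Yogi differ here
    x = x + eta * m / (math.sqrt(v) + tau)  # adaptive server update
    return x, m, v

adagrad_v = lambda v, d: v + d * d  # e.g. FedAdagrad: running sum of squares
x, m, v = fedopt_step(x=0.0, delta=1.0, m=0.0, v=0.0, update_v=adagrad_v)
# x moves by approximately eta = 0.1
```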

server.strategy.FedProx#

class flwr.server.strategy.FedProx(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, proximal_mu: float)[source]#

Configurable FedProx strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, proximal_mu: float) None[source]#

Federated Optimization strategy.

Implementation based on https://arxiv.org/abs/1812.06127

The strategy in itself is no different from FedAvg; it is the client that needs to be adjusted. A proximal term needs to be added to the loss function during training:

\[\frac{\mu}{2} \lVert w - w^t \rVert^2\]

where $w^t$ denotes the global parameters and $w$ the local weights that will be optimized.

In PyTorch, for example, the loss would go from:

loss = criterion(net(inputs), labels)

To:

proximal_term = 0.0
for local_weights, global_weights in zip(net.parameters(), global_params):
    proximal_term += (local_weights - global_weights).norm(2) ** 2
loss = criterion(net(inputs), labels) + (config["proximal_mu"] / 2) * proximal_term

With global_params being a copy of the parameters taken before training starts:

global_params = copy.deepcopy(net).parameters()
Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • proximal_mu (float) – The weight of the proximal term used in the optimization. 0.0 makes this strategy equivalent to FedAvg, and the higher the coefficient, the more regularization will be used (that is, the client parameters will need to be closer to the server parameters during training).

configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training.

Sends the proximal factor mu to the clients.
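On the client side, the factor arrives through the fit configuration under the key proximal_mu. The penalty it controls can be computed as in this sketch (flat weight vectors for illustration; real clients would use framework tensors, and the helper name is an assumption):

```python
def fedprox_penalty(local_weights, global_weights, mu):
    """FedProx proximal term: (mu / 2) * ||w - w_t||^2."""
    sq_dist = sum((w - g) ** 2 for w, g in zip(local_weights, global_weights))
    return (mu / 2) * sq_dist

config = {"proximal_mu": 0.1}  # as delivered by configure_fit
penalty = fedprox_penalty([1.0, 2.0], [0.0, 0.0], config["proximal_mu"])
# approximately 0.25 (0.05 * 5.0)
```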

server.strategy.FedAdagrad#

class flwr.server.strategy.FedAdagrad(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, eta: float = 0.1, eta_l: float = 0.1, tau: float = 1e-09)[source]#

FedAdagrad strategy - Adaptive Federated Optimization using Adagrad.

Paper: https://arxiv.org/abs/2003.00295

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, eta: float = 0.1, eta_l: float = 0.1, tau: float = 1e-09) None[source]#

Federated learning strategy using Adagrad on server-side.

Implementation based on https://arxiv.org/abs/2003.00295v5

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters) – Initial global model parameters.

  • eta (float, optional) – Server-side learning rate. Defaults to 1e-1.

  • eta_l (float, optional) – Client-side learning rate. Defaults to 1e-1.

  • tau (float, optional) – Controls the algorithm’s degree of adaptability. Defaults to 1e-9.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
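FedAdagrad accumulates the full running sum of squared pseudo-gradients, so the effective server step size only ever decays. The second-moment update reduces to the following scalar sketch (an illustration, not the library's code):

```python
def adagrad_second_moment(v, delta):
    """Adagrad: monotonically growing sum of squared pseudo-gradients."""
    return v + delta * delta

v = 0.0
for delta in [1.0, 2.0, 3.0]:
    v = adagrad_second_moment(v, delta)
# v == 14.0, so the step eta / (sqrt(v) + tau) keeps shrinking
```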

server.strategy.FedAdam#

class flwr.server.strategy.FedAdam(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.1, eta_l: float = 0.1, beta_1: float = 0.9, beta_2: float = 0.99, tau: float = 1e-09)[source]#

FedAdam - Adaptive Federated Optimization using Adam.

Paper: https://arxiv.org/abs/2003.00295

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.1, eta_l: float = 0.1, beta_1: float = 0.9, beta_2: float = 0.99, tau: float = 1e-09) None[source]#

Federated learning strategy using Adam on server-side.

Implementation based on https://arxiv.org/abs/2003.00295v5

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • eta (float, optional) – Server-side learning rate. Defaults to 1e-1.

  • eta_l (float, optional) – Client-side learning rate. Defaults to 1e-1.

  • beta_1 (float, optional) – Momentum parameter. Defaults to 0.9.

  • beta_2 (float, optional) – Second moment parameter. Defaults to 0.99.

  • tau (float, optional) – Controls the algorithm’s degree of adaptability. Defaults to 1e-9.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
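FedAdam replaces Adagrad's running sum with exponential moving averages of both moments, so old pseudo-gradients are forgotten at rates beta_1 and beta_2. The second-moment update reduces to the following scalar sketch (an illustration, not the library's code):

```python
def adam_second_moment(v, delta, beta_2=0.99):
    """Adam: exponential moving average of squared pseudo-gradients."""
    return beta_2 * v + (1 - beta_2) * delta * delta

v = adam_second_moment(v=0.0, delta=2.0)
# approximately 0.04 (0.99 * 0 + 0.01 * 4)
```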

server.strategy.FedYogi#

class flwr.server.strategy.FedYogi(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.01, eta_l: float = 0.0316, beta_1: float = 0.9, beta_2: float = 0.99, tau: float = 0.001)[source]#

FedYogi [Reddi et al., 2020] strategy.

Adaptive Federated Optimization using Yogi.

Paper: https://arxiv.org/abs/2003.00295

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Parameters, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, eta: float = 0.01, eta_l: float = 0.0316, beta_1: float = 0.9, beta_2: float = 0.99, tau: float = 0.001) None[source]#

Federated learning strategy using Yogi on the server side.

Implementation based on https://arxiv.org/abs/2003.00295v5

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • eta (float, optional) – Server-side learning rate. Defaults to 1e-2.

  • eta_l (float, optional) – Client-side learning rate. Defaults to 0.0316.

  • beta_1 (float, optional) – Momentum parameter. Defaults to 0.9.

  • beta_2 (float, optional) – Second moment parameter. Defaults to 0.99.

  • tau (float, optional) – Controls the algorithm's degree of adaptability. Defaults to 1e-3.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
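The server-side Yogi step applied after aggregation can be illustrated with a minimal pure-Python sketch of the update rule from the paper. This is an illustration only, not the flwr implementation; the function name and the flat list-of-floats representation are assumptions:

```python
import math

def yogi_update(weights, delta, m, v,
                eta=0.01, beta_1=0.9, beta_2=0.99, tau=0.001):
    """One coordinate-wise server-side Yogi step.

    delta is the aggregated pseudo-gradient from the clients;
    m and v are the running first and second moments kept across rounds."""
    new_w, new_m, new_v = [], [], []
    for w, d, mi, vi in zip(weights, delta, m, v):
        mi = beta_1 * mi + (1 - beta_1) * d  # first moment (momentum)
        # Yogi second moment: additive, sign-controlled update instead of Adam's EMA
        vi = vi - (1 - beta_2) * d * d * math.copysign(1.0, vi - d * d)
        new_w.append(w + eta * mi / (math.sqrt(vi) + tau))
        new_m.append(mi)
        new_v.append(vi)
    return new_w, new_m, new_v
```

In the strategy itself the moments persist across rounds on the server; tau prevents division by zero and controls how adaptive the step sizes are.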

server.strategy.FedTrimmedAvg#

class flwr.server.strategy.FedTrimmedAvg(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, beta: float = 0.2)[source]#

Federated Averaging with Trimmed Mean [Dong Yin et al., 2018].

Paper: https://arxiv.org/abs/1803.01498

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, beta: float = 0.2) None[source]#

Federated Averaging with Trimmed Mean [Dong Yin et al., 2018].

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • beta (float, optional) – Fraction to cut off of both tails of the distribution. Defaults to 0.2.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using trimmed average.
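The coordinate-wise trimmed mean can be sketched in plain Python. This is an illustrative sketch of the aggregation rule, not the flwr implementation; the function name is an assumption:

```python
def trimmed_mean(values, beta=0.2):
    """Drop the beta fraction of values from each tail, then average the rest."""
    ordered = sorted(values)
    cut = int(beta * len(ordered))  # number of entries removed at each end
    kept = ordered[cut:len(ordered) - cut] if cut else ordered
    return sum(kept) / len(kept)
```

With beta=0.2 and five client values, the single largest and single smallest value are discarded, so one arbitrarily corrupted client contribution cannot move the aggregate.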

server.strategy.Krum#

class flwr.server.strategy.Krum(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, num_malicious_clients: int = 0, num_clients_to_keep: int = 0, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable Krum strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, num_malicious_clients: int = 0, num_clients_to_keep: int = 0, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None[source]#

Krum strategy.

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • num_malicious_clients (int, optional) – Number of malicious clients in the system. Defaults to 0.

  • num_clients_to_keep (int, optional) – Number of clients to keep before averaging (MultiKrum). Defaults to 0, in that case classical Krum is applied.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using Krum.
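The Krum selection rule can be sketched in plain Python: each update is scored by the summed squared distances to its n - f - 2 nearest neighbours, and the update with the lowest score is selected. An illustrative sketch under the assumption that updates are flat float lists, not the flwr implementation:

```python
def krum(updates, num_malicious):
    """Classical Krum: return the update with the lowest neighbour-distance score."""
    n = len(updates)
    closest = n - num_malicious - 2  # number of nearest neighbours to score against

    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    scores = []
    for i, u in enumerate(updates):
        dists = sorted(sq_dist(u, v) for j, v in enumerate(updates) if j != i)
        scores.append(sum(dists[:closest]))
    return updates[min(range(n), key=scores.__getitem__)]
```

With num_clients_to_keep > 0, MultiKrum instead keeps the best-scoring updates and averages them.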

server.strategy.FedXgbNnAvg#

class flwr.server.strategy.FedXgbNnAvg(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable FedXgbNnAvg strategy implementation.

__init__(*, fraction_fit: float = 1.0, fraction_evaluate: float = 1.0, min_fit_clients: int = 2, min_evaluate_clients: int = 2, min_available_clients: int = 2, evaluate_fn: Optional[Callable[[int, List[ndarray[Any, dtype[Any]]], Dict[str, Union[bool, bytes, float, int, str]]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None#

Federated Averaging strategy.

Implementation based on https://arxiv.org/abs/1602.05629

Parameters:
  • fraction_fit (float, optional) – Fraction of clients used during training. In case min_fit_clients is larger than fraction_fit * available_clients, min_fit_clients will still be sampled. Defaults to 1.0.

  • fraction_evaluate (float, optional) – Fraction of clients used during validation. In case min_evaluate_clients is larger than fraction_evaluate * available_clients, min_evaluate_clients will still be sampled. Defaults to 1.0.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_evaluate_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • evaluate_fn (Optional[Callable[[int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Metrics aggregation function, optional.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Any], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
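The example-count-weighted average behind aggregate_fit can be sketched independently of the flwr types. Parameters are flattened to plain float lists here purely as an assumption for illustration:

```python
def weighted_average(results):
    """FedAvg-style aggregation: average parameters weighted by num_examples.

    results is a list of (parameters, num_examples) pairs."""
    total = sum(n for _, n in results)
    dim = len(results[0][0])
    return [sum(p[i] * n for p, n in results) / total for i in range(dim)]
```

Clients that trained on more local examples therefore contribute proportionally more to the aggregated model.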

evaluate(server_round: int, parameters: Any) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate model parameters using an evaluation function.

server.strategy.DPFedAvgAdaptive#

class flwr.server.strategy.DPFedAvgAdaptive(strategy: Strategy, num_sampled_clients: int, init_clip_norm: float = 0.1, noise_multiplier: float = 1.0, server_side_noising: bool = True, clip_norm_lr: float = 0.2, clip_norm_target_quantile: float = 0.5, clip_count_stddev: Optional[float] = None)[source]#

Wrapper for configuring a Strategy for DP with Adaptive Clipping.

__init__(strategy: Strategy, num_sampled_clients: int, init_clip_norm: float = 0.1, noise_multiplier: float = 1.0, server_side_noising: bool = True, clip_norm_lr: float = 0.2, clip_norm_target_quantile: float = 0.5, clip_count_stddev: Optional[float] = None) None[source]#
aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate training results as in DPFedAvgFixed and update clip norms.

configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training.

server.strategy.DPFedAvgFixed#

class flwr.server.strategy.DPFedAvgFixed(strategy: Strategy, num_sampled_clients: int, clip_norm: float, noise_multiplier: float = 1, server_side_noising: bool = True)[source]#

Wrapper for configuring a Strategy for DP with Fixed Clipping.

__init__(strategy: Strategy, num_sampled_clients: int, clip_norm: float, noise_multiplier: float = 1, server_side_noising: bool = True) None[source]#
aggregate_evaluate(server_round: int, results: List[Tuple[ClientProxy, EvaluateRes]], failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation losses using the given strategy.

aggregate_fit(server_round: int, results: List[Tuple[ClientProxy, FitRes]], failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]]) Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate training results using unweighted aggregation.
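Fixed clipping with server-side noising can be sketched as follows. The noise-stddev scaling used below is a common convention but an assumption here, not necessarily the exact scaling in flwr:

```python
import math
import random

def clip_update(update, clip_norm):
    """Scale the update so its L2 norm is at most clip_norm."""
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in update]

def noise_update(update, clip_norm, noise_multiplier, num_sampled_clients,
                 rng=random):
    """Add Gaussian noise calibrated to the clip norm (assumed scaling)."""
    stddev = noise_multiplier * clip_norm / num_sampled_clients
    return [x + rng.gauss(0.0, stddev) for x in update]
```

Clipping bounds each client's contribution (its sensitivity), which is what makes the added Gaussian noise yield a differential privacy guarantee.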

configure_evaluate(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, EvaluateIns]][source]#

Configure the next round of evaluation using the specified strategy.

Parameters:
  • server_round (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns:

evaluate_configuration – A list of tuples. Each tuple in the list identifies a ClientProxy and the EvaluateIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated evaluation.

Return type:

List[Tuple[ClientProxy, EvaluateIns]]

configure_fit(server_round: int, parameters: Parameters, client_manager: ClientManager) List[Tuple[ClientProxy, FitIns]][source]#

Configure the next round of training incorporating Differential Privacy (DP).

Configuration of the next training round includes information related to DP, such as clip norm and noise stddev.

Parameters:
  • server_round (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns:

fit_configuration – A list of tuples. Each tuple in the list identifies a ClientProxy and the FitIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated learning.

Return type:

List[Tuple[ClientProxy, FitIns]]

evaluate(server_round: int, parameters: Parameters) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate model parameters using an evaluation function from the strategy.

initialize_parameters(client_manager: ClientManager) Optional[Parameters][source]#

Initialize global model parameters using given strategy.

common#

Common components shared between server and client.

class flwr.common.ClientMessage(get_properties_res: Optional[GetPropertiesRes] = None, get_parameters_res: Optional[GetParametersRes] = None, fit_res: Optional[FitRes] = None, evaluate_res: Optional[EvaluateRes] = None)[source]#

ClientMessage is a container used to hold one result message.

class flwr.common.Code(value)[source]#

Client status codes.

class flwr.common.DisconnectRes(reason: str)[source]#

DisconnectRes message from client to server.

class flwr.common.EvaluateIns(parameters: Parameters, config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Evaluate instructions for a client.

class flwr.common.EvaluateRes(status: Status, loss: float, num_examples: int, metrics: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Evaluate response from a client.

class flwr.common.EventType(value)[source]#

Types of telemetry events.

class flwr.common.FitIns(parameters: Parameters, config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Fit instructions for a client.

class flwr.common.FitRes(status: Status, parameters: Parameters, num_examples: int, metrics: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Fit response from a client.

class flwr.common.GetParametersIns(config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Parameters request for a client.

class flwr.common.GetParametersRes(status: Status, parameters: Parameters)[source]#

Response when asked to return parameters.

class flwr.common.GetPropertiesIns(config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Properties request for a client.

class flwr.common.GetPropertiesRes(status: Status, properties: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Properties response from a client.

class flwr.common.Parameters(tensors: List[bytes], tensor_type: str)[source]#

Model parameters.

class flwr.common.ReconnectIns(seconds: Optional[int])[source]#

ReconnectIns message from server to client.

class flwr.common.ServerMessage(get_properties_ins: Optional[GetPropertiesIns] = None, get_parameters_ins: Optional[GetParametersIns] = None, fit_ins: Optional[FitIns] = None, evaluate_ins: Optional[EvaluateIns] = None)[source]#

ServerMessage is a container used to hold one instruction message.

class flwr.common.Status(code: Code, message: str)[source]#

Client status.

flwr.common.bytes_to_ndarray(tensor: bytes) ndarray[Any, dtype[Any]][source]#

Deserialize NumPy ndarray from bytes.

flwr.common.configure(identifier: str, filename: Optional[str] = None, host: Optional[str] = None) None[source]#

Configure logging to file and/or remote log server.

flwr.common.log(level, msg, *args, **kwargs)[source]#

Log 'msg % args' with the integer severity 'level'.

To pass exception information, use the keyword argument exc_info with a true value, e.g.

logger.log(level, "We have a %s", "mysterious problem", exc_info=1)

flwr.common.ndarray_to_bytes(ndarray: ndarray[Any, dtype[Any]]) bytes[source]#

Serialize NumPy ndarray to bytes.

flwr.common.ndarrays_to_parameters(ndarrays: List[ndarray[Any, dtype[Any]]]) Parameters[source]#

Convert NumPy ndarrays to parameters object.

flwr.common.now() datetime[source]#

Construct a datetime from time.time() with time zone set to UTC.

flwr.common.parameters_to_ndarrays(parameters: Parameters) List[ndarray[Any, dtype[Any]]][source]#

Convert parameters object to NumPy ndarrays.
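The helpers above round-trip model parameters between NumPy ndarrays, raw bytes, and the Parameters container. A stdlib-only analogue of that round trip, using flat float lists and struct in place of NumPy serialization, purely for illustration:

```python
import struct

# Conceptual stand-ins for ndarray_to_bytes / bytes_to_ndarray:
# a flat list of float64 values packed as little-endian doubles.
def floats_to_bytes(values):
    return struct.pack(f"<{len(values)}d", *values)

def bytes_to_floats(blob):
    return list(struct.unpack(f"<{len(blob) // 8}d", blob))
```

In flwr itself, ndarrays_to_parameters applies the per-array serialization to each ndarray and wraps the resulting list of byte strings in a Parameters object together with a tensor_type tag, and parameters_to_ndarrays inverts that.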