API Reference - flwr#

client#

Flower client.

Client#

class flwr.client.Client[source]#

Abstract base class for Flower clients.

abstract evaluate(ins: flwr.common.typing.EvaluateIns) flwr.common.typing.EvaluateRes[source]#

Evaluate the provided weights using the locally held dataset.

Parameters

ins (EvaluateIns) – The evaluation instructions containing (global) model parameters received from the server and a dictionary of configuration values used to customize the local evaluation process.

Returns

The evaluation result containing the loss on the local dataset and other details such as the number of local data examples used for evaluation.

Return type

EvaluateRes

abstract fit(ins: flwr.common.typing.FitIns) flwr.common.typing.FitRes[source]#

Refine the provided weights using the locally held dataset.

Parameters

ins (FitIns) – The training instructions containing (global) model parameters received from the server and a dictionary of configuration values used to customize the local training process.

Returns

The training result containing updated parameters and other details such as the number of local training examples used for training.

Return type

FitRes

abstract get_parameters() flwr.common.typing.ParametersRes[source]#

Return the current local model parameters.

Returns

The current local model parameters.

Return type

ParametersRes

get_properties(ins: flwr.common.typing.PropertiesIns) flwr.common.typing.PropertiesRes[source]#

Return the client’s set of properties.

Parameters

ins (PropertiesIns) – The property request received from the server, containing a dictionary of configuration values that can be used to customize which properties are returned.

Returns

Client’s properties.

Return type

PropertiesRes
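
Example

A minimal sketch of a Client subclass, assuming the flwr.common.typing dataclasses documented in the typing section below; the single-array model, the simplified tobytes() serialization, and the placeholder training/evaluation logic are purely illustrative:

>>> import numpy as np
>>> from flwr.client import Client
>>> from flwr.common.typing import (
>>>     EvaluateIns, EvaluateRes, FitIns, FitRes, Parameters, ParametersRes,
>>> )
>>>
>>> class InMemoryClient(Client):
>>>     """Toy client that keeps a single float32 weight vector in memory."""
>>>
>>>     def __init__(self) -> None:
>>>         self.weights = np.zeros(10, dtype=np.float32)
>>>
>>>     def _to_parameters(self) -> Parameters:
>>>         # Simplified serialization for illustration only
>>>         return Parameters(tensors=[self.weights.tobytes()], tensor_type="numpy.ndarray")
>>>
>>>     def get_parameters(self) -> ParametersRes:
>>>         return ParametersRes(parameters=self._to_parameters())
>>>
>>>     def fit(self, ins: FitIns) -> FitRes:
>>>         # Deserialize the (global) parameters received from the server
>>>         self.weights = np.frombuffer(ins.parameters.tensors[0], dtype=np.float32).copy()
>>>         # ... local training would go here, customized via ins.config ...
>>>         return FitRes(parameters=self._to_parameters(), num_examples=1, metrics={})
>>>
>>>     def evaluate(self, ins: EvaluateIns) -> EvaluateRes:
>>>         # ... local evaluation of ins.parameters would go here ...
>>>         return EvaluateRes(loss=0.0, num_examples=1, metrics={})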

start_client#

flwr.client.start_client(server_address: str, client: flwr.client.client.Client, grpc_max_message_length: int = 536870912, root_certificates: Optional[bytes] = None) None[source]#

Start a Flower Client which connects to a gRPC server.

Parameters
  • server_address (str) – The IPv6 address of the server. If the Flower server runs on the same machine on port 8080, then server_address would be “[::]:8080”.

  • client (flwr.client.Client) – An implementation of the abstract base class flwr.client.Client.

  • grpc_max_message_length (int, default: 536_870_912 (512 MB)) – The maximum length of gRPC messages that can be exchanged with the Flower server. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower server needs to be started with the same value (see flwr.server.start_server), otherwise it will not know about the increased limit and block larger messages.

  • root_certificates (bytes, default: None) – The PEM-encoded root certificates as a byte string. If provided, a secure connection using the certificates will be established to an SSL-enabled Flower server.

Return type

None

Examples

Starting a client with an insecure server connection:

>>> start_client(
>>>     server_address="localhost:8080",
>>>     client=FlowerClient(),
>>> )

Starting an SSL-enabled client:

>>> from pathlib import Path
>>> start_client(
>>>     server_address="localhost:8080",
>>>     client=FlowerClient(),
>>>     root_certificates=Path("/crts/root.pem").read_bytes(),
>>> )

NumPyClient#

class flwr.client.NumPyClient[source]#

Abstract base class for Flower clients using NumPy.

abstract evaluate(parameters: List[numpy.ndarray], config: Dict[str, Union[bool, bytes, float, int, str]]) Tuple[float, int, Dict[str, Union[bool, bytes, float, int, str]]][source]#

Evaluate the provided weights using the locally held dataset.

Parameters
  • parameters (List[np.ndarray]) – The current (global) model parameters.

  • config (Dict[str, Scalar]) – Configuration parameters which allow the server to influence evaluation on the client. It can be used to communicate arbitrary values from the server to the client, for example, to influence the number of examples used for evaluation.

Returns

  • loss (float) – The evaluation loss of the model on the local dataset.

  • num_examples (int) – The number of examples used for evaluation.

  • metrics (Dict[str, Scalar]) – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary values back to the server.

Warning

The previous return type format (int, float, float) and the extended format (int, float, float, Dict[str, Scalar]) have been deprecated and removed since Flower 0.19.

abstract fit(parameters: List[numpy.ndarray], config: Dict[str, Union[bool, bytes, float, int, str]]) Tuple[List[numpy.ndarray], int, Dict[str, Union[bool, bytes, float, int, str]]][source]#

Train the provided parameters using the locally held dataset.

Parameters
  • parameters (List[numpy.ndarray]) – The current (global) model parameters.

  • config (Dict[str, Scalar]) – Configuration parameters which allow the server to influence training on the client. It can be used to communicate arbitrary values from the server to the client, for example, to set the number of (local) training epochs.

Returns

  • parameters (List[numpy.ndarray]) – The locally updated model parameters.

  • num_examples (int) – The number of examples used for training.

  • metrics (Dict[str, Scalar]) – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary values back to the server.

abstract get_parameters() List[numpy.ndarray][source]#

Return the current local model parameters.

Returns

parameters – The local model parameters as a list of NumPy ndarrays.

Return type

List[numpy.ndarray]

get_properties(config: Dict[str, Union[bool, bytes, float, int, str]]) Dict[str, Union[bool, bytes, float, int, str]][source]#

Return the client’s set of properties.

Parameters

config (Config) – Configuration parameters requested by the server. This can be used to tell the client which parameters are needed along with some Scalar attributes.

Returns

properties – A dictionary mapping arbitrary string keys to values of type bool, bytes, float, int, or str. It can be used to communicate arbitrary property values back to the server.

Return type

Dict[str, Scalar]
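
Example

A minimal NumPyClient sketch; the single-array model and the placeholder training/evaluation logic are illustrative, and the FlowerClient name matches the one used in the start_numpy_client examples below:

>>> import numpy as np
>>> from flwr.client import NumPyClient
>>>
>>> class FlowerClient(NumPyClient):
>>>     """Toy client holding one weight array; training and evaluation are placeholders."""
>>>
>>>     def __init__(self) -> None:
>>>         self.weights = [np.zeros(10, dtype=np.float32)]
>>>
>>>     def get_parameters(self):
>>>         return self.weights
>>>
>>>     def fit(self, parameters, config):
>>>         self.weights = parameters
>>>         # ... local training would go here, e.g. using values from config ...
>>>         return self.weights, 1, {}
>>>
>>>     def evaluate(self, parameters, config):
>>>         # ... local evaluation of the received parameters would go here ...
>>>         return 0.0, 1, {"accuracy": 0.0}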

start_numpy_client#

flwr.client.start_numpy_client(server_address: str, client: flwr.client.numpy_client.NumPyClient, grpc_max_message_length: int = 536870912, root_certificates: Optional[bytes] = None) None[source]#

Start a Flower NumPyClient which connects to a gRPC server.

Parameters
  • server_address (str) – The IPv6 address of the server. If the Flower server runs on the same machine on port 8080, then server_address would be “[::]:8080”.

  • client (flwr.client.NumPyClient) – An implementation of the abstract base class flwr.client.NumPyClient.

  • grpc_max_message_length (int, default: 536_870_912 (512 MB)) – The maximum length of gRPC messages that can be exchanged with the Flower server. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower server needs to be started with the same value (see flwr.server.start_server), otherwise it will not know about the increased limit and block larger messages.

  • root_certificates (bytes, default: None) – The PEM-encoded root certificates as a byte string. If provided, a secure connection using the certificates will be established to an SSL-enabled Flower server.

Return type

None

Examples

Starting a client with an insecure server connection:

>>> start_numpy_client(
>>>     server_address="localhost:8080",
>>>     client=FlowerClient(),
>>> )

Starting an SSL-enabled client:

>>> from pathlib import Path
>>> start_client(
>>>     server_address=localhost:8080,
>>>     client=FlowerClient(),
>>>     root_certificates=Path("/crts/root.pem").read_bytes(),
>>> )

server#

Flower server.

server.start_server#

flwr.server.start_server(server_address: str = '[::]:8080', server: Optional[flwr.server.server.Server] = None, config: Optional[Dict[str, Optional[Union[int, float]]]] = None, strategy: Optional[flwr.server.strategy.strategy.Strategy] = None, client_manager: Optional[flwr.server.client_manager.ClientManager] = None, grpc_max_message_length: int = 536870912, force_final_distributed_eval: bool = False, certificates: Optional[Tuple[bytes, bytes, bytes]] = None) flwr.server.history.History[source]#

Start a Flower server using the gRPC transport layer.

Parameters
  • server_address (Optional[str], default: “[::]:8080”) – The IPv6 address of the server.

  • server (Optional[flwr.server.Server], default: None) – An implementation of the abstract base class flwr.server.Server. If no instance is provided, then start_server will create one.

  • config (Optional[Dict[str, Union[int, Optional[float]]]], default: None) – Currently supported values are num_rounds (int, default: 1) and round_timeout in seconds (float, default: None). A full configuration object instructing the server to perform three rounds of federated learning with a round timeout of 10 minutes looks like the following: {“num_rounds”: 3, “round_timeout”: 600.0}.

  • strategy (Optional[flwr.server.Strategy], default: None) – An implementation of the abstract base class flwr.server.Strategy. If no strategy is provided, then start_server will use flwr.server.strategy.FedAvg.

  • client_manager (Optional[flwr.server.ClientManager], default: None) – An implementation of the abstract base class flwr.server.ClientManager. If no implementation is provided, then start_server will use flwr.server.client_manager.SimpleClientManager.

  • grpc_max_message_length (int, default: 536_870_912 (512 MB)) – The maximum length of gRPC messages that can be exchanged with the Flower clients. The default should be sufficient for most models. Users who train very large models might need to increase this value. Note that the Flower clients need to be started with the same value (see flwr.client.start_client), otherwise clients will not know about the increased limit and block larger messages.

  • force_final_distributed_eval (bool, default: False) – Forces a distributed evaluation to occur after the last training epoch when enabled.

  • certificates (Tuple[bytes, bytes, bytes], default: None) –

    Tuple containing root certificate, server certificate, and private key to start a secure SSL-enabled server. The tuple is expected to have three bytes elements in the following order:

    • CA certificate.

    • server certificate.

    • server private key.

Returns

hist – Object containing metrics from training.

Return type

flwr.server.history.History

Examples

Starting an insecure server:

>>> start_server()

Starting an SSL-enabled server:

>>> start_server(
>>>     certificates=(
>>>         Path("/crts/root.pem").read_bytes(),
>>>         Path("/crts/localhost.crt").read_bytes(),
>>>         Path("/crts/localhost.key").read_bytes()
>>>     )
>>> )
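
Starting a server that performs three rounds of federated learning with a round timeout of ten minutes, using the config keys described above (the returned History object collects the metrics recorded during the run):

>>> hist = start_server(
>>>     config={"num_rounds": 3, "round_timeout": 600.0},
>>> )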

server.strategy#

Contains the strategy abstraction and different implementations.

server.strategy.Strategy#

class flwr.server.strategy.Strategy[source]#

Abstract base class for server strategy implementations.

abstract aggregate_evaluate(rnd: int, results: List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.EvaluateRes]], failures: List[BaseException]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation results.

Parameters
  • rnd (int) – The current round of federated learning.

  • results (List[Tuple[ClientProxy, EvaluateRes]]) – Successful updates from the previously selected and configured clients. Each pair of (ClientProxy, EvaluateRes) constitutes a successful update from one of the previously selected clients. Note that not all previously selected clients are necessarily included in this list: a client might drop out and not submit a result. For each client that did not submit an update, there should be an Exception in failures.

  • failures (List[BaseException]) – Exceptions that occurred while the server was waiting for client updates.

Returns

Optional float representing the aggregated evaluation result. Aggregation typically uses some variant of a weighted average.

abstract aggregate_fit(rnd: int, results: List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.FitRes]], failures: List[BaseException]) Tuple[Optional[flwr.common.typing.Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate training results.

Parameters
  • rnd (int) – The current round of federated learning.

  • results (List[Tuple[ClientProxy, FitRes]]) – Successful updates from the previously selected and configured clients. Each pair of (ClientProxy, FitRes) constitutes a successful update from one of the previously selected clients. Note that not all previously selected clients are necessarily included in this list: a client might drop out and not submit a result. For each client that did not submit an update, there should be an Exception in failures.

  • failures (List[BaseException]) – Exceptions that occurred while the server was waiting for client updates.

Returns

parameters – If parameters are returned, then the server will treat these as the new global model parameters (i.e., it will replace the previous parameters with the ones returned from this method). If None is returned (e.g., because there were only failures and no viable results), then the server will not update the previous model parameters, the updates received in this round are discarded, and the global model parameters remain the same.

Return type

Parameters (optional)

abstract configure_evaluate(rnd: int, parameters: flwr.common.typing.Parameters, client_manager: flwr.server.client_manager.ClientManager) List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.EvaluateIns]][source]#

Configure the next round of evaluation.

Parameters
  • rnd (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns

A list of tuples. Each tuple in the list identifies a ClientProxy and the EvaluateIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated evaluation.

abstract configure_fit(rnd: int, parameters: flwr.common.typing.Parameters, client_manager: flwr.server.client_manager.ClientManager) List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.FitIns]][source]#

Configure the next round of training.

Parameters
  • rnd (int) – The current round of federated learning.

  • parameters (Parameters) – The current (global) model parameters.

  • client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns

A list of tuples. Each tuple in the list identifies a ClientProxy and the FitIns for this particular ClientProxy. If a particular ClientProxy is not included in this list, it means that this ClientProxy will not participate in the next round of federated learning.

abstract evaluate(parameters: flwr.common.typing.Parameters) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate the current model parameters.

This function can be used to perform centralized (i.e., server-side) evaluation of model parameters.

Parameters

parameters (Parameters) – The current (global) model parameters.

Returns

The evaluation result, usually a Tuple containing loss and a dictionary containing task-specific metrics (e.g., accuracy).

abstract initialize_parameters(client_manager: flwr.server.client_manager.ClientManager) Optional[flwr.common.typing.Parameters][source]#

Initialize the (global) model parameters.

Parameters

client_manager (ClientManager) – The client manager which holds all currently connected clients.

Returns

parameters – If parameters are returned, then the server will treat these as the initial global model parameters.

Return type

Parameters (optional)

server.strategy.FedAvg#

class flwr.server.strategy.FedAvg(fraction_fit: float = 0.1, fraction_eval: float = 0.1, min_fit_clients: int = 2, min_eval_clients: int = 2, min_available_clients: int = 2, eval_fn: Optional[Callable[[List[numpy.ndarray]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[flwr.common.typing.Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None)[source]#

Configurable FedAvg strategy implementation.

__init__(fraction_fit: float = 0.1, fraction_eval: float = 0.1, min_fit_clients: int = 2, min_eval_clients: int = 2, min_available_clients: int = 2, eval_fn: Optional[Callable[[List[numpy.ndarray]], Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]]]] = None, on_fit_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, on_evaluate_config_fn: Optional[Callable[[int], Dict[str, Union[bool, bytes, float, int, str]]]] = None, accept_failures: bool = True, initial_parameters: Optional[flwr.common.typing.Parameters] = None, fit_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None, evaluate_metrics_aggregation_fn: Optional[Callable[[List[Tuple[int, Dict[str, Union[bool, bytes, float, int, str]]]]], Dict[str, Union[bool, bytes, float, int, str]]]] = None) None[source]#

Federated Averaging strategy.

Implementation based on https://arxiv.org/abs/1602.05629

Parameters
  • fraction_fit (float, optional) – Fraction of clients used during training. Defaults to 0.1.

  • fraction_eval (float, optional) – Fraction of clients used during validation. Defaults to 0.1.

  • min_fit_clients (int, optional) – Minimum number of clients used during training. Defaults to 2.

  • min_eval_clients (int, optional) – Minimum number of clients used during validation. Defaults to 2.

  • min_available_clients (int, optional) – Minimum number of total clients in the system. Defaults to 2.

  • eval_fn (Callable[[Weights], Optional[Tuple[float, Dict[str, Scalar]]]]) – Optional function used for validation. Defaults to None.

  • on_fit_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure training. Defaults to None.

  • on_evaluate_config_fn (Callable[[int], Dict[str, Scalar]], optional) – Function used to configure validation. Defaults to None.

  • accept_failures (bool, optional) – Whether or not to accept rounds containing failures. Defaults to True.

  • initial_parameters (Parameters, optional) – Initial global model parameters.

  • fit_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Optional function used to aggregate training metrics. Defaults to None.

  • evaluate_metrics_aggregation_fn (Optional[MetricsAggregationFn]) – Optional function used to aggregate evaluation metrics. Defaults to None.
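
Example

A minimal sketch of constructing the strategy and passing it to flwr.server.start_server; the fit_config helper and the specific values shown are illustrative, not defaults:

>>> import flwr as fl
>>>
>>> def fit_config(rnd: int):
>>>     # Illustrative per-round configuration sent to clients; the keys are arbitrary
>>>     return {"rnd": rnd, "local_epochs": 1}
>>>
>>> strategy = fl.server.strategy.FedAvg(
>>>     fraction_fit=0.1,
>>>     min_fit_clients=2,
>>>     min_available_clients=2,
>>>     on_fit_config_fn=fit_config,
>>> )
>>> fl.server.start_server(strategy=strategy, config={"num_rounds": 3})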

aggregate_evaluate(rnd: int, results: List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.EvaluateRes]], failures: List[BaseException]) Tuple[Optional[float], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate evaluation losses using weighted average.
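
A hedged sketch of extending this method in a subclass to additionally aggregate a metric reported by clients; the "accuracy" key is an assumption about what the clients place in EvaluateRes.metrics:

>>> from flwr.server.strategy import FedAvg
>>>
>>> class FedAvgWithAccuracy(FedAvg):
>>>     def aggregate_evaluate(self, rnd, results, failures):
>>>         if results:
>>>             # Weighted average of a client-reported "accuracy" metric (assumed key)
>>>             examples = [r.num_examples for _, r in results]
>>>             accuracies = [r.metrics["accuracy"] * r.num_examples for _, r in results]
>>>             print(f"Round {rnd} accuracy: {sum(accuracies) / sum(examples):.4f}")
>>>         # Delegate loss aggregation to the built-in FedAvg behaviour
>>>         return super().aggregate_evaluate(rnd, results, failures)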

aggregate_fit(rnd: int, results: List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.FitRes]], failures: List[BaseException]) Tuple[Optional[flwr.common.typing.Parameters], Dict[str, Union[bool, bytes, float, int, str]]][source]#

Aggregate fit results using weighted average.
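
Conceptually, the weighted average used here corresponds to the following plain-NumPy sketch over already-deserialized client weights (an illustration of the arithmetic, not the library's internal code):

>>> import numpy as np
>>>
>>> def weighted_average(results):
>>>     """results: list of (weights, num_examples) pairs, where weights is a List[np.ndarray]."""
>>>     total = sum(n for _, n in results)
>>>     num_layers = len(results[0][0])
>>>     # Average each layer, weighting every client by its number of training examples
>>>     return [
>>>         sum(w[i] * n for w, n in results) / total
>>>         for i in range(num_layers)
>>>     ]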

configure_evaluate(rnd: int, parameters: flwr.common.typing.Parameters, client_manager: flwr.server.client_manager.ClientManager) List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.EvaluateIns]][source]#

Configure the next round of evaluation.

configure_fit(rnd: int, parameters: flwr.common.typing.Parameters, client_manager: flwr.server.client_manager.ClientManager) List[Tuple[flwr.server.client_proxy.ClientProxy, flwr.common.typing.FitIns]][source]#

Configure the next round of training.

evaluate(parameters: flwr.common.typing.Parameters) Optional[Tuple[float, Dict[str, Union[bool, bytes, float, int, str]]]][source]#

Evaluate model parameters using an evaluation function.

initialize_parameters(client_manager: flwr.server.client_manager.ClientManager) Optional[flwr.common.typing.Parameters][source]#

Initialize global model parameters.

num_evaluation_clients(num_available_clients: int) Tuple[int, int][source]#

Use a fraction of available clients for evaluation.

num_fit_clients(num_available_clients: int) Tuple[int, int][source]#

Return the sample size and the required number of available clients.

typing#

Flower type definitions.

class flwr.common.typing.Code(value)[source]#

Client status codes.

class flwr.common.typing.Disconnect(reason: str)[source]#

Disconnect message from client to server.

class flwr.common.typing.EvaluateIns(parameters: flwr.common.typing.Parameters, config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Evaluate instructions for a client.

class flwr.common.typing.EvaluateRes(loss: float, num_examples: int, metrics: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Evaluate response from a client.

class flwr.common.typing.FitIns(parameters: flwr.common.typing.Parameters, config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Fit instructions for a client.

class flwr.common.typing.FitRes(parameters: flwr.common.typing.Parameters, num_examples: int, metrics: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Fit response from a client.

class flwr.common.typing.Parameters(tensors: List[bytes], tensor_type: str)[source]#

Model parameters.

class flwr.common.typing.ParametersRes(parameters: flwr.common.typing.Parameters)[source]#

Response when asked to return parameters.

class flwr.common.typing.PropertiesIns(config: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Properties requests for a client.

class flwr.common.typing.PropertiesRes(status: flwr.common.typing.Status, properties: Dict[str, Union[bool, bytes, float, int, str]])[source]#

Properties response from a client.

class flwr.common.typing.Reconnect(seconds: Optional[int])[source]#

Reconnect message from server to client.

class flwr.common.typing.Status(code: flwr.common.typing.Code, message: str)[source]#

Client status.
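
Example

A small sketch of constructing these dataclasses directly; the tensor_type string and the simplified tobytes() serialization are illustrative:

>>> import numpy as np
>>> from flwr.common.typing import FitRes, Parameters
>>>
>>> arr = np.arange(4, dtype=np.float32)
>>> params = Parameters(tensors=[arr.tobytes()], tensor_type="numpy.ndarray")
>>> fit_res = FitRes(parameters=params, num_examples=32, metrics={"loss": 0.5})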