# Flower Example using XGBoost
This example demonstrates how to perform EXtreme Gradient Boosting (XGBoost) within Flower using the `xgboost` package. We use the HIGGS dataset to perform a binary classification task. A tree-based bagging method is used for aggregation on the server.
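To give a sense of the local training step before diving in, here is a minimal, self-contained sketch of fitting an XGBoost model for binary classification. The synthetic data and all parameter values below are illustrative placeholders; the actual example trains on partitions of HIGGS with its own settings (see `client.py`):

```python
import numpy as np
import xgboost as xgb

# Synthetic stand-in for one client's HIGGS partition (28 features, binary label)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 28))
y = rng.integers(0, 2, size=1000)
dtrain = xgb.DMatrix(X, label=y)

params = {
    "objective": "binary:logistic",  # binary classification, as with HIGGS
    "eta": 0.1,                      # illustrative learning rate
    "max_depth": 8,                  # illustrative tree depth
    "eval_metric": "auc",
}
# One local boosting round; in federated training, the resulting trees are
# sent to the server and aggregated across clients via bagging
bst = xgb.train(params, dtrain, num_boost_round=1)
```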
This project provides a minimal code example to enable you to get started quickly. For a more comprehensive code example, take a look at [xgboost-comprehensive](https://github.com/adap/flower/tree/main/examples/xgboost-comprehensive).
Start by cloning the example project. We prepared a single-line command that you can copy into your shell which will check out the example for you:
```shell
git clone --depth=1 https://github.com/adap/flower.git && mv flower/examples/xgboost-quickstart . && rm -rf flower && cd xgboost-quickstart
```
This will create a new directory called `xgboost-quickstart` containing the following files:

```shell
-- README.md         <- You're reading this right now
-- server.py         <- Defines the server-side logic
-- client.py         <- Defines the client-side logic
-- run.sh            <- Commands to run experiments
-- pyproject.toml    <- Example dependencies (if you use Poetry)
-- requirements.txt  <- Example dependencies
```
Project dependencies (such as `flwr`) are defined in `pyproject.toml` and `requirements.txt`. We recommend [Poetry](https://python-poetry.org/docs/) to install those dependencies and manage your virtual environment, or pip, but feel free to use a different way of installing dependencies and managing virtual environments if you have other preferences.
To install the dependencies with Poetry, run:

```shell
poetry install
poetry shell
```
Poetry will install all your dependencies in a newly created virtual environment. To verify that everything works correctly you can run the following command:
```shell
poetry run python3 -c "import flwr"
```
If you don’t see any errors you’re good to go!
If you prefer pip, run the command below in your terminal to install the dependencies listed in `requirements.txt`:
```shell
pip install -r requirements.txt
```
## Run Federated Learning with XGBoost and Flower
Afterwards you are ready to start the Flower server as well as the clients. You can simply start the server in a terminal as follows:
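```shell
python3 server.py
```

Under the hood, `server.py` configures the tree-based bagging aggregation mentioned above. As a rough sketch, assuming Flower's built-in `FedXgbBagging` strategy; the round count and client thresholds below are illustrative, not necessarily the example's actual values:

```python
import flwr as fl
from flwr.server.strategy import FedXgbBagging

# Aggregate client boosters by bagging their trees each round
strategy = FedXgbBagging(
    fraction_fit=1.0,        # sample all available clients each round
    min_fit_clients=2,       # wait until both clients are connected
    min_available_clients=2,
)

fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=5),  # illustrative round count
    strategy=strategy,
)
```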
Now you are ready to start the Flower clients that will participate in the learning. To do so, simply open two more terminal windows and run the following commands.
Start client 1 in the first terminal:
```shell
python3 client.py --node-id=0
```
Start client 2 in the second terminal:
```shell
python3 client.py --node-id=1
```
You will see that XGBoost starts the federated training.
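For reference, the `--node-id` flag is typically how each client selects its own partition of the data. Below is a minimal sketch of how `client.py` can wire this up; `load_partition` and `XgbClient` are hypothetical stand-ins for the example's own helpers:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--node-id",
    type=int,
    required=True,
    help="Index of the data partition this client trains on",
)
args = parser.parse_args()

# Hypothetical helpers standing in for the example's own code:
# train_data = load_partition(args.node_id)
# fl.client.start_client(
#     server_address="127.0.0.1:8080",
#     client=XgbClient(train_data),
# )
```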
Alternatively, you can use `run.sh` to run the same experiment in a single terminal as follows:
```shell
poetry run ./run.sh
```