Prototype-Guided Lightweight Adapters for Communication-Efficient and Generalisable Federated Learning
Implementation of the paper submitted to MICCAI 2025.
This code requires the following:
torch==2.5.0
torchvision==0.20.0
torchsampler==0.1.2
numpy==1.26.4
pandas==2.2.2
scikit-learn==1.5.2
opencv-python-headless==4.11.0.86
tqdm==2.2.3
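Assuming a pip-based setup (the repository may also ship a requirements file), the packages above can be installed in one step, for example:

pip install torch==2.5.0 torchvision==0.20.0 torchsampler==0.1.2 numpy==1.26.4 \
    pandas==2.2.2 scikit-learn==1.5.2 opencv-python-headless==4.11.0.86 tqdm==2.2.3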
The code supports the EyePACS dataset, which can be accessed upon request at https://www.eyepacs.com/.
Run the command below to train the proposed algorithm:
python exps/federated_main.py \
--model FedAdapterPrototype \
--dataset fundus \
--num_classes 2 \
--num_users 4 \
--rounds 50 \
--local_bs 16 \
--num_channels 3 \
--lr 0.0001 \
--optimizer adam \
--train_ep 1 \
--use_sampler 1
You can change the default values of other parameters to simulate different conditions; an example is sketched below.
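For instance, a longer run with a larger local batch size and two local epochs per round could be launched as follows (only flags already listed above are used, and the values are illustrative; omitted flags are assumed to fall back to their defaults in options.py):

python exps/federated_main.py \
    --model FedAdapterPrototype \
    --dataset fundus \
    --rounds 100 \
    --local_bs 32 \
    --train_ep 2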
The full set of parameters passed to the experiment, together with their default values, is given in options.py.
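As a rough illustration only (not the actual contents of options.py), such defaults are typically exposed through argparse. A minimal sketch, assuming the flag names from the training command above, might look like this:

import argparse

# Hypothetical sketch of options.py; the real file in this repository may differ.
def args_parser():
    parser = argparse.ArgumentParser()
    # federated setup
    parser.add_argument('--model', type=str, default='FedAdapterPrototype')
    parser.add_argument('--dataset', type=str, default='fundus')
    parser.add_argument('--num_classes', type=int, default=2)
    parser.add_argument('--num_users', type=int, default=4, help='number of clients')
    parser.add_argument('--rounds', type=int, default=50, help='communication rounds')
    # local training
    parser.add_argument('--local_bs', type=int, default=16, help='local batch size')
    parser.add_argument('--num_channels', type=int, default=3)
    parser.add_argument('--lr', type=float, default=1e-4)
    parser.add_argument('--optimizer', type=str, default='adam')
    parser.add_argument('--train_ep', type=int, default=1, help='local epochs per round')
    parser.add_argument('--use_sampler', type=int, default=1, help='use an imbalanced-data sampler')
    return parser.parse_args()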
Parts of this code are adapted from yuetan031/FedProto by @yuetan031.
If you find this project helpful, please consider citing this work:
@misc{mensah2025prototype,
author={Mensah, Samuel Ofosu and Djoumessi, Kerol and Berens, Philipp},
title={Prototype-Guided and Lightweight Adapters for Inherent Interpretation and Generalisation in Federated Learning},
year={2025},
eprint={2507.05852},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2507.05852},
}