CHASe: Client Heterogeneity-Aware Data Selection for Effective Federated Active Learning
- PyTorch
- NumPy
- options.py: Hyperparameter setting for CHASe.
- sampling.py: Samples MNIST, EMNIST, CIFAR-10, CIFAR-100, and Shakespeare in an IID/non-IID manner.
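The exact partitioning logic lives in sampling.py; as a rough sketch (function names here are illustrative, not the repo's), IID sampling shuffles indices evenly across clients, while a common non-IID scheme sorts by label and hands each client a few label shards:

```python
import numpy as np

def iid_partition(num_samples, num_clients, seed=0):
    """Shuffle all sample indices and split them evenly across clients."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(num_samples), num_clients)

def noniid_partition(labels, num_clients, shards_per_client=2, seed=0):
    """Sort indices by label, cut them into shards, and give each client
    a few shards, so each client only sees a handful of classes."""
    rng = np.random.default_rng(seed)
    shards = np.array_split(np.argsort(labels), num_clients * shards_per_client)
    order = rng.permutation(len(shards))
    return [
        np.concatenate([shards[j] for j in
                        order[i * shards_per_client:(i + 1) * shards_per_client]])
        for i in range(num_clients)
    ]
```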
- utils.py:
  - Construction of the labeled, unlabeled, and global test sets for each sampled dataset;
  - Server-side aggregation & definition of the log details.
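The server-side aggregation in utils.py combines the clients' locally trained models into a global model; a minimal FedAvg-style sketch (the function name and weighting are assumptions, not necessarily CHASe's exact scheme) looks like:

```python
import copy
import torch

def fedavg(client_states, weights=None):
    """Weighted average of client model state_dicts, e.g. weighted
    by each client's local data size (uniform by default)."""
    if weights is None:
        weights = [1.0 / len(client_states)] * len(client_states)
    global_state = copy.deepcopy(client_states[0])
    for key in global_state:
        global_state[key] = sum(
            w * sd[key].float() for w, sd in zip(weights, client_states)
        )
    return global_state
```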
- model.py: Models for the MNIST, EMNIST, CIFAR-10, CIFAR-100, and Shakespeare datasets.
- localtraining.py:
  - Clients' local training;
  - Quantification of Epistemic Variation (EV);
  - Calibration of the decision boundary;
  - Inference with the local & global models.
- slected_strategy.py: Definition of the EV-based sampling strategy.
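For intuition on how EV quantification (localtraining.py) feeds EV-based sampling (slected_strategy.py), here is a hedged sketch: it treats EV as the variance of a sample's predicted class probabilities across local-training checkpoints and queries the top-scoring samples. The function names and this exact formulation are illustrative; the paper's definition is authoritative.

```python
import torch

def epistemic_variation(prob_snapshots):
    """prob_snapshots: (T, N, C) class probabilities for N unlabeled
    samples at T local-training checkpoints. Score each sample by the
    variance of its predictions across checkpoints, averaged over
    classes; samples whose predictions fluctuate most score highest."""
    return prob_snapshots.var(dim=0, unbiased=False).mean(dim=1)

def select_by_ev(ev_scores, budget):
    """Query the `budget` unlabeled samples with the highest EV."""
    return torch.topk(ev_scores, budget).indices.tolist()
```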
- main.py: Core code for CHASe; the logic and interactions across the whole pipeline.
- Download the MNIST, EMNIST, CIFAR-10, CIFAR-100, and Shakespeare datasets manually, or let the program download them by default;
- Set parameters in options.py;
- Execute main.py to run CHASe.