# Adaptive EMG decomposition in dynamic conditions based on online learning metrics with tunable hyperparameters
This repository contains functions to adaptively decompose electromyography (EMG) into motor unit firings during dynamic conditions in real time (~22 ms per 100 ms batch, CPU only, with loss calculation), based on online learning metrics with tunable hyperparameters, as described in Mendez Guerra et al., JNE, 2024. The code is implemented in Python using `pytorch`.
- Installation
- Tutorial
- Package structure
- Command line interface
- Data loaders
- Optimization
- Contributing
- License
- Citation
- Contact
## Installation

To set up the project locally, do the following:
1. Clone the repository:
   ```sh
   git clone https://github.com/imendezguerra/adapt_decomp.git
   ```
2. Navigate to the project directory:
   ```sh
   cd adapt_decomp
   ```
3. Create the conda environment from the `environment.yml` file:
   ```sh
   conda env create -f environment.yml
   ```
4. Activate the environment:
   ```sh
   conda activate adapt_decomp
   ```
5. Install the adaptation package:
   ```sh
   pip install -e .
   ```
Please note that `environment.yml` only installs the CPU version of `pytorch`. To enable GPU acceleration, CUDA will need to be installed manually (check command here).
The code has been tested on macOS, Windows, and Linux.
## Tutorial

To learn how to use the adaptive decomposition, go to `adaptive_emg_decomp_dyn_example` for a step-by-step tutorial. Please note that the model requires a precalibrated decomposition model, including the extension factor, whitening, separation vectors, spike and baseline centroids, and the EMG during calibration, as well as the resulting IPTs and spikes. The model uses the last three variables to compute the whitening and separation vector losses, based on the median squared error, between the adaptive and calibration conditions, of the Kullback–Leibler divergence of the whitened covariance and of the kurtosis of the sources. To execute the code, download this example contraction and save it in the repository directory under the `data/example` folder.
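The two loss ingredients mentioned above can be illustrated in isolation. The following is a minimal sketch, not the repository's implementation; the function names and the exact KL and kurtosis formulations are assumptions for illustration only:

```python
import numpy as np

def whitening_divergence(cov, eps=1e-8):
    """KL divergence between N(0, cov) and N(0, I) (illustrative formulation).

    For perfectly whitened data the covariance is the identity and the
    divergence is zero, so drift in this value signals that the whitening
    no longer matches the incoming EMG.
    """
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov + eps * np.eye(d))
    return 0.5 * (np.trace(cov) - logdet - d)

def source_kurtosis(sources):
    """Excess kurtosis of each source (rows = sources, columns = samples).

    Motor unit spike trains are sparse and strongly super-Gaussian, so a
    drop in kurtosis suggests the separation vectors are degrading.
    """
    s = sources - sources.mean(axis=1, keepdims=True)
    m2 = (s ** 2).mean(axis=1)
    m4 = (s ** 4).mean(axis=1)
    return m4 / (m2 ** 2) - 3.0
```

Tracking the squared deviation of these quantities from their calibration values, and taking the median, gives losses in the spirit of the ones described above.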
## Package structure

The package is composed of the following modules:

- `loaders.py`: Functions to load EMG and decomposition files.
- `data_structures.py`: Classes for the input EMG and initial decomposition parameters.
- `adaptation.py`: Main class with the adaptive decomposition.
- `config.py`: Dataclass with the parameters for the decomposition adaptation.
- `preprocesing.py`: Functions for EMG preprocessing, such as filtering.
- `plots.py`: Functions to plot the results.
- `utils.py`: Functions to extract motor unit properties and compute the rate of agreement.
- `io.py`: Functions to save and load the adaptive decomposition outputs.
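As an illustration of the rate-of-agreement metric computed in `utils.py`, here is a minimal sketch; the function name, signature, and matching tolerance are assumptions, not the package's API:

```python
import numpy as np

def rate_of_agreement(spikes_a, spikes_b, tol=5):
    """Illustrative rate of agreement between two spike trains.

    Spikes are given as sample indices; two spikes match if they fall
    within `tol` samples of each other, and each spike matches at most once.
    """
    spikes_b = np.asarray(spikes_b)
    used = np.zeros(len(spikes_b), dtype=bool)
    matched = 0
    for s in np.asarray(spikes_a):
        if len(spikes_b) == 0:
            break
        d = np.abs(spikes_b - s)
        d[used] = tol + 1          # already-matched spikes cannot match again
        j = np.argmin(d)
        if d[j] <= tol:
            matched += 1
            used[j] = True
    missed = len(spikes_a) - matched      # in A but not matched in B
    false_pos = len(spikes_b) - matched   # in B but not matched in A
    return matched / (matched + missed + false_pos)
```

Matched spikes within the tolerance count as agreement; unmatched spikes in either train count against it.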
## Command line interface

The code is integrated with Weights and Biases to track and visualise the results. To enable this, run the code via the command line using:

```sh
python scripts/run.py --data_config configs/data_configs/data_example.yml --model_config configs/model_configs/default_neuromotion.yml --wandb_project_name adapt_decomp
```
This command executes the decomposition adaptation for a given input described in `data_config` with the parameters specified in `model_config`. Take a look at the configs to see examples of both structures. The `data_config` is simply a wrapper `.yml` file with the paths to the calibrated decomposition model (`path_decomp`), the input EMG for the decomposition adaptation (`path_emg`), the ground truth data if available (`path_gt`), a flag to activate/deactivate data preprocessing (`preprocess`), and the corresponding data loader (`loader`).
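For example, a `data_config` could look like the following; the file names and loader name are illustrative placeholders, not the values shipped with the repository:

```yaml
# Illustrative data_config; adjust paths and loader to your data
path_decomp: data/example/decomp_model.pkl
path_emg: data/example/emg.npz
path_gt: data/example/ground_truth.npz   # optional
preprocess: true
loader: example
```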
## Data loaders

To use the command line interface with your own data, please implement the corresponding data loader in `loaders.py` and add it to the `load_data` wrapper function. Also, create the corresponding `.yml` file and store it under `data_configs`.
## Optimization

The default hyperparameters are the optimal ones for the dataset presented in Mendez Guerra et al., JNE, 2024, for simulations (neuromotion) and for experimental wrist and forearm data. However, if those are not appropriate for a given dataset, the integration with Weights and Biases enables hyperparameter sweeps based on random sampling, grid search, and Bayesian optimisation (the default here). To access this functionality, use the command line to execute the code as:

```sh
python scripts/run.py --data_config configs/data_configs/data_example.yml --model_config configs/model_configs/default_neuromotion.yml --sweep_config configs/model_configs/sweep_loss.yml --sweep_counts 30 --wandb_project_name adapt_decomp
```

This will perform a hyperparameter sweep with 30 iterations (configurable via `--sweep_counts`) for the hyperparameters, ranges, criteria, and method specified in `sweep_loss.yml`.
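A sweep configuration follows the standard Weights and Biases sweep format; the metric name, hyperparameter names, and ranges below are placeholders, not the ones shipped in `sweep_loss.yml`:

```yaml
method: bayes            # random, grid, or bayes
metric:
  name: total_loss       # placeholder metric name
  goal: minimize
parameters:
  learning_rate:         # placeholder hyperparameter
    distribution: log_uniform_values
    min: 0.00001
    max: 0.01
```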
## Contributing

We welcome contributions! Here's how you can contribute:

1. Fork the repository.
2. Create a feature branch (`git checkout -b feature/newfeature`).
3. Commit your changes (`git commit -m 'Add some newfeature'`).
4. Push to the branch (`git push origin feature/newfeature`).
5. Open a pull request.
## License

This repository is licensed under the MIT License.
## Citation

If you use this code in your research, please cite this repository:

```bibtex
@article{Mendez_Guerra_2024,
  author={Mendez Guerra, Irene and Barsakcioglu, Deren Y. and Farina, Dario},
  title={Adaptive EMG decomposition in dynamic conditions based on online learning metrics with tunable hyperparameters},
  journal={Journal of Neural Engineering},
  publisher={IOP Publishing},
  volume={21},
  number={4},
  ISSN={1741-2552},
  DOI={10.1088/1741-2552/ad5ebf},
  url={https://dx.doi.org/10.1088/1741-2552/ad5ebf}
}
```
## Contact

For any questions or inquiries, please contact us at:

Irene Mendez Guerra
irene.mendez17@imperial.ac.uk