DeBCR

Deblurring of light microscopy images using a multi-resolution neural network

DeBCR is a compact multi-resolution deep learning model for light microscopy image restoration (such as denoising and deconvolution).

This is an open-source project licensed under the MIT license.

You can use DeBCR via:

  • a Jupyter Notebook/Lab session, as the Python library debcr - proceed with this repository;
  • the Napari viewer, as the add-on plugin napari-debcr - proceed with the napari-debcr repository.

For any installation or usage questions, please write to the Issue Tracker.

Contents

  • Installation - installation options, dependencies and instructions
  • Usage - usage scenarios and respective tutorials
  • Samples - link to the example data and respective trained model weights
  • About - key points of the network structure and results demo

Installation

There are two installation versions for DeBCR:

  • a GPU version (recommended) - allows full DeBCR functionality, including fast model training;
  • a CPU version (limited) - suitable only if you do not plan to train models, since training on CPUs can be very slow.

For the GPU version you need access to a GPU device with:

  • preferably at least 16 GB of VRAM;
  • a CUDA Toolkit version compatible with your device (recommended: CUDA-11.7);
  • a cuDNN version compatible with the CUDA Toolkit above (recommended: v8.4.0 for CUDA-11.x from the cuDNN archive).

For GPU dependency installation and configuration, please check our tips on the GPU-advice page.

Note
Proper CUDA and cuDNN installation and configuration can be tricky, especially if you work on an HPC cluster. Thus, contact your local system administrator first before trying to install them yourself.

Create a package environment (optional)

For a clean installation, we also recommend using a Python package environment manager.

We will use micromamba as an example package manager. Create an environment for DeBCR using

micromamba env create -n debcr python=3.9

and activate it for further installation or usage by

micromamba activate debcr

Install DeBCR

Clone this repository to the desired directory by

cd /path/for/clone
git clone https://github.com/DeBCR/DeBCR

Next, enter the cloned DeBCR directory by

cd ./DeBCR

and install one of the DeBCR versions as

| Target hardware | Backend | Command |
|---|---|---|
| GPU (recommended) | TensorFlow-GPU-2.11 | `pip install -e .[tf-gpu]` |
| CPU (limited) | TensorFlow-CPU-2.11 | `pip install -e .[tf-cpu]` |

For a GPU version installation, it is recommended to check if your GPU device is recognised by TensorFlow using

python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

which for a single GPU device should produce output similar to the following:

[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]

If your GPU device list is empty, please check our tips on GPU-advice page.
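
Optionally, you can also check which CUDA and cuDNN versions your TensorFlow build expects. Below is a minimal sketch using TensorFlow's tf.sysconfig.get_build_info(); the exact keys in the returned dictionary can differ between builds (CPU builds omit the CUDA entries), hence the defaults:

```python
# Optional check: CUDA/cuDNN versions this TensorFlow build was compiled against
import tensorflow as tf

build = tf.sysconfig.get_build_info()
print("CUDA build:   ", build.get("is_cuda_build", False))
print("CUDA version: ", build.get("cuda_version", "n/a"))
print("cuDNN version:", build.get("cudnn_version", "n/a"))
```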

Install Jupyter

Finally, to use debcr as a Python library (API) interactively, either as the CPU version (for prediction only) or as the GPU version (for both training and prediction), you need to install Jupyter Notebook/Lab.

For example, install Jupyter Lab to your debcr environment by

pip install jupyterlab

Usage

To showcase how to use debcr as a Python library (API) interactively in Jupyter Notebook/Lab, we prepared several usage examples (available in the cloned repository at DeBCR/notebooks):

| Notebook | Purpose | Hardware | Inputs |
|---|---|---|---|
| debcr_predict.ipynb | deblurred prediction | CPU/GPU | pre-processed input data (NPZ/NPY), trained DeBCR model |
| debcr_train.ipynb | model training | GPU | training/validation data (NPZ/NPY) |
| debcr_preproc.ipynb | raw data pre-processing | CPU | raw data (TIF(F), JP(E)G, PNG) |

To use the notebooks, activate the respective environment (if any) and start a Jupyter session in the directory containing the notebooks:

micromamba activate debcr
jupyter-lab

The tutorial notebooks use the following samples:

  • sample data - examples of pre-processed training/validation/testing data;
  • sample weights - examples of trained model weights corresponding to the sample data.

Samples

To evaluate DeBCR on various image restoration tasks, several previously published datasets were assembled, pre-processed and publicly deposited as NumPy (.npz) arrays in three essential subsets (train, validation and test). The corresponding DeBCR model weights, trained on the respective train subsets, are provided along with the data.

The datasets target image restoration tasks such as denoising and super-resolution deconvolution.

Access data and weights on Zenodo: 10.5281/zenodo.12626121.
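
Once downloaded, the sample arrays can be inspected with plain NumPy before being used in the notebooks. Below is a minimal sketch; the file name train_data.npz is only a placeholder, so adjust it to the actual archive downloaded from Zenodo:

```python
# Minimal sketch: list the arrays stored in a sample NPZ archive
# (the file name is a placeholder; use the actual file downloaded from Zenodo)
import numpy as np

with np.load("train_data.npz") as data:
    for key in data.files:              # names of the stored arrays
        arr = data[key]
        print(f"{key}: shape={arr.shape}, dtype={arr.dtype}")
```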

About

DeBCR approximates the inversion of the imaging process with a deep convolutional neural network (DCNN), based on the compact BCR representation (Beylkin G. et al., Comm. Pure Appl. Math., 1991) for convolutions and its DCNN implementation as proposed in BCR-Net (Fan Y. et al., J. Comput. Phys., 2019):

(Figure: DeBCR network structure)

In contrast to the traditional single-stage residual BCR learning process, DeBCR integrates feature maps from multiple resolution levels:

(Figure: DeBCR multi-resolution scheme)
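
For intuition only, the sketch below shows a generic way to merge feature maps from two resolution levels in Keras: the coarse branch is upsampled back to full resolution and concatenated with the fine branch. This is not the actual DeBCR/m-rBCR implementation, and all layer sizes are arbitrary:

```python
# Illustrative sketch (not the DeBCR architecture): combining feature maps
# from a fine (full-resolution) and a coarse (downsampled) level.
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input(shape=(128, 128, 1))
fine = layers.Conv2D(16, 3, padding="same", activation="relu")(inp)                # full resolution
coarse = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(fine)  # half resolution
coarse_up = layers.UpSampling2D(size=2)(coarse)                                     # back to full resolution
merged = layers.Concatenate()([fine, coarse_up])                                    # multi-resolution features
out = layers.Conv2D(1, 3, padding="same")(merged)

tf.keras.Model(inp, out).summary()
```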

An example of DeBCR performance on low/high-exposure confocal data of a Tribolium castaneum sample from the CARE work (Weigert et al., Nat. Methods, 2018) is shown below:

(Figure: DeBCR restoration of light microscopy (LM) data)

For more details on the multi-stage residual BCR (m-rBCR) architecture used in the DeBCR toolkit, see:

Li, R., Kudryashev, M., Yakimovich, A. Solving the Inverse Problem of Microscopy Deconvolution with a Residual Beylkin-Coifman-Rokhlin Neural Network. ECCV 2024, Lecture Notes in Computer Science, vol 15133. Springer, Cham. https://doi.org/10.1007/978-3-031-73226-3_22
