
Full-scale Representation Guided Network for Retinal Vessel Segmentation

This is the official repository of the paper "Full-scale Representation Guided Network for Retinal Vessel Segmentation".


Environment

  • OS: Ubuntu 16.04
  • GPU: Tesla V100 32GB
  • GPU Driver version: 460.106.00
  • CUDA: 11.2
  • PyTorch: 1.8.1
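
A quick way to confirm that your local setup roughly matches the versions above (a minimal sketch; it only assumes PyTorch is already installed):

```python
import torch

print("PyTorch:", torch.__version__)          # expected: 1.8.1
print("CUDA (build):", torch.version.cuda)    # expected: 11.2 or a compatible build
print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```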

✅ Experimental Results

| Dataset   | mIoU   | F1 score | Acc    | AUC    | Sen    | MCC    |
|-----------|--------|----------|--------|--------|--------|--------|
| DRIVE     | 84.068 | 83.229   | 97.042 | 98.235 | 84.207 | 81.731 |
| STARE     | 86.118 | 85.100   | 97.746 | 98.967 | 86.608 | 83.958 |
| CHASE_DB1 | 82.680 | 81.019   | 97.515 | 99.378 | 85.995 | 79.889 |
| HRF       | 83.088 | 81.567   | 97.106 | 98.744 | 83.616 | 80.121 |

✅ Pretrained models for each dataset

The pre-trained model for each dataset can be found in the repository's Releases.

🧻 Dataset Preparation

Edit 'train_x_path...' and the related path entries in "configs/train.yml".
The input images and labels must be sorted by name so they pair up one-to-one; otherwise, mismatched pairs will be used during training.

The train/validation sets can be downloaded from the public links or from the repository's Releases. A small pairing check is sketched below.
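
A minimal sketch for verifying that images and labels line up when sorted by name (the directory paths below are placeholders, not values taken from this repository's config):

```python
import os

# Placeholder directories: substitute the paths you set in "configs/train.yml".
image_dir = "data/train/images"
label_dir = "data/train/labels"

images = sorted(os.listdir(image_dir))
labels = sorted(os.listdir(label_dir))

assert len(images) == len(labels), "Different number of images and labels."
for img, lbl in zip(images, labels):
    # The exact naming convention depends on the dataset (e.g. label suffixes),
    # so adapt this comparison; mismatched pairs would corrupt training.
    if os.path.splitext(img)[0] != os.path.splitext(lbl)[0]:
        print(f"Possible mismatch: {img} <-> {lbl}")
```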


🚄 Train

If you have installed 'WandB', log in with your account. You can log in from the command line or by calling 'wandb.login()' inside "main.py".
If you do not want to use WandB, set 'wandb: false' in "configs/train.yml".
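
For example, logging in programmatically with the official wandb Python API:

```python
import wandb

# Prompts for an API key on first use, or reads WANDB_API_KEY from the environment.
wandb.login()
```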

To train, edit "configs/train.yml" and run the command below:

bash bash_train.sh

🛴 Inference

To run inference, edit "configs/inference.yml" and run the command below.
Set the path to your trained model via 'model_path' in "configs/inference.yml".

bash bash_inference.sh
  • If you use a pretrained model, the results should be close to those in the table above.
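
If you want to verify a downloaded checkpoint before pointing 'model_path' at it, you can inspect it with plain PyTorch. This is only a sketch: the file name and the internal checkpoint layout below are assumptions, not guaranteed by this repository.

```python
import torch

# Placeholder file name: use the checkpoint you downloaded from Releases.
ckpt = torch.load("pretrained_drive.pt", map_location="cpu")

# Checkpoints may store a bare state_dict or wrap it in a dictionary;
# this handles both layouts (an assumption, not guaranteed by the repo).
state = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt
print(list(state.keys())[:5])   # first few parameter tensor names
```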
