Commit b059bb3

docs: minor typo fixes
1 parent 8023bf1 commit b059bb3

File tree

2 files changed: +34 -13 lines


examples/pannuke_nuclei_segmentation_cellpose.ipynb

Lines changed: 17 additions & 2 deletions
@@ -199,7 +199,7 @@
199199
"cell_type": "markdown",
200200
"metadata": {},
201201
"source": [
202-
"Next, we will define a a simple `train`-loop wrapper function to train the model. The training logic in the wrapper is built with the [`accelerate`](https://huggingface.co/docs/accelerate/index). It is a convenient way to build training scripts in different types of computing environments. Check out https://huggingface.co/docs/accelerate/index for more.\n",
202+
"Next, we will define a simple `train`-loop wrapper function to train the model. The training logic in the wrapper is built with the [`accelerate`](https://huggingface.co/docs/accelerate/index) library, which is a convenient way to build training scripts for different types of computing environments. Check out https://huggingface.co/docs/accelerate/index for more.\n",
203203
"\n",
204204
"In the training and validation loops we will have to take into account that the `PannukeDataModule`'s `DataLoader` returns the inputs and targets in a dictionary like this:\n",
205205
"```\n",
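The dict itself is cut off in this hunk, but the unpacking step it implies can be sketched in plain Python. This is a hedged illustration: the keys `image`, `cellpose`, and `type` are assumptions inferred from the outputs referenced later in the notebook, and the nested lists stand in for tensors.

```python
# Hedged sketch: the PannukeDataModule's DataLoader yields a dict, not an
# (inputs, targets) tuple. The keys below ("image", "cellpose", "type") are
# assumptions inferred from the outputs referenced later in this notebook.
batch = {
    "image": [[0.1, 0.2], [0.3, 0.4]],     # input image (illustrative nested lists)
    "cellpose": [[0.0, 1.0], [1.0, 0.0]],  # regression targets for the flow maps
    "type": [[0, 1], [2, 0]],              # semantic cell-type targets
}

def split_batch(batch):
    """Unpack inputs and targets from the dict before the forward pass."""
    inputs = batch["image"]
    targets = {k: v for k, v in batch.items() if k != "image"}
    return inputs, targets

inputs, targets = split_batch(batch)
print(sorted(targets))  # ['cellpose', 'type']
```

In the train/val steps of the wrapper, the loss can then be computed per target key rather than against a single tensor.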
@@ -808,7 +808,7 @@
808808
"source": [
809809
"# Model outputs\n",
810810
"\n",
811-
"Next we will visualize what kind of outputs the model is able to produce after 5 epochs of training. \n",
811+
"Next we will visualize what kind of outputs the model is able to produce after 10 epochs of training. \n",
812812
"\n",
813813
"In particular, we will look at the `cellpose` and `type` maps.\n",
814814
"\n",
@@ -903,6 +903,21 @@
903903
"ax[8].imshow(pred3[\"cellpose\"].squeeze().detach().numpy()[0])"
904904
]
905905
},
906+
{
907+
"cell_type": "markdown",
908+
"metadata": {},
909+
"source": [
910+
"Typically, encoder-decoder based nuclei segmentation model outputs require post-processing. The main task of the post-processing is to separate clumped nuclei, which is a well-known problem in nuclei segmentation. With `cellseg_models.pytorch`, inference and post-processing can be executed with specific `Inferer` classes found in the `cellseg_models_pytorch.inference` module. \n",
911+
"\n",
912+
"Since the Pannuke dataset has only 256x256px images, we can use the `ResizeInferer` to run the inference and post-processing (without actually resizing the images). The `Inferer`s take in an input directory and a set of arguments, of which `instance_postproc` is the most important since it sets the post-processing method to be used. Here, naturally, we will use `cellpose` post-processing since we're running inference for a Cellpose model.\n",
913+
"\n",
914+
"Other important params include: \n",
915+
"- `out_activations` - Sets the output activation functions for each of the model outputs\n",
916+
"- `out_boundary_weights` - Sets whether to use a weight matrix that down-weights the boundaries of the predictions. This is useful only when inference is run on larger images that are split into overlapping patches (overlapping-patch inference can be done with the `SlidingWindowInferer`).\n",
917+
"- `normalization` - Should be set to the same one as during training.\n",
918+
"- `n_images` - Run inference only for the first 50 images inside the input folder."
919+
]
920+
},
906921
{
907922
"cell_type": "code",
908923
"execution_count": 12,
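The inference arguments described in the markdown cell above can be sketched as a plain kwargs dict. This is a hedged sketch, not the verified API: the parameter names are taken from that cell, while the input path, the activation/weight values, and the exact `ResizeInferer` constructor signature are assumptions.

```python
# Hedged sketch: gather the ResizeInferer arguments discussed in the markdown
# cell above. Parameter names come from that cell; the input path and the
# exact constructor signature are assumptions, not the verified API.
inferer_kwargs = dict(
    input_path="pannuke/fold3/images",  # hypothetical folder of 256x256px images
    out_activations={"type": "softmax", "cellpose": None},    # per-output activations
    out_boundary_weights={"type": False, "cellpose": False},  # no boundary weighting for non-overlapping tiles
    instance_postproc="cellpose",  # post-processing method matches the Cellpose model
    normalization="percentile",    # assumed: same normalization as during training
    n_images=50,                   # only the first 50 images in the folder
)

# Usage (not run here): inferer = ResizeInferer(model, **inferer_kwargs)
#                       inferer.infer()
print(inferer_kwargs["instance_postproc"])  # cellpose
```

For larger slides split into overlapping patches, the same kwargs (with boundary weighting enabled) would instead go to the `SlidingWindowInferer` mentioned above.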

examples/pannuke_nuclei_segmentation_stardist.ipynb

Lines changed: 17 additions & 11 deletions
@@ -41,10 +41,12 @@
4141
],
4242
"source": [
4343
"# version info\n",
44-
"import torch\n",
44+
"from platform import python_version\n",
45+
"\n",
4546
"import lightning\n",
47+
"import torch\n",
48+
"\n",
4649
"import cellseg_models_pytorch\n",
47-
"from platform import python_version\n",
4850
"\n",
4951
"print(\"torch version:\", torch.__version__)\n",
5052
"print(\"lightning version:\", lightning.__version__)\n",
@@ -86,6 +88,7 @@
8688
],
8789
"source": [
8890
"from pathlib import Path\n",
91+
"\n",
8992
"from cellseg_models_pytorch.datamodules import PannukeDataModule\n",
9093
"\n",
9194
"# fold1 and fold2 are used for training, fold3 is used for validation\n",
@@ -144,17 +147,18 @@
144147
}
145148
],
146149
"source": [
147-
"import numpy as np\n",
148150
"import matplotlib.pyplot as plt\n",
151+
"import numpy as np\n",
149152
"from skimage.color import label2rgb\n",
150153
"\n",
151-
"# filehandler contains methods to read and write images and masks\n",
152-
"from cellseg_models_pytorch.utils import FileHandler\n",
153154
"from cellseg_models_pytorch.transforms.functional import (\n",
154-
" gen_stardist_maps,\n",
155155
" gen_dist_maps,\n",
156+
" gen_stardist_maps,\n",
156157
")\n",
157158
"\n",
159+
"# filehandler contains methods to read and write images and masks\n",
160+
"from cellseg_models_pytorch.utils import FileHandler\n",
161+
"\n",
158162
"img_dir = save_dir / \"train\" / \"images\"\n",
159163
"mask_dir = save_dir / \"train\" / \"labels\"\n",
160164
"imgs = sorted(img_dir.glob(\"*\"))\n",
@@ -216,12 +220,13 @@
216220
"metadata": {},
217221
"outputs": [],
218222
"source": [
223+
"from typing import Dict, List, Optional, Tuple\n",
224+
"\n",
225+
"import lightning.pytorch as pl\n",
219226
"import torch\n",
220227
"import torch.nn as nn\n",
221228
"import torch.optim as optim\n",
222229
"import torchmetrics\n",
223-
"import lightning.pytorch as pl\n",
224-
"from typing import List, Tuple, Dict, Optional\n",
225230
"\n",
226231
"\n",
227232
"class SegmentationExperiment(pl.LightningModule):\n",
@@ -392,16 +397,16 @@
392397
"source": [
393398
"import torch.optim as optim\n",
394399
"\n",
395-
"from cellseg_models_pytorch.models import stardist_base_multiclass\n",
396400
"from cellseg_models_pytorch.losses import (\n",
397401
" MAE,\n",
398402
" MSE,\n",
399-
" DiceLoss,\n",
400403
" BCELoss,\n",
401404
" CELoss,\n",
405+
" DiceLoss,\n",
402406
" JointLoss,\n",
403407
" MultiTaskLoss,\n",
404408
")\n",
409+
"from cellseg_models_pytorch.models import stardist_base_multiclass\n",
405410
"\n",
406411
"# seed the experiment for reproducibility\n",
407412
"pl.seed_everything(42)\n",
@@ -888,9 +893,10 @@
888893
}
889894
],
890895
"source": [
896+
"import matplotlib.patches as mpatches\n",
891897
"import numpy as np\n",
898+
"\n",
892899
"from cellseg_models_pytorch.utils import draw_thing_contours\n",
893-
"import matplotlib.patches as mpatches\n",
894900
"\n",
895901
"fig, ax = plt.subplots(5, 2, figsize=(10, 17))\n",
896902
"ax = ax.flatten()\n",

0 commit comments
