
GAInS: Gradient Anomaly-aware Biomedical Instance Segmentation [IEEE BIBM 2024]

Introduction

Abstract

Instance segmentation plays a vital role in the morphological quantification of biomedical entities such as tissues and cells, enabling precise identification and delineation of different structures. Current methods often address the challenges of touching, overlapping or crossing instances through individual modeling, while neglecting the intrinsic interrelation between these conditions. In this work, we propose a Gradient Anomaly-aware Biomedical Instance Segmentation approach GAInS, which leverages instance gradient information to perceive local gradient anomaly regions, thus modeling the spatial relationship between instances and refining local region segmentation. Specifically, GAInS is firstly built on a Gradient Anomaly Mapping Module (GAMM), which encodes the radial fields of instances through window sliding to obtain instance gradient anomaly maps. To efficiently refine boundaries and regions with gradient anomaly attention, we propose an Adaptive Local Refinement Module (ALRM) with a gradient anomaly-aware loss function. Extensive comparisons and ablation experiments in three biomedical scenarios demonstrate that our proposed GAInS outperforms other state-of-the-art (SOTA) instance segmentation methods.

Overview of the proposed GAInS.

Qualitative Results

Qualitative results of our GAInS and other SOTA methods.

Quantitative Results

| Methods | ISBI2014 mAP↑ | ISBI2014 AJI↑ | UOUC mAP↑ | UOUC AJI↑ | Kaggle2018 mAP↑ | Kaggle2018 AJI↑ |
|---|---|---|---|---|---|---|
| Mask R-CNN (R50) | 59.59 | 76.21 | 72.33 | 81.29 | 38.75 | 54.79 |
| Mask R-CNN (R101) | 62.97 | 75.15 | 73.71 | 80.87 | 37.13 | 52.80 |
| Mask Scoring R-CNN | 60.03 | 71.78 | 70.31 | 83.21 | 37.32 | 52.81 |
| PISA | 60.84 | 74.56 | 73.24 | 81.73 | 38.09 | 51.73 |
| Cascade R-CNN | 63.40 | 52.21 | 73.66 | 81.09 | 40.33 | 54.09 |
| CondIns | 49.46 | 59.79 | 50.57 | 66.88 | 38.41 | 46.50 |
| HTC | 62.57 | 35.95 | 70.32 | 83.69 | 37.73 | 53.48 |
| PointRend | 62.07 | 69.45 | 71.14 | 31.48 | 37.95 | 52.63 |
| Occlusion R-CNN | 62.35 | 78.64 | 67.30 | 83.52 | 35.85 | 51.81 |
| DoNet | 63.43 | 79.88 | 70.97 | 82.87 | 37.83 | 53.96 |
| FastInst | 61.28 | 71.66 | 72.74 | 80.12 | 37.02 | 50.93 |
| GAInS | 63.71 | 76.82 | 73.94 | 83.59 | 40.63 | 53.62 |
| GAInS (R101) | 61.39 | 77.66 | 74.48 | 85.87 | 39.79 | 55.30 |

Dataset

For convenience, the three datasets are collected in the Google Drive folders listed below. They can be downloaded directly from Google Drive or from the official websites.

Note:

  1. For ISBI2014, we follow the official division into training, validation and testing sets. For UOUC and Kaggle2018, the training, validation, and testing sets are split in a 6:2:2 ratio with random seeds.
  2. We use the 'cluster_nuclei' subset of Kaggle2018 in the paper. If needed, please download the whole Kaggle2018 dataset from its official website.

Installation

For detectron2 installation instructions, please refer to the Detectron2 Installation Guide. After installing Detectron2, data preprocessing requires additional libraries such as scipy, matplotlib, and shapely. We recommend installing them with conda or pip:

pip install scipy
pip install matplotlib
pip install shapely

QuickStart

Data Preprocessing

Data preprocessing generates the Gradient Anomaly Maps. For datasets like ISBI2014, where each image contains a small number of instances and the main issue is overlapping, we recommend using isbi_process.py. For nuclei datasets, where each image contains a large number of instances and the main issue is touching, we recommend using nuclei_process.py. For chromosomes, which face crossing issues, we recommend using chrom_process.py. They are treated differently mainly because of running time and a few special cases (as mentioned in the paper).

To process a dataset, set the path of your own dataset at lines 29-30 of isbi_process.py, nuclei_process.py or chrom_process.py. The annotation file must be in COCO json format. To set up the hyperparameters, set window_size, GA_factor and overlap_HL to proper values at lines 32-34. Empirically, window_size can be set to a value of 1/20 to 1/10 of the size of an instance in the images. GA_factor is the attention rate placed on CTO regions, empirically within the interval [0.5, 2]. For overlap_HL, the highlight of overlapping regions, a small value such as 0.1 or 0.5 is enough.

Here we provide our parameters for reference.

  • ISBI2014: window_size = 5, GA_factor = 0.5, overlap_HL = 0.1
  • UOUC: window_size = 5, GA_factor = 0.5, overlap_HL = 0.3
  • Kaggle2018: window_size = 8, GA_factor = 0.8, overlap_HL = 0.1

Additionally, set target_id at line 40. It is the category ID of the target instances in your dataset's json file. For example, the ID of the cells in ISBI2014 is 0.
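Taken together, the top-of-script settings might look like the following sketch (the paths are placeholders and the exact layout of isbi_process.py may differ; variable names follow the README):

```python
# Hypothetical sketch of the settings near the top of isbi_process.py.
image_dir = "/path/to/ISBI2014/images"      # line 29: image folder (placeholder path)
json_path = "/path/to/ISBI2014/train.json"  # line 30: COCO-format annotation file

window_size = 5    # lines 32-34: roughly 1/20 to 1/10 of a typical instance's size
GA_factor = 0.5    # attention rate on CTO regions, empirically in [0.5, 2]
overlap_HL = 0.1   # highlight for overlapping regions; a small value suffices

target_id = 0      # line 40: category ID of the target instances in the json
```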

After setting up all the parameters, one can run the scripts. For example,

python isbi_process.py
python nuclei_process.py
python chrom_process.py

Data processing only requires CPU devices. The Gradient Anomaly Maps will be saved in the dataset folder. We also provide visualization functions in these three scripts; visualize the Gradient Anomaly Maps if needed.
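As a minimal sketch of such a visualization (assuming a map can be loaded as a 2D array; the random array below is a stand-in for a real saved map):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so this also runs without a display
import matplotlib.pyplot as plt

# Dummy 2D array standing in for a real Gradient Anomaly Map;
# on actual data, load the saved map instead (e.g. with np.load).
ga_map = np.random.rand(64, 64)

plt.imshow(ga_map, cmap="viridis")
plt.colorbar(label="gradient anomaly")
plt.title("Gradient Anomaly Map")
plt.savefig("ga_map_preview.png")
```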

Train on Your Own Datasets

This should be done after the preprocessing step.

To train on your own data, first register your datasets by adding their paths to detectron2/detectron2/data/datasets/builtin.py at line 57. Second, fill in the meta information of your datasets in detectron2/detectron2/data/datasets/builtin_meta.py at line 30.
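For reference, detectron2's standard COCO registration helper looks like the sketch below (the dataset names and paths are placeholders; builtin.py in this repo wires the registration up in its own way, so adapt accordingly):

```python
from detectron2.data.datasets import register_coco_instances

# Placeholder names and paths; adapt to your own dataset.
register_coco_instances("my_dataset_train", {},
                        "/path/to/train.json", "/path/to/train_images")
register_coco_instances("my_dataset_val", {},
                        "/path/to/val.json", "/path/to/val_images")
```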

The config file is at detectron2/tools/my_config.yaml. Fields you may need to change:

  • Line 14, 16: Name of your datasets.
  • Line 220: Number of classes.
  • Line 224: Path to your Gradient Anomaly Maps.
  • Line 295: Path to pre-trained model weight.
  • Line 296: Output directory.
  • Line 301: Initial learning rate.
  • Line 304: Checkpoint period.
  • Line 311: Batch size.
  • Line 313: Maximum number of iterations.
  • Line 344: Test stage evaluation period.

The training step requires GPUs. Use detectron2/tools/train_net.py for training and testing.

Training is done by

python train_net.py --config-file my_config.yaml

Evaluation is done by

python train_net.py --config-file my_config.yaml --eval-only MODEL.WEIGHTS /path/to/checkpoint_file

Use detectron2/demo/demo.py to visualize results.

python demo.py --config-file /path/to/config-file --input /path/to/image --opts MODEL.WEIGHTS /path/to/checkpoint_file

Our Models

We provide our models, i.e., GAInS with R50 and R101 backbones on the three datasets (six models in total), on OneDrive. One can download them as pre-trained models, or evaluate them directly if needed.

Note

The evaluation code relies on a modified pycoco package that provides a new function iouIntUni, which computes the intersection over union between masks and returns the IoU, intersection, and union together. For installation of the modified pycoco package, please refer to https://github.com/Amandaynzhou/MMT-PSM.
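In spirit, the new function computes something like the following (a numpy sketch for dense binary masks; the actual pycoco modification operates on its own mask encoding, and the function name here is illustrative):

```python
import numpy as np

def iou_int_uni(mask_a: np.ndarray, mask_b: np.ndarray):
    """Return (iou, intersection, union) for two binary masks."""
    inter = int(np.logical_and(mask_a, mask_b).sum())
    union = int(np.logical_or(mask_a, mask_b).sum())
    iou = inter / union if union > 0 else 0.0
    return iou, inter, union
```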

License

Detectron2 is released under the Apache 2.0 license.

Citation

Acknowledgement

The code of GAInS is built on detectron2 and ORCNN. Thanks to these third-party libraries.

Questions

Should you have any questions, please contact Runsheng Liu (rliuar@connect.ust.hk) or Hao Jiang (hjiangaz@cse.ust.hk).
