clDice - a Novel Topology-Preserving Loss Function for Tubular Structure Segmentation

CVPR 2021

Authors: Suprosanna Shit and Johannes C. Paetzold et al.

@article{shit2020cldice,
  title={clDice - a Topology-Preserving Loss Function for Tubular Structure Segmentation},
  author={Shit, Suprosanna and Paetzold, Johannes C and Sekuboyina, Anjany and Zhylka, Andrey and Ezhov, Ivan and Unger, Alexander and Pluim, Josien PW and Tetteh, Giles and Menze, Bjoern H},
  journal={arXiv preprint arXiv:2003.07311},
  year={2020}
}

Abstract

Accurate segmentation of tubular, network-like structures, such as vessels, neurons, or roads, is relevant to many fields of research. For such structures, the topology is their most important characteristic; particularly preserving connectedness: in the case of vascular networks, missing a connected vessel entirely alters the blood-flow dynamics. We introduce a novel similarity measure termed centerlineDice (short clDice), which is calculated on the intersection of the segmentation masks and their (morphological) skeleta. We theoretically prove that clDice guarantees topology preservation up to homotopy equivalence for binary 2D and 3D segmentation. Extending this, we propose a computationally efficient, differentiable loss function (soft-clDice) for training arbitrary neural segmentation networks. We benchmark the soft-clDice loss on five public datasets, including vessels, roads and neurons (2D and 3D). Training on soft-clDice leads to segmentation with more accurate connectivity information, higher graph similarity, and better volumetric scores.
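
In the paper's notation, with V_P and V_L the predicted and ground-truth masks and S_P, S_L their extracted skeletons, clDice is the harmonic mean of a topology precision and a topology sensitivity:

    \mathrm{Tprec}(S_P, V_L) = \frac{|S_P \cap V_L|}{|S_P|}, \qquad
    \mathrm{Tsens}(S_L, V_P) = \frac{|S_L \cap V_P|}{|S_L|}, \qquad
    \mathrm{clDice}(V_P, V_L) = 2 \cdot \frac{\mathrm{Tprec} \cdot \mathrm{Tsens}}{\mathrm{Tprec} + \mathrm{Tsens}}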

clDice Metric

In our publication we show how clDice can be used as a metric to benchmark segmentation performance for tubular structures. The clDice metric is computed on a "hard" skeleton obtained with skeletonize from the scikit-image library; other, potentially more sophisticated skeletonization techniques could be integrated into the metric as well. You can find a Python implementation in this repository; a minimal sketch follows below.
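
A minimal sketch of the metric, following the paper's definition (a hedged illustration, not the repository's exact code; cl_score and cl_dice_metric are illustrative names):

    import numpy as np
    from skimage.morphology import skeletonize, skeletonize_3d

    def cl_score(v, s):
        """Fraction of the skeleton s that lies inside the mask v."""
        return np.sum(v * s) / np.sum(s)

    def cl_dice_metric(v_p, v_l):
        """clDice between a predicted binary mask v_p and a ground-truth mask v_l."""
        if v_p.ndim == 2:
            tprec = cl_score(v_l, skeletonize(v_p))  # Tprec: skeleton of prediction inside ground truth
            tsens = cl_score(v_p, skeletonize(v_l))  # Tsens: skeleton of ground truth inside prediction
        else:
            tprec = cl_score(v_l, skeletonize_3d(v_p))
            tsens = cl_score(v_p, skeletonize_3d(v_l))
        return 2 * tprec * tsens / (tprec + tsens)

Note that swapping which ratio is called precision and which sensitivity does not change the final score, since the harmonic mean is symmetric; this naming question also comes up in the comments below.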

clDice as a Loss function

To train neural networks with clDice we implemented a loss function. For stability, and to ensure a good volumetric segmentation, we combine clDice with a regular Dice or binary cross-entropy loss (a sketch of such a combined loss follows the list below). Moreover, we introduce a soft skeleton to make the skeletonization fully differentiable. In this repository you can find the following implementations:

  1. PyTorch (2D and 3D)
  2. TensorFlow/Keras (2D and 3D)
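
A hedged PyTorch sketch of such a combined loss, assuming a single-channel probability map and the soft_skel function from the Soft Skeleton section below (soft_dice, soft_dice_cldice_loss, alpha, and the smoothing constant are illustrative choices, not the repository's exact API):

    import torch

    def soft_dice(y_true, y_pred, smooth=1.0):
        """Plain soft Dice loss on probability maps."""
        inter = torch.sum(y_true * y_pred)
        return 1.0 - (2.0 * inter + smooth) / (torch.sum(y_true) + torch.sum(y_pred) + smooth)

    def soft_dice_cldice_loss(y_true, y_pred, iters=3, alpha=0.5, smooth=1.0):
        """Weighted combination of soft Dice and soft clDice."""
        skel_pred = soft_skel(y_pred, iters)
        skel_true = soft_skel(y_true, iters)
        tprec = (torch.sum(skel_pred * y_true) + smooth) / (torch.sum(skel_pred) + smooth)
        tsens = (torch.sum(skel_true * y_pred) + smooth) / (torch.sum(skel_true) + smooth)
        cl_dice = 1.0 - 2.0 * (tprec * tsens) / (tprec + tsens)
        return (1.0 - alpha) * soft_dice(y_true, y_pred) + alpha * cl_dice

Here y_pred should be the network's soft probability output (e.g., after a sigmoid), not a thresholded binary mask; thresholding breaks the gradient flow, a pitfall several of the comments below run into.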

Soft Skeleton

To use clDice as a loss function we introduce a differentiable soft skeletonization, in which iterative min- and max-pooling acts as a proxy for morphological erosion and dilation.

[Figure: soft skeletonization by iterated min- and max-pooling]
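
A simplified 3D PyTorch sketch of this idea (the repository's version differs in details, e.g. it builds the 3D erosion from separable pools along each axis; the input is assumed to be an (N, C, D, H, W) probability tensor, and the kernel size and iteration count are illustrative):

    import torch.nn.functional as F

    def soft_erode(img):
        # Min-pooling as a proxy for morphological erosion,
        # implemented as a negated max-pool.
        return -F.max_pool3d(-img, kernel_size=3, stride=1, padding=1)

    def soft_dilate(img):
        # Max-pooling as a proxy for morphological dilation.
        return F.max_pool3d(img, kernel_size=3, stride=1, padding=1)

    def soft_open(img):
        return soft_dilate(soft_erode(img))

    def soft_skel(img, iters):
        # Accumulate, at each erosion level, the parts that survive
        # erosion but not opening: a soft medial axis.
        img1 = soft_open(img)
        skel = F.relu(img - img1)
        for _ in range(iters):
            img = soft_erode(img)
            img1 = soft_open(img)
            delta = F.relu(img - img1)
            skel = skel + F.relu(delta - skel * delta)
        return skel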

Comments
  • 3D soft_skel gives discontinuous output and different results in PyTorch vs TensorFlow

    First of all, thanks for a great paper about clDice; it's a really interesting approach.

    I wanted to test the idea on a 3D dataset.

    I have a synthetic 3D shape on which I just run soft_skeletonize, and I assume it should leave the same single connected component. Unfortunately, it doesn't. See the following summary showing the input image on the left and two iterations of soft_skel, for both TensorFlow and PyTorch.

    [Image: input shape (left) and two iterations of soft_skel, in TensorFlow and PyTorch]

    I've created a reproducible Colab notebook for the case: https://gist.github.com/kretes/84f6025e7e1ded19591a54b62abcc539

    opened by kretes 5
  • sensitivity and precision

    https://github.com/jocpae/clDice/blob/e9a5f8efc376eb1961b276e698a38556224b7d60/cldice_metric/cldice.py#L32

    The definitions of precision and sensitivity seem inconsistent with the clDice loss and with the common definitions, although this does not affect the final clDice score:

        tprec = cl_score(v_l, skeletonize_3d(v_p))
        tsens = cl_score(v_p, skeletonize_3d(v_l))

    opened by cow8 4
  • Evaluation Metrics

    Hi, I really appreciate your work. Could you please also share the implementation of the topology-based evaluation metrics that you used in your paper? That would be very helpful. I will look forward to your response. Thanks.

    opened by Alizah-Masood 3
  • predicted mask format

    During training, should I convert the output from the UNet to a binary mask prior to feeding it into soft_dice_cldice? I used raw logits, but the training result is really bad. However, when I tried converting to a binary mask, I lost the grad on the tensor.

    opened by Feanor007 3
  • about cldice loss

    Thank you very much for your article. What I want to ask is whether clDice can be used as a module. If I have multiple tasks, how should I combine multiple loss functions for training?

    opened by long123524 3
  • soft skeleton

    Hi, thanks for the great package.

    I would like to understand why you perform soft erosion using a sequence of 3 min-poolings with separable filters. Couldn't we instead just use one big min-pool (i.e., -F.max_pool3d(-img, kernel_size=(3,3,3), stride=1, padding=1))?

    opened by etienne87 2
  • clDice loss cannot be used alone and stays at 0 all the time

    When I use clDice to train a vessel segmentation network, I find that the clDice loss stays at 0 all the time.

    class soft_cldice(nn.Module):
        def __init__(self, iter_=3, smooth=1.):
            super(soft_cldice, self).__init__()
            self.iter = iter_
            self.smooth = smooth

        def forward(self, y_pred, y_true):
            y_true = y_true.contiguous().unsqueeze(1).to(float)  # to get the true label
            y_pred = (y_pred > 0.5).contiguous().to(float).requires_grad_()  # to get the pred mask
            skel_pred = soft_skel(y_pred, self.iter)
            skel_true = soft_skel(y_true, self.iter)
            tprec = (torch.sum(torch.multiply(skel_pred, y_true)[:, 1:, ...]) + self.smooth) / (torch.sum(skel_pred[:, 1:, ...]) + self.smooth)
            tsens = (torch.sum(torch.multiply(skel_true, y_pred)[:, 1:, ...]) + self.smooth) / (torch.sum(skel_true[:, 1:, ...]) + self.smooth)
            cl_dice = 1. - 2.0 * (tprec * tsens) / (tprec + tsens)
            return cl_dice
    
    opened by coreeey 2
  • Training

    Can you provide an example of using clDice as a loss function to train a network? I used clDice as the training loss, but I can't get the network to train: the loss curve is very strange, it even takes negative values, it oscillates strongly, and it does not converge.

    opened by long123524 2
  • about clDice

    Thanks for a great paper about clDice. I am a newbie who just started deep learning. I want to add clDice as the loss function of my network. How should I implement it? Also, can it be combined with other loss functions?

    opened by long123524 2
  • Boundary mask of CREMI?

    Hello, how did you get the boundary mask of CREMI? It only provides the neuron_ids. I can extract the boundaries by applying skimage.segmentation.find_boundaries on neuron_ids, but I get boundaries that seem thinner than what you show in the paper. Did you add something such as a Gaussian blur? Thanks!

    opened by wlsdzyzl 1
  • Thin vessels disappear when using soft_skeleton

    Hello, when I use soft_skeleton, I find that only the large vessels are left, no matter how I set the iteration count. Please tell me, what should I do? I'm a little confused about the theory. Thanks!

    opened by yangzhenghao 1
  • clDice Metrics

    Hello, nice work!

    I was wondering if this is the right implementation of the clDice metric. Your paper indicated high performance for this metric. However, computing the clDice score of a vessel mask against itself resulted in 91%.

    I used your implementation, by the way. Other metrics reported 100%. Any idea what the bug could be?

    opened by jerofad 0