Implements popular segmentation loss functions.

The available losses are listed under get_loss below.

Loss Wrapper functions

Wrapper for handling different tensor types from fastai.

class FastaiLoss[source]

FastaiLoss(loss, axis=1) :: _Loss

Wrapper class around a loss function for handling different tensor types.
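
A quick usage sketch, assuming the signature above and the usual (input, target) call convention of _Loss; the shapes are illustrative only:

import torch
import torch.nn as nn

# Sketch: wrap a plain torch loss so it can also handle fastai tensor
# subclasses (e.g. TensorImage predictions and TensorMask targets).
loss_fn = FastaiLoss(nn.CrossEntropyLoss(), axis=1)

pred = torch.randn(4, 2, 8, 8)            # logits: (batch, classes, H, W)
mask = torch.randint(0, 2, (4, 8, 8))     # integer class mask
loss = loss_fn(pred, mask)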

Wrapper for combining different losses, adapted from pytorch-toolbelt.

class WeightedLoss[source]

WeightedLoss(loss, weight=1.0) :: _Loss

Wrapper class around a loss function that applies a fixed weighting factor. This helps to balance multiple losses that have different scales.

class JointLoss[source]

JointLoss(first:Module, second:Module, first_weight=1.0, second_weight=1.0) :: _Loss

Wrap two loss functions into one. This class computes a weighted sum of two losses.
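
A minimal sketch combining both wrappers, assuming the signatures above; the 'DiceLoss' is loaded via get_loss, described below:

import torch.nn as nn

ce = nn.CrossEntropyLoss()
dice = get_loss('DiceLoss')                   # see get_loss below
# WeightedLoss scales a single loss by a fixed factor
half_dice = WeightedLoss(dice, weight=0.5)
# JointLoss computes first_weight * first + second_weight * second
ce_dice = JointLoss(ce, dice, first_weight=1.0, second_weight=0.5)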

The get_loss() function loads popular segmentation losses from Segmentation Models PyTorch and kornia:

  • (Soft) CrossEntropy Loss
  • Dice Loss
  • Jaccard Loss
  • Focal Loss
  • Lovasz Loss
  • Tversky Loss

class Poly1CrossEntropyLoss[source]

Poly1CrossEntropyLoss(num_classes:int, epsilon:float=1.0, reduction:str='mean') :: Module

Cross-entropy loss with the Poly-1 correction term from PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions (Leng et al., 2022). The standard cross-entropy is augmented with an additional epsilon * (1 - p_t) term, where p_t is the predicted probability of the target class.
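
For intuition, a minimal sketch of the Poly-1 computation (a reference formulation, not necessarily this class's exact implementation), assuming logits of shape (N, C, H, W) and integer targets of shape (N, H, W):

import torch
import torch.nn.functional as F

def poly1_ce_sketch(logits, target, epsilon=1.0):
    # Per-pixel cross-entropy, kept unreduced
    ce = F.cross_entropy(logits, target, reduction='none')                  # (N, H, W)
    # p_t: probability the model assigns to the true class of each pixel
    pt = F.softmax(logits, dim=1).gather(1, target.unsqueeze(1)).squeeze(1)
    # Poly-1 adds the leading polynomial term epsilon * (1 - p_t) to CE
    return (ce + epsilon * (1 - pt)).mean()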

import torch

n_classes = 2
output = torch.randn(4, n_classes, 356, 356, requires_grad=True)
target = torch.randint(0, n_classes, (4, 356, 356))

tst = Poly1CrossEntropyLoss(num_classes=n_classes)
loss = tst(output, target)

get_loss[source]

get_loss(loss_name, mode='multiclass', classes=[1], smooth_factor=0.0, alpha=0.5, beta=0.5, gamma=2.0, reduction='mean', **kwargs)

Load a loss function based on loss_name.

n_classes = 2
# fastai tensor subclasses work as well, e.g.:
# output = TensorImage(torch.randn(4, n_classes, 356, 356, requires_grad=True))
# target = TensorMask(torch.randint(0, n_classes, (4, 356, 356)))
output = torch.randn(4, n_classes, 356, 356, requires_grad=True)
target = torch.randint(0, n_classes, (4, 356, 356))
for loss_name in LOSSES:
    print(f'Testing {loss_name}')
    tst = get_loss(loss_name, classes=list(range(1, n_classes)))
    loss = tst(output, target)
Testing CrossEntropyLoss
Testing DiceLoss
Testing SoftCrossEntropyLoss
Testing CrossEntropyDiceLoss
Testing JaccardLoss
Testing FocalLoss
Testing LovaszLoss
Testing TverskyLoss
Testing Poly1CrossEntropyLoss
# With smooth_factor=0, SoftCrossEntropyLoss should match fastai's CrossEntropyLossFlat
ce1 = get_loss('SoftCrossEntropyLoss', smooth_factor=0)
ce2 = CrossEntropyLossFlat(axis=1)
test_close(ce1(output, target), ce2(output, target), eps=1e-04)
# Dice D and Jaccard J satisfy J = D / (2 - D); for the losses (1 - metric)
# this gives jaccard_loss = 2 * dice_loss / (1 + dice_loss)
jc = get_loss('JaccardLoss')
dc = get_loss('DiceLoss')
dc_loss = dc(output, target)
dc_to_jc = 2 * dc_loss / (dc_loss + 1)
test_close(jc(output, target), dc_to_jc, eps=1e-02)
# The Tversky index TI = TP / (TP + alpha*FP + beta*FN) reduces to Dice
# for alpha = beta = 0.5, so both losses should agree
tw = get_loss("TverskyLoss", alpha=0.5, beta=0.5)
test_close(dc(output, target), tw(output, target), eps=1e-02)
# classes=None scores all classes (including background 0), while
# classes=list(range(1, n_classes)) restricts the loss to the foreground
# classes, so the two results should differ
output = torch.randn(4, n_classes, 356, 356)
output[:, 1, ...] = 0.5
tst = get_loss(loss_name='DiceLoss', classes=None)
tst2 = get_loss(loss_name='DiceLoss', classes=list(range(1, n_classes)))
test_ne(tst(output, target), tst2(output, target))