Code adapted from https://github.com/qubvel/ttach.

Functional

rot90[source]

rot90()

Rotate a batch of images by 90 degrees, k times
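A minimal sketch of such a scripted rotation, built on `torch.rot90` over the spatial dimensions (not the library's exact source, just the idea):

```python
import torch

@torch.jit.script
def rot90(x: torch.Tensor, k: int) -> torch.Tensor:
    # Rotate a batch of images (N, C, H, W) by 90 degrees, k times,
    # over the two spatial dimensions.
    return torch.rot90(x, k, dims=[2, 3])
```

Rotating by `k` and then by `4 - k` (or `-k`) recovers the original batch, which is what makes the transform invertible at test time.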

hflip[source]

hflip()

Flip a batch of images horizontally

vflip[source]

vflip()

Flip a batch of images vertically
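Both flips can be sketched the same way with `Tensor.flip` (again a minimal version, assuming `(N, C, H, W)` batches):

```python
import torch

@torch.jit.script
def hflip(x: torch.Tensor) -> torch.Tensor:
    # Flip a batch of images (N, C, H, W) along the width dimension.
    return x.flip(3)

@torch.jit.script
def vflip(x: torch.Tensor) -> torch.Tensor:
    # Flip a batch of images (N, C, H, W) along the height dimension.
    return x.flip(2)
```

Each flip is its own inverse, so the same function both augments and deaugments.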

Base Class

class BaseTransform[source]

BaseTransform(pname:str, params:Union[list, tuple]) :: Module

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing to nest them in a tree structure. You can assign the submodules as regular attributes::

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will have their parameters converted too when you call :meth:`to`, etc.

.. note:: As per the example above, an __init__() call to the parent class must be made before assignment on the child.

:ivar training: whether this module is in training or evaluation mode.
:vartype training: bool

tst_base = BaseTransform('x', (1,2))
torch.jit.script(tst_base)
RecursiveScriptModule(original_name=BaseTransform)
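The signature above suggests a base class that stores a parameter name and its possible values, with a `forward` taking the batch, one parameter value, and a direction flag. A hypothetical minimal sketch that scripts the same way (the real class may differ; `List[int]` is used here instead of `Union[list, tuple]` to keep it TorchScript-friendly):

```python
import torch
from torch import nn
from typing import List

class BaseTransform(nn.Module):
    "Hypothetical sketch of a scriptable TTA transform base class."
    def __init__(self, pname: str, params: List[int]):
        super().__init__()
        self.pname = pname    # name of the parameter this transform varies
        self.params = params  # values that parameter may take

    def forward(self, x: torch.Tensor, param: int, apply: bool) -> torch.Tensor:
        # Subclasses override this to apply (or undo, depending on `apply`)
        # the transform for the given parameter value.
        return x
```

Because every attribute has a TorchScript-compatible type, `torch.jit.script(BaseTransform('x', [1, 2]))` produces a `RecursiveScriptModule` as in the cell above.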

Transform Classes

class HorizontalFlip[source]

HorizontalFlip() :: BaseTransform

Flip images horizontally (left->right)

imgs = torch.randn(4, 2, 356, 356)
t = torch.jit.script(HorizontalFlip())
aug = t(imgs, 1, True)
deaug = t(aug, 1, False)
test_eq(imgs, deaug)

class VerticalFlip[source]

VerticalFlip() :: BaseTransform

Flip images vertically (up->down)

t = torch.jit.script(VerticalFlip())
aug = t(imgs, 1, True)
deaug = t(aug, 1, False)
test_eq(imgs, deaug)

class Rotate90[source]

Rotate90(angles:List[int]) :: BaseTransform

Rotate images 0/90/180/270 degrees (angles)

t = torch.jit.script(Rotate90([180]))
aug = t(imgs, 90, False)
deaug = t(aug, 90, True)
test_eq(imgs, deaug)
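The round trip works because rotating by `k` quarter turns and then by `-k` is the identity; the same check with plain torch ops:

```python
import torch

imgs = torch.randn(4, 2, 16, 16)
aug = torch.rot90(imgs, 1, dims=[2, 3])    # rotate 90 degrees
deaug = torch.rot90(aug, -1, dims=[2, 3])  # rotate back
assert torch.equal(imgs, deaug)
```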

Pipeline Classes

class Chain[source]

Chain(transforms:List[BaseTransform]) :: Module

Apply a sequence of `BaseTransform`s to a batch in order, consuming one argument per transform; the boolean flag is forwarded to each transform to select the (de)augmentation direction

tfms = [HorizontalFlip(), VerticalFlip(), Rotate90(angles=[90, 180, 270])]
args = [1, 1, 90]

tst_chain = torch.jit.script(Chain(tfms))
tst_chain_deaug = torch.jit.script(Chain(tfms[::-1]))

aug = tst_chain(imgs, args, False)
deaug = tst_chain_deaug(aug, args[::-1], True)
test_eq(imgs, deaug)
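Note that deaugmentation runs the transforms (and their arguments) in reverse order, since the inverse of a composition applies the inverses in the opposite order. A self-contained sketch of such a chain, using a toy flip transform (hypothetical names, not the library source):

```python
import torch
from torch import nn
from typing import List

class HFlipT(nn.Module):
    # Toy transform: flips horizontally when param == 1, identity otherwise.
    # A flip is its own inverse, so the direction flag is not needed here.
    def forward(self, x: torch.Tensor, param: int, apply: bool) -> torch.Tensor:
        if param == 1:
            return x.flip(3)
        return x

class Chain(nn.Module):
    # Sketch: apply each transform in order, one argument per transform.
    def __init__(self, transforms: List[nn.Module]):
        super().__init__()
        self.transforms = nn.ModuleList(transforms)

    def forward(self, x: torch.Tensor, args: List[int], apply: bool) -> torch.Tensor:
        i = 0
        for t in self.transforms:
            x = t(x, args[i], apply)
            i += 1
        return x
```

Iterating over an `nn.ModuleList` in `forward` keeps the chain scriptable with `torch.jit.script`.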

class Transformer[source]

Transformer(transforms:List[BaseTransform], args:List[int]) :: Module

Pair a list of `BaseTransform`s with a fixed list of arguments, exposing `augment` and `deaugment` methods that run the chain forwards and in reverse

tst_tfm = Transformer(tfms, args)
torch.jit.script(tst_tfm)
aug = tst_tfm.augment(imgs)
deaug = tst_tfm.deaugment(aug)
test_eq(imgs, deaug)

class Compose[source]

Compose(aug_transforms:List[BaseTransform]) :: Module

Build a `Transformer` for each augmentation configuration of the given transforms; the resulting transformers are available via `items`

c = Compose(tfms)
c = torch.jit.script(c)
out = []
for t in c.items:
    aug = t.augment(imgs)
    deaug = t.deaugment(aug)
    out.append(deaug)
    test_eq(imgs, deaug)
out = torch.stack(out)
test_close(imgs, torch.mean(out, dim=0))
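Put together, test-time augmentation runs the model on each augmented view, undoes the augmentation on the output, and averages the results. A sketch of that loop with plain tensor ops and a stand-in model (the function name and view set are illustrative, not the library's API):

```python
import torch

def tta_predict(model, imgs: torch.Tensor) -> torch.Tensor:
    # Augment -> predict -> deaugment -> average, over the identity and two flips.
    views = [
        (lambda x: x,         lambda y: y),          # identity
        (lambda x: x.flip(3), lambda y: y.flip(3)),  # horizontal flip
        (lambda x: x.flip(2), lambda y: y.flip(2)),  # vertical flip
    ]
    outs = [deaug(model(aug(imgs))) for aug, deaug in views]
    return torch.stack(outs).mean(dim=0)
```

Deaugmenting the output only makes sense for spatially aligned predictions (e.g. segmentation maps); for classification logits the deaugment step is simply the identity.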