
Pad1D

Description

Info

Parent class: Module

Derived classes: -

This module implements the operation of one-dimensional padding.

The padding operation is usually applied together with convolution layers of neural networks: during padding, user-defined elements are added around the perimeter of the data tensor in a way determined by the mode parameter, so that a border of padding elements is formed around the original data.

This can be used, for example, to preserve the size of the feature maps after a convolution.

There are several ways to set this operation; the following are implemented in the library:

  • filling with a constant value;
  • filling by reflection of the elements at the edge.
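For intuition, the effect of both modes can be reproduced with plain NumPy through np.pad; this is only an illustration of the concept, not a call into the library:

import numpy as np

row = np.array([4., 7., 6., 5., 9.], dtype=np.float32)

# "constant": two zeros are appended on the right
print(np.pad(row, (0, 2), mode="constant"))    # [4. 7. 6. 5. 9. 0. 0.]

# "reflect": two elements are mirrored on the left; the edge element itself is not repeated
print(np.pad(row, (2, 0), mode="reflect"))     # [6. 7. 4. 7. 6. 5. 9.]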

Initializing

def __init__(self, pad, mode="constant", fillValue=None, name=None):

Parameters

Parameter | Allowed types | Description | Default
pad | tuple | Tuple (left, right) with the number of padding elements to add on each side | -
mode | str | Fill mode; possible values: "constant", "reflect" | "constant"
fillValue | float | Fill value used in "constant" mode | None
name | str | Layer name | None

Explanations

pad - the tuple has the layout (left, right): the position designates the side, and the value sets the number of additional elements to add on that side. For example, (1, 0) means that one extra element is added on the left side of the tensor.
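The output length therefore grows by the sum of the two pad values. A quick check of this arithmetic in plain Python (no library calls), including the typical case of preserving the map size after a convolution:

size = 5
left, right = 1, 0
padded = size + left + right              # 6: one extra element on the left

# with a stride-1 convolution of kernel size k and no further padding,
# the resulting map length is padded - k + 1;
# padding (k - 1) // 2 on each side keeps the original length
k = 3
print(padded, padded - k + 1)             # 6 4
print(size + 2 * ((k - 1) // 2) - k + 1)  # 5: length is preserved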

Examples

Necessary imports.

import numpy as np
from PuzzleLib.Backend import gpuarray
from PuzzleLib.Modules import Pad1D

Info

gpuarray is required to place the tensor in GPU memory correctly

batchsize, maps, size = 1, 1, 5
np.random.seed(1234)
data = gpuarray.to_gpu(np.random.randint(1, 10, (batchsize, maps, size)).astype(np.float32))
print(data)
[[[4. 7. 6. 5. 9.]]]

Let us add two elements on the right side in "constant" mode:

padmod = Pad1D(pad=(0, 2))
print(padmod(data))
[[[4. 7. 6. 5. 9. 0. 0.]]]
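The constant used for filling can be set through the fillValue parameter; with the default above, zeros were used. A sketch with an explicit fill value (the commented line is the expected output, assuming fillValue simply replaces the constant, and is not taken from a library run):

padmod = Pad1D(pad=(0, 2), fillValue=-1.0)
print(padmod(data))
# expected: [[[ 4.  7.  6.  5.  9. -1. -1.]]]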

Let us change the mode to "reflect" and add padding to the left side:

padmod = Pad1D(pad=(2, 0), mode="reflect")
print(padmod(data))
[[[6. 7. 4. 7. 6. 5. 9.]]]
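Both sides can also be padded in a single call. Assuming reflection is applied symmetrically on the right, exactly as shown above for the left side, the expected result for pad=(2, 2) would be:

padmod = Pad1D(pad=(2, 2), mode="reflect")
print(padmod(data))
# expected: [[[6. 7. 4. 7. 6. 5. 9. 5. 6.]]]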