Sequential¶
Description¶
A container that connects modules sequentially: the output of each module is fed as the input to the next one.
Initializing¶
def __init__(self, name=None):
Parameters
Parameter | Allowed types | Description | Default
--- | --- | --- | ---
name | str | Container name | None
Explanations
-
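For illustration, a container can be created anonymously or with a label; a minimal sketch ("features" is an arbitrary name chosen here):
>>> from PuzzleLib.Containers import Sequential
>>> anonymous = Sequential()             # name stays None
>>> named = Sequential(name="features")  # "features" is an illustrative label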
Examples¶
Simple sequence¶
A simple sequence of three operations:
- padding the input tensor with two additional rows at the bottom and two additional columns on the right
- max pooling
- flattening of the tensor
Necessary imports:
>>> import numpy as np
>>> from PuzzleLib.Backend import gpuarray
>>> from PuzzleLib.Containers import Sequential
>>> from PuzzleLib.Modules import Pad2D, MaxPool2D, Flatten
Info
gpuarray is required to place the tensor on the GPU correctly.
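As an aside, data can be copied back from the GPU for inspection on the host; a minimal round-trip sketch (assuming the backend's tensors expose a PyCUDA-style get() method):
>>> hostData = np.ones((1, 1, 2, 2), dtype=np.float32)
>>> deviceData = gpuarray.to_gpu(hostData)   # host -> GPU
>>> np.allclose(deviceData.get(), hostData)  # GPU -> host (get() is an assumption)
True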
Initialize a sequence and add the operations above to it using the append method:
>>> seq = Sequential()
>>> seq.append(Pad2D(pad=(0, 2, 0, 2), fillValue=0, name="pad"))
>>> seq.append(MaxPool2D(name="pool"))
>>> seq.append(Flatten(name="flatten"))
Synthetic tensor:
>>> data = np.random.randint(0, 127, size=(1, 1, 4, 4)).astype(np.float32)
>>> print(data)
[[[[105. 62. 58. 56.]
[100. 80. 26. 5.]
[105. 56. 29. 79.]
[114. 89. 54. 117.]]]]
Important
For the library to work correctly with four-axis tensors (for example, images with NCHW axes, where N is the tensor's index in the batch, C is the number of channels (maps), H is the height, and W is the width), two conditions must hold (see the sketch after this list):
- the axes of the tensor must be ordered (N, C, H, W)
- the data type of the tensor must be float32
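For instance, an image loaded in HWC order with a uint8 dtype would first be rearranged and cast; a minimal sketch (img is a hypothetical input, not part of the example above):
>>> img = np.random.randint(0, 255, size=(4, 4, 3)).astype(np.uint8)    # hypothetical HWC uint8 image
>>> batch = img.transpose(2, 0, 1)[np.newaxis, ...].astype(np.float32)  # -> (N, C, H, W), float32
>>> batch.shape, batch.dtype
((1, 3, 4, 4), dtype('float32'))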
Place the initial tensor on the GPU and run it through the sequence:
>>> seq(gpuarray.to_gpu(data))
Tip
Any element of a sequence can be addressed either by its name or by its index.
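For example, the padding layer appended first can be fetched either way (assuming, as the tip states, that both lookups resolve to the same module object):
>>> seq["pad"] is seq[0]   # by name vs. by index
True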
Result:
>>> # 'pad' layer results
>>> print(seq["pad"].data)
[[[[105. 62. 58. 56. 0. 0.]
[100. 80. 26. 5. 0. 0.]
[105. 56. 29. 79. 0. 0.]
[114. 89. 54. 117. 0. 0.]
[ 0. 0. 0. 0. 0. 0.]
[ 0. 0. 0. 0. 0. 0.]]]]
>>> # 'pool' layer results
>>> print(seq["pool"].data)
[[[[105. 58. 0.]
[114. 117. 0.]
[ 0. 0. 0.]]]]
>>> # 'flatten' layer results
>>> print(seq["flatten"].data)
[[105. 58. 0. 114. 117. 0. 0. 0. 0.]]
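The same pipeline can be reproduced on the host with plain NumPy as a sanity check; a sketch mirroring the zero padding at the bottom/right, the 2x2 stride-2 max pooling seen in the output above, and the flattening:
>>> padded = np.pad(data, ((0, 0), (0, 0), (0, 2), (0, 2)))     # zero padding, bottom and right
>>> pooled = padded.reshape(1, 1, 3, 2, 3, 2).max(axis=(3, 5))  # 2x2 max pooling, stride 2
>>> pooled.reshape(1, -1)                                       # flattening
array([[105.,  58.,   0., 114., 117.,   0.,   0.,   0.,   0.]], dtype=float32)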
Cyclic expansion¶
Suppose you need to repeat the same set of blocks in the network several times, each block consisting, for example, of a constant multiply-add (MulAddConst) and a ReLU activation (Activation).
Necessary imports:
>>> from PuzzleLib.Modules import MulAddConst, Activation
A function for creating the component blocks (the numb suffix keeps module names unique, so each layer remains addressable after the blocks are merged into the base sequence):
def block(numb):
    blk = Sequential()
    blk.append(MulAddConst(a=2, b=0, name="mul_const_{}".format(numb)))  # y = 2 * x + 0
    blk.append(Activation(activation="relu", name="act_{}".format(numb)))
    return blk
Initialize a fresh base sequence and extend it with the blocks using the extend method:
>>> seq = Sequential()
>>> for i in range(3):
...     seq.extend(block(i))
Synthetic tensor:
>>> data = np.random.randint(-5, 5, size=(1, 1, 4, 4)).astype(np.float32)
>>> print(data, end="\n\n")
[[[[ 3. 0. -2. -2.]
[-5. -2. -2. 1.]
[ 4. -2. -5. -1.]
[ 2. -3. -5. 2.]]]]
Place the initial tensor on the GPU and run it through the sequence:
>>> seq(gpuarray.to_gpu(data))
Result:
>>> # 'mul_const_0' layer results
>>> print(seq["mul_const_0"].data)
[[[[ 6. 0. -4. -4.]
[-10. -4. -4. 2.]
[ 8. -4. -10. -2.]
[ 4. -6. -10. 4.]]]]
>>> # 'act_0' layer results
>>> print(seq["act_0"].data)
[[[[6. 0. 0. 0.]
[0. 0. 0. 2.]
[8. 0. 0. 0.]
[4. 0. 0. 4.]]]]
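Since ReLU commutes with multiplication by a positive constant (relu(2x) = 2 * relu(x)) and is idempotent, the three blocks together compute 8 * relu(x). A verification sketch (assuming the device tensor exposes a PyCUDA-style get() method for copying data back to the host):
>>> np.allclose(seq["act_2"].data.get(), 8 * np.maximum(data, 0))
True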