BCE is a loss function that computes binary cross-entropy, a special case of ordinary cross-entropy.

It is used in classification tasks with two classes.

The error function formula is:

H = -\left(y \cdot \log{p(y)} + (1-y) \cdot \log{(1-p(y))}\right)

where:

y - binary indicator of belonging to the target class;
p(y) - probability of belonging to the target class predicted by the classifier.
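As a quick sanity check, the formula can be evaluated directly in NumPy (a minimal sketch; the clipping constant `eps` is an assumption added to keep the logarithm's argument away from zero):

```python
import numpy as np

def bce(y, p, eps=1e-7):
    # Clip probabilities so log() never receives exactly 0 or 1
    p = np.clip(p, eps, 1 - eps)
    # Negative log-likelihood of the binary labels, averaged over the batch
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

y = np.array([1.0, 0.0, 1.0])   # binary targets
p = np.array([0.9, 0.2, 0.6])   # predicted probabilities
print(bce(y, p))                # ≈ 0.2798
```

Note the leading minus sign: with it, confident correct predictions yield a loss near zero, and the loss grows as predictions drift away from the labels.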


The constructor takes no parameters:

def __init__(self):


Binary classification

Necessary imports:

import numpy as np
from PuzzleLib.Backend import gpuarray
from PuzzleLib.Cost import BCE


gpuarray is required to place the tensors in GPU memory.

Synthetic target and prediction tensors (the labels must be binary, i.e. 0 or 1):

scores = gpuarray.to_gpu(np.random.randn(10, 1).astype(np.float32))
labels = gpuarray.to_gpu(np.random.randint(low=0, high=2, size=(10, )).astype(np.float32))


Please remember that the first dimension of the target and prediction tensors is the batch size.

Initializing the error function:

bce = BCE()

Calculating the error and the gradient on the batch:

error, grad = bce(scores, labels)
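Under the common assumption that BCE is applied to sigmoid-activated scores, the gradient of the loss with respect to a raw score has the well-known closed form p(y) - y. A small NumPy sketch verifying this against a finite-difference estimate (the scalar values here are purely illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce_from_logit(x, y):
    # BCE of a single raw score x against a binary label y
    p = sigmoid(x)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

x, y = 0.7, 1.0
# Analytic gradient of sigmoid + BCE w.r.t. the raw score: p - y
analytic = sigmoid(x) - y
# Central finite-difference estimate of the same gradient
h = 1e-5
numeric = (bce_from_logit(x + h, y) - bce_from_logit(x - h, y)) / (2 * h)
print(analytic, numeric)   # both ≈ -0.3318
```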

Multiclass Classification with Binary Flags

Everything is the same as in the previous example, except for the scores and labels tensors. Suppose you need to determine the presence or absence of four attributes of an object. In this case, the tensors are specified as follows:

scores = gpuarray.to_gpu(np.random.randn(10, 4).astype(np.float32))
labels = gpuarray.to_gpu(np.random.randint(low=0, high=2, size=(10, 4)).astype(np.float32))
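In plain NumPy, the multi-flag case is just the same elementwise formula averaged over both the batch and the flag dimensions (a sketch; applying an independent sigmoid to each raw score is an assumption about the setup):

```python
import numpy as np

np.random.seed(0)
scores = np.random.randn(10, 4).astype(np.float32)                  # one raw score per flag
labels = np.random.randint(0, 2, size=(10, 4)).astype(np.float32)   # four binary flags

p = 1.0 / (1.0 + np.exp(-scores))   # independent sigmoid per flag
eps = 1e-7
p = np.clip(p, eps, 1 - eps)        # keep log() arguments in range
# Average the elementwise BCE over both batch and flag dimensions
loss = -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))
print(loss)
```

Each flag is treated as an independent two-class problem, which is why the same BCE cost applies unchanged.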