Hi Arif! arif_saeed: In order to ensure that I understood how BCE-with-logits loss works in PyTorch, I tried to manually calculate the loss; however, I cannot reconcile my manual calculation with the loss generated by the PyTorch function F.binary_cross_entropy_with_logits. p1=y*(math.l… Answer from KFrank on discuss.pytorch.org
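The per-element formula the poster is trying to reconcile can be written out with only the standard library. This is a sketch of the documented definition, ℓ = −[y·log σ(x) + (1−y)·log(1−σ(x))] with mean reduction, not PyTorch's actual implementation (which uses a numerically stable rewrite internally):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: maps a raw logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def bce_with_logits(logits, targets):
    """Mean-reduced binary cross entropy computed directly from raw logits,
    following the textbook formula (fine for moderate logit magnitudes)."""
    total = 0.0
    for x, y in zip(logits, targets):
        p = sigmoid(x)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(logits)

print(bce_with_logits([0.0], [1.0]))  # -log(0.5) ≈ 0.6931
```

For moderate logits this matches `F.binary_cross_entropy_with_logits` with the default `reduction='mean'`.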
PyTorch Forums
discuss.pytorch.org › vision
How to use binary cross entropy with logits in binary target and 3d output - vision - PyTorch Forums
August 7, 2019 - I have batch size = 5. My network output is given by the following code: Output = F.upsample(per_frame_logits, t, mode='linear'). Shape of output is torch.Size([5, 2, 64]); shape of target is torch.Size([5]) (e.g. [1.0, 0.0, 0.0, 1.0, 1.0]). Then I pass it to the following loss function: loss = F.binary_cross_entropy_with_logits(output, target). I get the following value error: raise ValueError("Target size ({}) must be the same as input size ({})".format(target.size(), input.size())) ValueErro...
Discussions

Binary Cross Entropy with logits does not work as expected
When I use F.binary_cross_entropy in combination with the sigmoid function, the model trains as expected on MNIST. However, when changing to the F.binary_cross_entropy_with_logits function, the loss suddenly becomes arbitrarily small during training and the model no longer produces meaningful ... More on discuss.pytorch.org
discuss.pytorch.org · September 14, 2019
Info about binary cross entropy with logits
The function torch.nn.functional.binary_cross_entropy_with_logits actually returns a call to the function torch.binary_cross_entropy_with_logits. But I can’t find any information about it. It doesn’t have any docstring either. What is the actual code that is called and how is it called? More on discuss.pytorch.org
discuss.pytorch.org · April 1, 2019
pytorch - binary_cross_entropy_with_logits produces negative output - Stack Overflow
I am developing a machine learning model to detect bones from a skeleton image. I am using PyTorch, and the model I am using is the hourglass model. When I use binary_cross_entropy_with_logits I ca... More on stackoverflow.com
stackoverflow.com
deep learning - How does cross entropy loss work in pytorch? - Stack Overflow
I am experimenting with some of the pytorch codes. With cross entropy loss I found some interesting results and I have used both binary cross entropy loss and cross entropy loss of pytorch. import ... More on stackoverflow.com
stackoverflow.com
Sebastian Raschka
sebastianraschka.com › faq › docs › pytorch-crossentropy.html
Why are there so many ways to compute the Cross Entropy Loss in PyTorch and how do they differ? | Sebastian Raschka, PhD
January 17, 2026 - In PyTorch, these refer to implementations that accept different input arguments (but compute the same thing). This is summarized below. torch.nn.functional.binary_cross_entropy takes logistic sigmoid values as inputs · torch.nn.functional.binary_cross_entropy_with_logits takes logits as inputs
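The equivalence Raschka summarizes (same loss, different input conventions) can be checked numerically with the standard library. This sketch assumes the usual stable rewrite of the logits form, max(x, 0) − x·y + log(1 + e^(−|x|)); the function names here are illustrative, not PyTorch's:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# binary_cross_entropy convention: input is already a probability
def bce(p, y):
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# binary_cross_entropy_with_logits convention: input is a raw logit,
# evaluated in the numerically stable log-sum-exp form
def bce_logits(x, y):
    return max(x, 0.0) - x * y + math.log1p(math.exp(-abs(x)))

# For moderate logits the two conventions agree to machine precision
for x, y in [(1.5, 1.0), (-0.7, 0.0), (3.2, 0.0)]:
    assert abs(bce(sigmoid(x), y) - bce_logits(x, y)) < 1e-12
```

The two only diverge when the logit is large enough that `sigmoid(x)` rounds to exactly 0.0 or 1.0 in floating point.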
Python Guides
pythonguides.com › pytorch-binary-cross-entropy
PyTorch Binary Cross Entropy Loss
June 17, 2025 - PyTorch's binary cross entropy with logits combines a sigmoid activation and the binary cross entropy loss in a single class.
Medium
zhang-yang.medium.com › how-is-pytorchs-binary-cross-entropy-with-logits-function-related-to-sigmoid-and-d3bd8fb080e7
How is Pytorch’s binary_cross_entropy_with_logits function related to sigmoid and binary_cross_entropy | by Yang Zhang | Medium
August 25, 2019 - This notebook breaks down how the binary_cross_entropy_with_logits function (corresponding to BCEWithLogitsLoss, used for binary and multi-label classification) is implemented in PyTorch, and how it relates to sigmoid and binary_cross_entropy.
GitHub
gist.github.com › yang-zhang › 09460d9e90a1bf29fb6edf121865df86
binary cross entropy implementation in pytorch · GitHub
binary cross entropy implementation in pytorch · binary_cross_entropy_with_logits.ipynb
PyTorch Forums
discuss.pytorch.org › t › does-binary-cross-entropy-with-logits-0-5-equals-random-guess › 180264
Does binary cross entropy with logits = 0.5 equals random guess? - PyTorch Forums
May 18, 2023 - Hi, I have 256 samples labeled with 1 and 256 samples labeled with 0. My loss seems to converge to 0.51 Does it mean, the model only makes a random guess? To be precise I have domain_loss = F.binary_cross_entropy_with_logits(domain_predictions, domain_y) and the printout converges to 0.51
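A useful sanity check behind this question: for a balanced 0/1 dataset, a model that always outputs logit 0 (i.e. probability 0.5) scores −log(0.5) ≈ 0.693 under BCE, not 0.5, so a loss converging to 0.51 is already better than pure guessing. A standard-library sketch of that baseline, using the stable form of the documented formula:

```python
import math

def bce_logits(x, y):
    """Single-element BCE with logits, stable log-sum-exp form."""
    return max(x, 0.0) - x * y + math.log1p(math.exp(-abs(x)))

# 256 positives + 256 negatives, constant logit 0 (a "coin flip" model)
labels = [1.0] * 256 + [0.0] * 256
baseline = sum(bce_logits(0.0, y) for y in labels) / len(labels)
print(baseline)  # ln 2 ≈ 0.6931
```

Any loss below ln 2 ≈ 0.693 on balanced data means the model carries some signal.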
PyTorch Forums
discuss.pytorch.org › t › binary-cross-entropy-with-logits-with-weights › 198424
Binary cross entropy with logits with weights - PyTorch Forums
March 7, 2024 - Hi ! I am currently working with the function torch.nn.functional.binary_cross_entropy_with_logits torch.nn.functional.binary_cross_entropy_with_logits — PyTorch 2.2 documentation and I have some questions. I am not sur…
CodingNomads
codingnomads.com › binary-classification-binary-cross-entropy
Binary Classification: Binary Cross Entropy
You'll see that the two loss values are very close but not exactly the same. This small difference is due to numerical instability when working with probabilities close to 0 or 1. In this case, it is better to work with logits! F.binary_cross_entropy_with_logits(logits, labels) - F.binary_cross_entropy(logits.sigmoid(), labels)
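The instability mentioned in that snippet is easy to reproduce without PyTorch: once the logit is confident enough, `sigmoid(x)` rounds to exactly 1.0 in float64, so `log(1 - p)` blows up, while the log-sum-exp rewrite stays finite. A standard-library sketch (function names are illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bce_naive(x, y):
    p = sigmoid(x)  # rounds to exactly 1.0 for large positive x
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_stable(x, y):
    # log-sum-exp rewrite: max(x, 0) - x*y + log(1 + exp(-|x|))
    return max(x, 0.0) - x * y + math.log1p(math.exp(-abs(x)))

x, y = 40.0, 0.0                  # confident logit, negative label
print(bce_stable(x, y))           # ≈ 40.0, the correct loss
try:
    bce_naive(x, y)               # log(1 - 1.0) -> math domain error
except ValueError as e:
    print("naive form failed:", e)
```

This is exactly why `binary_cross_entropy_with_logits` exists as a fused operation instead of `sigmoid` followed by `binary_cross_entropy`.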
Stack Overflow
stackoverflow.com › questions › 71585313 › why-my-losses-are-in-thousands-when-using-binary-cross-entropy-with-logits-witho
pytorch - Why my losses are in thousands when using binary_cross_entropy_with_logits without sigmoid beforehand? - Stack Overflow
import torch.nn.functional as F
weight = torch.ones((4, 56, 96, 96))
weight[:, :, 30, 30] = 100.
label = torch.zeros((4, 56, 96, 96))
weight[:, :, 25, 25] = 1.
out = mynetwork(input)  # shape is (4, 56, 96, 96), network arch is UNET
loss = F.binary_cross_entropy_with_logits(out, label, pos_weight=weight)
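Part of the confusion in that question is the difference between `weight` (which rescales the whole per-element loss) and `pos_weight` (which multiplies only the positive term, y·log σ(x)). A standard-library sketch of the documented single-element formula, −w·[p·y·log σ(x) + (1−y)·log(1−σ(x))], evaluated stably; the function name is illustrative:

```python
import math

def weighted_bce_logits(x, y, weight=1.0, pos_weight=1.0):
    """Single-element BCE with logits in the PyTorch weight/pos_weight style:
    loss = -weight * (pos_weight * y * log(sigmoid(x))
                      + (1 - y) * log(1 - sigmoid(x)))"""
    # Stable log(sigmoid(x)) for either sign of x
    log_sig = -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))
    # Identity: log(1 - sigmoid(x)) = log(sigmoid(x)) - x
    log_one_minus_sig = log_sig - x
    return -weight * (pos_weight * y * log_sig + (1 - y) * log_one_minus_sig)

# pos_weight multiplies the loss only where y == 1, which is why a
# pos_weight of 100 over many positive elements inflates the total ~100x
print(weighted_bce_logits(2.0, 1.0, pos_weight=100.0))
print(weighted_bce_logits(2.0, 0.0, pos_weight=100.0))  # unaffected
```

So with huge `pos_weight` values, losses in the thousands are expected arithmetic, not a bug.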
PyTorch Forums
discuss.pytorch.org › vision
Equivalent of TensorFlow's Sigmoid Cross Entropy With Logits in Pytorch - vision - PyTorch Forums
April 18, 2017 - I am trying to find the equivalent of sigmoid_cross_entropy_with_logits loss in Pytorch but the closest thing I can find is the MultiLabelSoftMarginLoss. Can someone direct me to the equivalent loss? If it doesn't exis…
GitHub
github.com › eladhoffer › utils.pytorch › blob › master › cross_entropy.py
utils.pytorch/cross_entropy.py at master · eladhoffer/utils.pytorch
return F.binary_cross_entropy_with_logits(inputs, target, weight=weight, reduction=reduction)
Author: eladhoffer