Pytorch: MultiMarginLoss has no attribute 'backward'

Created on 6 Apr 2017 · 3 comments · Source: pytorch/pytorch

Is this a bug? torch.__version__ is '0.1.11+b13b701'.


All 3 comments

Works fine for me with (almost) the latest version ('0.1.11+8aa1cef'):

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

# random scores for 5 samples over 3 classes, with gradients enabled
y = Variable(torch.rand(5, 3), requires_grad=True)
# random integer targets
t = Variable(torch.LongTensor(5).random_(0, 2))
m = nn.MultiMarginLoss()
loss = m(y, t)
loss.backward()          # autograd computes the gradient
print(y.grad)
```

outputs:

```
Variable containing:
-0.1333  0.0667  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
[torch.FloatTensor of size 5x3]
```
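For readers on newer releases: since PyTorch 0.4, Variable has been merged into Tensor, so the same check can be written against the modern API (a minimal sketch; the random values will of course differ from the output above):

```python
import torch
import torch.nn as nn

# Tensors carry requires_grad directly; no Variable wrapper needed.
y = torch.rand(5, 3, requires_grad=True)
t = torch.randint(0, 3, (5,))       # targets in {0, 1, 2}
loss = nn.MultiMarginLoss()(y, t)
loss.backward()                     # autograd fills in y.grad
print(y.grad)                       # gradient of shape (5, 3)
```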

Hi,
nn.Module subclasses do not have a backward method (none of them do); their forward is implemented with autograd-compliant operations and is therefore differentiated automatically.
If you want to find the implementation of MultiMarginLoss, it is implemented here in C.
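To illustrate the point, here is a hand-rolled multi-margin loss (p=1, margin=1, mean reduction) built only from differentiable tensor operations. It defines no backward method at all, yet gradients flow through it, because autograd derives the backward pass from the forward graph. This is a sketch against the modern tensor API; ManualMultiMargin is a name made up for this example, not part of PyTorch:

```python
import torch
import torch.nn as nn

class ManualMultiMargin(nn.Module):
    """Forward only: every op here is differentiable, so autograd
    supplies the backward pass with no backward() method defined."""
    def forward(self, x, target):
        # score of the correct class for each sample, shape (N, 1)
        correct = x.gather(1, target.unsqueeze(1))
        # hinge term max(0, 1 - x[y] + x[j]) for every class j
        margins = torch.clamp(1 - correct + x, min=0)
        # zero out the j == y term, then normalize as MultiMarginLoss does:
        # divide by the number of classes and average over the batch
        mask = torch.ones_like(x).scatter_(1, target.unsqueeze(1), 0.0)
        return (margins * mask).sum(dim=1).mean() / x.size(1)

y = torch.rand(5, 3, requires_grad=True)
t = torch.randint(0, 3, (5,))
ref = nn.MultiMarginLoss()(y, t)   # built-in loss
mine = ManualMultiMargin()(y, t)   # hand-rolled equivalent
print(torch.allclose(ref, mine))
mine.backward()                    # works: backward is autograd-derived
```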

Thanks. I'm just getting started with PyTorch. I understand it now.

