PyTorch: MultiMarginLoss has no attribute 'backward'

Created on 06 Apr 2017  ·  3 comments  ·  Source: pytorch/pytorch

๋ฒ„๊ทธ์ธ๊ฐ€์š”? torch.__version__ ์€ '0.1.11+b13b701' ์ž…๋‹ˆ๋‹ค.

Most helpful comment

Hi,
nn.Modules don't have a backward (none of them do); the forward is implemented with autograd-compatible operations, so it is differentiated automatically.
The implementation of MultiMarginLoss is in C, here.

All 3 comments

(๊ฑฐ์˜) ์ตœ์‹  ๋ฒ„์ „( '0.1.11+8aa1cef' )์—์„œ ์ž˜ ์ž‘๋™ํ•ฉ๋‹ˆ๋‹ค.

import torch
import torch.nn as nn
from torch.autograd import Variable

y = Variable(torch.rand(5, 3), requires_grad=True)  # scores for 5 samples, 3 classes
t = Variable(torch.LongTensor(5).random_(0, 2))     # random target class indices
m = nn.MultiMarginLoss()
loss = m(y, t)
loss.backward()  # backward is called on the loss tensor, not on the module
print(y.grad)

Output

Variable containing:
-0.1333  0.0667  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
 0.0667 -0.1333  0.0667
[torch.FloatTensor of size 5x3]
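The constant gradient values above are expected, not a bug. With the default p=1, margin=1, and mean reduction, MultiMarginLoss computes (1/C) * sum over i != y of max(0, 1 - x[y] + x[i]) per sample, averaged over the batch. Since the inputs come from torch.rand they lie in [0, 1), so every hinge term 1 - x[y] + x[i] is strictly positive: each non-target entry receives gradient 1/(N*C) = 1/15 ≈ 0.0667 and each target entry -(C-1)/(N*C) = -2/15 ≈ -0.1333. A small sketch (modern tensor API; names are illustrative) checks this closed form against autograd:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
N, C = 5, 3
x = torch.rand(N, C, requires_grad=True)  # values in [0, 1)
t = torch.randint(0, C, (N,))

loss = nn.MultiMarginLoss()(x, t)  # p=1, margin=1, mean reduction
loss.backward()

# With margin=1 and inputs in [0, 1), every hinge 1 - x[n,t] + x[n,i]
# is active, so each non-target entry gets 1/(N*C) and the target
# entry gets -(C-1)/(N*C).
expected = torch.full((N, C), 1.0 / (N * C))
expected[torch.arange(N), t] = -(C - 1) / (N * C)
print(torch.allclose(x.grad, expected))  # True
```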


๊ฐ์‚ฌ ํ•ด์š”. ์ด์ œ ๋ง‰ PyTorch๋ฅผ ์‹œ์ž‘ํ•˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค. ์ด์ œ ์ดํ•ดํ•ฉ๋‹ˆ๋‹ค.

์ด ํŽ˜์ด์ง€๊ฐ€ ๋„์›€์ด ๋˜์—ˆ๋‚˜์š”?
0 / 5 - 0 ๋“ฑ๊ธ‰