Some Thoughts on torch.autograd.backward in PyTorch

Author: https://oldpan.me/archives/pytroch-torch-autograd-backward  Date: 2023-10-28 13:21:44

Backpropagation is one of the most important mechanisms in deep learning. In PyTorch, backward computes gradients over the computation graph and accumulates them into each leaf tensor's grad buffer.

The session below demonstrates the basic backward operation and a few pitfalls to watch for. (It uses the old Variable API from PyTorch 0.3 and earlier; since PyTorch 0.4, Variable has been merged into Tensor, and you simply create tensors with requires_grad=True.)


>>> import torch
>>> from torch.autograd import Variable

>>> x = Variable(torch.ones(2,2), requires_grad=True)
>>> y = x + 2
>>> y.grad_fn
Out[6]: <torch.autograd.function.AddConstantBackward at 0x229e7068138>
>>> y.grad

>>> z = y*y*3
>>> z.grad_fn
Out[9]: <torch.autograd.function.MulConstantBackward at 0x229e86cc5e8>
>>> z
Out[10]:
Variable containing:
27 27
27 27
[torch.FloatTensor of size 2x2]
>>> out = z.mean()
>>> out.grad_fn
Out[12]: <torch.autograd.function.MeanBackward at 0x229e86cc408>
>>> out.backward()   # out is a scalar, so no gradient argument is needed
>>> x.grad
Out[19]:
Variable containing:
4.5000 4.5000
4.5000 4.5000
[torch.FloatTensor of size 2x2]
>>> out  # out is a scalar
Out[20]:
Variable containing:
27
[torch.FloatTensor of size 1]
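Under the modern API (PyTorch 0.4 and later, where Variable is merged into Tensor), the same scalar example can be written as follows; a minimal sketch:

```python
import torch

# Tensors created with requires_grad=True are tracked by autograd;
# the old Variable wrapper is no longer needed.
x = torch.ones(2, 2, requires_grad=True)
y = x + 2          # y.grad_fn is an AddBackward node
z = y * y * 3
out = z.mean()     # out is a scalar, so backward() takes no argument

out.backward()
# d(out)/dx = 3 * 2 * y / 4 = 1.5 * (x + 2) = 4.5 when x = 1
print(x.grad)      # a 2x2 tensor filled with 4.5
```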

>>> x = Variable(torch.Tensor([2,2,2]), requires_grad=True)
>>> y = x*2
>>> y
Out[52]:
Variable containing:
4
4
4
[torch.FloatTensor of size 3]
>>> y.backward() # y is not a scalar; to backpropagate from a vector output you must pass a gradient tensor of the same shape
Traceback (most recent call last):
File "C:\Users\dell\Anaconda3\envs\my-pytorch\lib\site-packages\IPython\core\interactiveshell.py", line 2862, in run_code
 exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-53-95acac9c3254>", line 1, in <module>
 y.backward()
File "C:\Users\dell\Anaconda3\envs\my-pytorch\lib\site-packages\torch\autograd\variable.py", line 156, in backward
 torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
File "C:\Users\dell\Anaconda3\envs\my-pytorch\lib\site-packages\torch\autograd\__init__.py", line 86, in backward
 grad_variables, create_graph = _make_grads(variables, grad_variables, create_graph)
File "C:\Users\dell\Anaconda3\envs\my-pytorch\lib\site-packages\torch\autograd\__init__.py", line 34, in _make_grads
 raise RuntimeError("grad can be implicitly created only for scalar outputs")
RuntimeError: grad can be implicitly created only for scalar outputs

>>> y.backward(torch.FloatTensor([0.1, 1, 10]))
>>> x.grad        # note: 0.1, 1, 10 act as weights scaling each element's gradient (dy/dx = 2)
Out[55]:
Variable containing:
0.2000
2.0000
20.0000
[torch.FloatTensor of size 3]
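What backward actually computes for a non-scalar output is a vector-Jacobian product: the tensor you pass in is the vector v, and x.grad receives vᵀ·J rather than the Jacobian itself. A minimal sketch with the modern tensor API:

```python
import torch

x = torch.tensor([2., 2., 2.], requires_grad=True)
y = x * 2                       # Jacobian dy/dx is diag(2, 2, 2)

v = torch.tensor([0.1, 1., 10.])
y.backward(v)                   # computes v^T @ J, not dy/dx directly
print(x.grad)                   # each element of v multiplied by 2
```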

>>> y.backward(torch.FloatTensor([0.1, 1, 10]))
>>> x.grad        # gradients accumulate across backward calls
Out[57]:
Variable containing:
0.4000
4.0000
40.0000
[torch.FloatTensor of size 3]

>>> x.grad.data.zero_() # zero out the accumulated gradients
Out[60]:
0
0
0
[torch.FloatTensor of size 3]
>>> x.grad       # the buffer is now zeroed
Out[61]:
Variable containing:
0
0
0
[torch.FloatTensor of size 3]
>>> y.backward(torch.FloatTensor([0.1, 1, 10]))
>>> x.grad
Out[63]:
Variable containing:
0.2000
2.0000
20.0000
[torch.FloatTensor of size 3]
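This accumulate-then-zero behavior is exactly why training loops call optimizer.zero_grad() (or zero the grad buffer directly) before each backward pass. A minimal sketch with the modern API; here each pass recomputes y so that a fresh graph is built and no retain_graph flag is needed:

```python
import torch

x = torch.tensor([2., 2., 2.], requires_grad=True)
v = torch.tensor([0.1, 1., 10.])

(x * 2).backward(v)    # x.grad = [0.2, 2, 20]
(x * 2).backward(v)    # gradients accumulate: [0.4, 4, 40]
print(x.grad)

x.grad.zero_()         # clear the buffer, as optimizer.zero_grad() would
(x * 2).backward(v)
print(x.grad)          # back to [0.2, 2, 20]
```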
