In PyTorch's Tensor class there is an is_leaf attribute, which we can loosely think of as marking a tensor as a leaf node: when is_leaf is False the tensor is not a leaf node, and when is_leaf is True it is a leaf node (or leaf tensor).
So the question is: what is a leaf for, and why does PyTorch need it? We know the tensor attribute requires_grad: when requires_grad is True, the operations performed on the tensor are recorded in preparation for automatic differentiation. However, not every tensor with requires_grad set to True actually receives a grad during backward(); it must also be a leaf. In other words, being a leaf is the precondition, on top of requires_grad, for a tensor's grad to be retained.
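To make this concrete, here is a minimal sketch (the names x and y are just illustrative): x is created by the user and is therefore a leaf, while y is the result of an operation and is not, so after backward() only x.grad is populated.

import torch

x = torch.rand(3, requires_grad=True)   # created by the user -> leaf
y = x * 2                               # result of an operation -> not a leaf

y.sum().backward()

print(x.is_leaf, x.grad)   # True  tensor([2., 2., 2.])
print(y.is_leaf, y.grad)   # False None (PyTorch also warns that the .grad of a non-leaf is not populated)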
is_leaf
By convention, all tensors with requires_grad=False are leaf tensors. Tensors with requires_grad=True are leaf tensors only if they were created by the user; this means they are not the result of an operation, so their grad_fn is None. During backpropagation, only leaf tensors have their grad populated. If you want a non-leaf tensor's grad to be kept during backpropagation, call its retain_grad() method (a minimal sketch follows the example below).
Example:
>>> a = torch.rand(10, requires_grad=True)
>>> a.is_leaf
True
>>> b = torch.rand(10, requires_grad=True).cuda()
>>> b.is_leaf
False
# b was created by the operation that cast a cpu Tensor into a cuda Tensor
>>> c = torch.rand(10, requires_grad=True) + 2
>>> c.is_leaf
False
# c was created by the addition operation
>>> d = torch.rand(10).cuda()
>>> d.is_leaf
True
# d does not require gradients and so has no operation creating it (that is tracked by the autograd engine)
>>> e = torch.rand(10).cuda().requires_grad_()
>>> e.is_leaf
True
# e requires gradients and has no operations creating it
>>> f = torch.rand(10, requires_grad=True, device="cuda")
>>> f.is_leaf
True
# f requires grad, has no operation creating it
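As a follow-up, here is a minimal sketch of the retain_grad() method mentioned above (variable names are illustrative): calling it on a non-leaf tensor asks autograd to keep that tensor's grad during backward(), and it also shows that grad_fn is None only for the leaf.

import torch

x = torch.rand(3, requires_grad=True)
print(x.grad_fn)            # None -- x is a leaf created by the user

z = x * 2                   # non-leaf; its grad_fn is <MulBackward0>
z.retain_grad()             # without this call, z.grad would stay None
z.sum().backward()

print(z.grad_fn)            # <MulBackward0 object at ...>
print(z.grad)               # tensor([1., 1., 1.])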