test_autograd.py source code

Python

Project: pytorch  Author: pytorch
import torch
from torch.autograd import Function, Variable

def test_mark_non_differentiable_none(self):
    # This used to segfault because MyFunction would send back null
    # gradients to MulBackward, which is implemented in C++. C++
    # implemented functions expect incoming grad_outputs to be non-null.
    class MyFunction(Function):
        @staticmethod
        def forward(ctx, input):
            output = input.clone()
            ctx.mark_non_differentiable(output)
            return output

        @staticmethod
        def backward(ctx, grad_output):
            return None

    x = Variable(torch.randn(5, 5), requires_grad=True)
    r = MyFunction.apply(x * x)
    (r * x).sum().backward()
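
For reference, here is a minimal standalone sketch of the same pattern on a modern PyTorch (0.4 or later, where Variable was merged into Tensor). The print statements are illustrative additions, not part of the original test:

import torch
from torch.autograd import Function

class MyFunction(Function):
    @staticmethod
    def forward(ctx, input):
        output = input.clone()
        # Marking the output tells autograd to treat it as a constant;
        # it comes back from apply() with requires_grad=False.
        ctx.mark_non_differentiable(output)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        return None

x = torch.randn(5, 5, requires_grad=True)
r = MyFunction.apply(x * x)
print(r.requires_grad)     # False: r was marked non-differentiable
# The gradient still reaches x through the second factor of (r * x),
# so backward() completes without the null-gradient crash described above.
(r * x).sum().backward()
print(x.grad.shape)        # torch.Size([5, 5])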