Differences between .data and .detach #6990
Comments
Here's an example. If you use .detach(), autograd notices the in-place modification and raises an error as soon as the original value is needed in the backward pass, as opposed to using .data, where the same modification goes unnoticed and the computed gradient is silently wrong.
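A minimal sketch of the two behaviors, assuming out = a.sigmoid() (sigmoid's backward reads the value of its own output; the exact snippets from the original comment are not reproduced here):

import torch

# Case 1: .detach() -- the in-place change is caught by autograd.
a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
out = a.sigmoid()        # backward of sigmoid needs the value of out
c = out.detach()         # shares storage and the version counter with out
c.zero_()                # zeroes out as well
# out.sum().backward()   # RuntimeError: one of the variables needed for
#                        # gradient computation has been modified by an
#                        # inplace operation

# Case 2: .data -- the same change goes unnoticed.
a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
out = a.sigmoid()
c = out.data             # shares storage, but not the version counter
c.zero_()                # silently corrupts out
out.sum().backward()     # no error is raised
print(a.grad)            # tensor([0., 0., 0.]) -- silently wrong gradient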
I'll leave this issue open: we should add an example to the migration guide and clarify that section.
Hi Richard (@zou3519), thanks for your reply!
Why does the following code work?
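The code block from this comment was not preserved; the snippet below is an assumption consistent with the reply further down, not necessarily the poster's original code:

import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
out = a.sigmoid().sum()   # sum's backward never reads the value of out
c = out.data
c.zero_()                 # out is zeroed, but backward does not need it
out.backward()
print(a.grad)             # still the correct gradient of sum(sigmoid(a))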
Any use-cases where .data is still preferred over .detach()?
Change tensor.data to tensor.detach() due to pytorch/pytorch#6990
Thank you for your example, but I saw other people's videos saying autograd will check the version of the tensor to prevent this from happening. I am new to this, so I am a little confused.
Because the value of out is not used for computing the gradient, the computed gradient w.r.t. a is still correct even though the value of out has changed. tensor.detach() can detect whether tensors involved in computing the gradient were changed in place, but tensor.data has no such check.
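To connect this with the version-counter question above: autograd does check a version counter, but .data sidesteps it. A small sketch using the private _version attribute (an implementation detail, shown for illustration only):

import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
out = a.sigmoid()
print(out._version)    # 0

out.detach().zero_()   # detach() shares out's version counter
print(out._version)    # 1 -- backward compares this against the saved version

out.data.add_(1.0)     # .data gets a fresh counter, so no bump on out
print(out._version)    # still 1 -- which is why the safety check is bypassed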
Issue description
Hi all,
I am not very clear about the differences between .data and .detach() in the latest PyTorch 0.4.
For example:
import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)  # must be floating point to require grad
b = a.data
c = a.detach()
So b is not the same as c?
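Values-wise the two results are identical; a quick check, mirroring the snippet above:

import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
b = a.data
c = a.detach()
print(torch.equal(b, c))                  # True: same values
print(b.requires_grad, c.requires_grad)   # False False
print(b.data_ptr() == c.data_ptr())       # True: both view a's storage
# The difference only shows up when one of them is modified in place and
# a's original value is later needed by backward (see the example above).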
Here is part of the 'PyTorch 0.4.0 Migration Guide':
"However, .data can be unsafe in some cases. Any changes on x.data wouldn’t be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data with requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward."
Can anyone explain this sentence in more detail: "but will have its in-place changes reported by autograd if x is needed in backward"? Thanks!