Autograd Usage in PyTorch: Creation and Backward Propagation

Autograd Usage in PyTorch

Like other deep learning frameworks, PyTorch uses autograd for automatic differentiation of all operations performed on tensors. Many other frameworks build a static computational graph ahead of time, while PyTorch builds the graph on the fly at runtime (a dynamic computational graph). In this article, you will learn Autograd Usage in PyTorch.
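To make the idea of a dynamic graph concrete, here is a minimal sketch (the tensor value and the branch taken are my own illustration, not from the article). Ordinary Python control flow decides which operations run, and autograd records exactly the operations that were executed:

import torch

x = torch.tensor(3.0, requires_grad=True)

# ordinary Python control flow; the graph is built as the operations run
if x > 0:
    y = x * 2
else:
    y = x - 2

y.backward()
print(x.grad)   # tensor(2.) because only the x * 2 branch was recorded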

How does PyTorch calculate gradients?

Once gradients are enabled on a tensor, all calculations and operations on it are tracked. When you call the backward() method, the gradients are computed automatically. If the tensor is a scalar, you can call the backward() method without arguments, but if the tensor has more than one element, you have to pass a gradient tensor of the same size as an argument. If you are a beginner, learn PyTorch Basics first for a clearer understanding of this entire tutorial.
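For example, here is a minimal sketch of the non-scalar case (the tensor v and its values are illustrative, not from the article). Because the output has more than one element, backward() needs a gradient tensor of the same shape:

import torch

v = torch.tensor([2.0, 3.0], requires_grad=True)
out = v * v                       # two elements, so not a scalar

# pass a gradient of the same shape as out
out.backward(gradient=torch.ones_like(out))
print(v.grad)                     # tensor([4., 6.]) since d(v*v)/dv = 2*v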

How to create a tensor with autograd?

To create a tensor with autograd enabled, pass requires_grad=True as an argument, like below.

import torch

x = torch.ones(5, 5, requires_grad=True)
print(x)

Here I am creating a 5×5 tensor filled with ones and passing requires_grad as True.

After the creation, let's do an addition operation on tensor x.

#add operation
y = x + 7
print(y)

Output: addition operation on the tensor

When you look at the output, there is a grad_fn attribute. It records the operation that produced the tensor, in this case grad_fn=<AddBackward0>. Let's do a multiplication on y.

# multiplication operation
z = y*5
print(z)

Now the output records the operation as grad_fn=<MulBackward0>. Next, I find the mean of z using the z.mean() method.

#mean of z
mean = z.mean()
print(mean)

So far I have done three operations: first, added 7 to x, then multiplied the result (y) by 5, and finally taken the mean of z. Now let's do backward propagation and find the gradient. Use the following code.

#backpropagation
mean.backward()
# gradient
print(x.grad)
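For this example the gradient is easy to verify by hand: mean is (1/25) times the sum of 5 * (x_ij + 7), so the derivative with respect to each of the 25 entries of x is 5/25 = 0.2, and x.grad comes out as a 5×5 tensor filled with 0.2000.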

How to enable autograd on existing tensors?

Sometimes you want to track operations on existing tensors that do not have autograd enabled. You can enable it in place with the requires_grad_(True) method, which makes the tensor start tracking its operation history.

a = torch.ones(3,3)
print(a)
#operation
a = a+5
print(a)
print(a.requires_grad)

# enable autograd
a.requires_grad_(True)

# operation on a
b = (a*a).sum()
print(b)
print(b.grad_fn)

Output: enabling autograd on the existing tensor

Here you can see that before enabling requires_grad_, tensor a is not tracked and has no grad_fn. But after I set requires_grad_(True), the result b has a grad_fn that tracks the operations performed on this variable.
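As a quick check (this extra step is my own addition, not part of the original output), you can call backward() on b and inspect the gradient of a:

# continue the example above
b.backward()
print(a.grad)   # every entry is 12.0 because b = sum(a*a) and d(b)/da = 2*a = 2*6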

How to stop a tensor from tracking history?

You can also stop a tensor from tracking history using the torch.no_grad() context manager.

# stop tensor from tracking history with torch.no_grad()
print(x.requires_grad)
y = x * 2
print(y.requires_grad)

with torch.no_grad():
    y = x * 2
    print(y.requires_grad)

You can see that when I use the with torch.no_grad() statement, any tensor created inside its scope will not track the operation history.
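As a quick confirmation (my own addition, assuming the same x as above), a tensor produced inside torch.no_grad() has no graph attached, so there is nothing to backpropagate through:

with torch.no_grad():
    y = x * 2

print(y.grad_fn)       # None, no operation history was recorded
# y.sum().backward()   # would raise a RuntimeError because y does not require grad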

Conclusion

Tensors allow you to build a dynamic computational graph. You can enable the autograd feature on a tensor so that it tracks all the operations performed on it, and when you propagate backward you can easily find the gradient values.

I hope this article has cleared your queries on Autograd Usage in PyTorch. If you have any questions, then message us on our Facebook Official Page.

 

