Dive Into Deep Learning, Lecture 2: PyTorch Automatic Differentiation (requires_grad and backward)
In this video, we discuss PyTorch's automatic differentiation engine (autograd), which powers neural network training via stochastic gradient descent. You will get a conceptual understanding of how autograd computes the gradients of multivariable functions. We start with derivatives, partial derivatives, and the definition of the gradient, then show how to compute gradients using requires_grad=True and the backward() method. Along the way, we cover the classes and functions that implement automatic differentiation of arbitrary scalar-valued and non-scalar-valued functions, and we discuss the Jacobian matrix in PyTorch. Differentiation is a crucial step in nearly all machine learning and deep learning optimization algorithms: while the calculus behind these derivatives is straightforward, working out the updates by hand is a painful and tedious task.
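To make the workflow concrete, here is a minimal sketch of computing the gradient of a scalar-valued function with requires_grad=True and backward(). The specific function y = 2·x·x is an illustrative assumption, not necessarily the one used in the lecture:

```python
import torch

# Track operations on x so autograd can build a computation graph.
x = torch.arange(4.0, requires_grad=True)   # x = [0., 1., 2., 3.]
y = 2 * torch.dot(x, x)                     # scalar-valued function of x
y.backward()                                # backpropagate: fills x.grad with dy/dx
print(x.grad)                               # tensor([ 0.,  4.,  8., 12.]), i.e. 4*x
```

For a non-scalar-valued output, backward() needs a gradient vector v (it computes the vector-Jacobian product v^T J), and the full Jacobian can be obtained with torch.autograd.functional.jacobian. Again a sketch, with an assumed elementwise function y = x*x:

```python
import torch
from torch.autograd.functional import jacobian

x = torch.arange(4.0, requires_grad=True)
y = x * x                                   # non-scalar (elementwise) output
y.backward(gradient=torch.ones_like(y))     # v = ones: same as y.sum().backward()
print(x.grad)                               # tensor([0., 2., 4., 6.])

# The full Jacobian J[i, j] = dy_i / dx_j, computed in one call:
J = jacobian(lambda t: t * t, torch.arange(4.0))
print(J)                                    # 4x4 diagonal matrix with 2*x on the diagonal
```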
#Autograd #PyTorch #DeepLearning