AutoGrad: Automatic Differentiation in PyTorch - DLMC09

About this event

In this workshop, we will explore the autograd package, which provides automatic differentiation for all operations on Tensors. It is a define-by-run framework, which means that your backpropagation is defined by how your code runs, so every iteration can be different.
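As a small taste of the topic, here is a minimal sketch (not workshop material, just an illustration) of how autograd tracks operations and computes gradients, including the define-by-run behavior where the graph can change on each iteration:

```python
import torch

# A scalar tensor; requires_grad=True tells autograd to record operations on it.
x = torch.tensor(2.0, requires_grad=True)

# Build a computation graph on the fly: y = x^2 + 3x.
y = x ** 2 + 3 * x

# Backpropagate: autograd computes dy/dx = 2x + 3, which is 7 at x = 2.
y.backward()
print(x.grad)  # tensor(7.)

# Define-by-run: the graph is rebuilt each iteration, so control flow
# in ordinary Python (here, a data-dependent branch) is allowed.
for step in range(3):
    z = torch.tensor(float(step), requires_grad=True)
    out = z * 2 if step % 2 == 0 else z ** 3
    out.backward()
    print(step, z.grad)
```

Note that `x.grad` accumulates across calls to `backward()`, so in training loops gradients are typically cleared between iterations (e.g. with an optimizer's `zero_grad()`).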
