In this workshop, we will explore the autograd package, which provides automatic differentiation for all operations on Tensors. It is a define-by-run framework, meaning that your backpropagation is defined by how your code runs, so every single iteration can be different.
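As a minimal sketch of the idea (assuming PyTorch is installed), setting `requires_grad=True` on a tensor tells autograd to record every operation applied to it; calling `.backward()` then walks that recorded graph to compute gradients. Because the graph is rebuilt on each forward pass, ordinary Python control flow (loops, conditionals) can change it from one iteration to the next.

```python
import torch

# Track operations on x so gradients can be computed later.
x = torch.tensor(2.0, requires_grad=True)

# The graph is defined by running this code: y = x^2 + 3x.
y = x ** 2 + 3 * x

# Walk the recorded graph backwards to compute dy/dx = 2x + 3.
y.backward()

print(x.grad)  # tensor(7.) at x = 2
```

Note that the graph exists only for the operations that actually executed; a different branch or a different number of loop iterations on the next forward pass simply produces a different graph.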