autograd-engine

Based on Andrej Karpathy's Micrograd tutorial

Micrograd is an autograd engine that performs backpropagation, similar to PyTorch. Given a function with multiple inputs, it computes the gradient (derivative) of the output with respect to each input.
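
To make this concrete, here is a minimal sketch of a micrograd-style scalar `Value` node; this is a simplified illustration in the spirit of Karpathy's tutorial, not necessarily the exact code in this repository. Each `Value` remembers the operation that produced it, so `backward()` can apply the chain rule through the whole graph:

```python
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # propagates this node's grad to its children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, and vice versa
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# f(a, b) = a * b + a  =>  df/da = b + 1,  df/db = a
a, b = Value(2.0), Value(3.0)
f = a * b + a
f.backward()
print(a.grad, b.grad)  # 4.0 2.0
```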

This is the heart of a neural network: if we can compute the derivative of the output with respect to each input, we can treat the network's parameters (weights and biases) as those inputs and nudge them toward a minimum of the function. When we do this with respect to the loss function, we train the neural network toward the desired output.
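
Here is what that training loop looks like with the `Value` class sketched above; the weight, input, target, and learning rate are illustrative assumptions, not values from the repository:

```python
# One-parameter gradient descent on a toy squared-error loss.
w = Value(0.5)                   # the parameter we want to tune
x, target = Value(3.0), 6.0      # fixed input and desired output

for step in range(20):
    w.grad = 0.0                 # reset the accumulated gradient
    pred = w * x                 # forward pass: prediction
    err = pred + (-target)       # error = prediction - target
    loss = err * err             # squared-error loss
    loss.backward()              # backprop: fills w.grad via the chain rule
    w.data -= 0.01 * w.grad      # step against the gradient

print(w.data)                    # approaches 2.0, since 2.0 * 3.0 = 6.0
```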
