cattermelon1234/minigrad


grad.py

Implementation of backpropagation through a computational graph, supporting addition, multiplication, division, subtraction, exponentiation, and various activation functions.
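The core idea can be sketched as a scalar `Value` node that records its inputs and a local backward rule, then applies the chain rule in reverse topological order. This is a minimal illustration of the technique, not the repo's actual API; the class and method names here are assumptions.

```python
import math

class Value:
    """Scalar node in a computational graph (illustrative sketch,
    not the names used in grad.py)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad           # d(a+b)/da = 1
            other.grad += out.grad          # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d tanh(x)/dx = 1 - tanh^2(x)
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run each node's local
        # backward rule in reverse order (the chain rule).
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(-3.0)
c = a * b + a      # c = a*b + a = -4.0
c.backward()
print(a.grad)      # dc/da = b + 1 = -2.0
print(b.grad)      # dc/db = a = 2.0
```

Note that gradients accumulate with `+=`, which handles nodes (like `a` above) that feed into the graph more than once.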

nn.py

Implementation of a simple neural network node (neuron) and a multi-layer perceptron with customizable layer sizes.
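The structure is roughly this: a neuron computes a weighted sum plus bias and applies an activation, and an MLP chains layers of neurons. The sketch below shows only the forward pass with illustrative names; the real nn.py builds on grad.py's values so gradients flow through.

```python
import math
import random

random.seed(0)

class Neuron:
    # One unit: tanh(w . x + b), with weights initialized uniformly.
    def __init__(self, n_inputs):
        self.w = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.b = random.uniform(-1, 1)

    def __call__(self, x):
        act = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return math.tanh(act)

class MLP:
    # Fully connected layers; `sizes` lists the neuron count per layer,
    # e.g. MLP(3, [4, 4, 1]) maps 3 inputs to 1 output.
    def __init__(self, n_inputs, sizes):
        dims = [n_inputs] + sizes
        self.layers = [[Neuron(dims[i]) for _ in range(dims[i + 1])]
                       for i in range(len(sizes))]

    def __call__(self, x):
        for layer in self.layers:
            x = [n(x) for n in layer]
        return x[0] if len(x) == 1 else x

net = MLP(3, [4, 4, 1])
y = net([1.0, -0.5, 2.0])   # single scalar output in (-1, 1)
```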

demo.ipynb

Demo of my neural network on the make_moons dataset. The model takes 3 inputs and outputs a single value (-1 or 1) for binary classification. It is trained by gradient descent using the grad.py implementation, with a learning rate that varies over the course of stochastic gradient descent. The model reaches 100% accuracy on the training set after 100 steps, and also achieves 100% accuracy on held-out test data it has never seen.
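The training loop described above follows a common pattern: at each step, compute the loss and gradients, then update each parameter against its gradient with a learning rate that decays as training progresses. The sketch below shows that pattern on a toy problem; the schedule and hyperparameters are illustrative assumptions, not values taken from demo.ipynb.

```python
def train(params, loss_and_grads, steps=100):
    """Gradient descent with a linearly decaying learning rate
    (illustrative sketch of the loop shape, not demo.ipynb itself)."""
    for k in range(steps):
        loss, grads = loss_and_grads(params)
        lr = 1.0 - 0.9 * k / steps       # decay from 1.0 toward 0.1
        for i, g in enumerate(grads):
            params[i] -= lr * g          # step against the gradient
    return params

# Toy usage: minimize f(p) = (p - 3)^2, whose gradient is 2*(p - 3).
params = [0.0]
train(params, lambda p: ((p[0] - 3) ** 2, [2 * (p[0] - 3)]))
print(round(params[0], 3))   # → 3.0
```

Decaying the learning rate lets early steps move quickly while later steps settle into the minimum instead of oscillating around it.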

About

Implementation of backpropagation and a simple neural network.
