
An implementation of Andrej Karpathy's micrograd

Reimplementing this as a warm-up to implementing it in Rust! Two blog posts accompany this implementation:

DeepWiki's take

This repo was indexed by DeepWiki, which generated documentation for it; take a look at the API reference it produced for this tiny codebase :).

Use

To experiment with a very simple binary classifier model trained on randomly generated inputs and targets:

git clone git@github.com:msyvr/micrograd-python.git
cd micrograd-python
python train.py

Parameters such as the step size, number of epochs, and activation function can be updated in train.py to see each one's effect on how well the model converges to the targets.
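As a rough intuition for why step size and epoch count matter, here is a minimal sketch (not code from this repo; the names step_size and epochs are illustrative, so check train.py for the actual parameter names) of plain gradient descent on a one-dimensional quadratic:

```python
def train(step_size, epochs):
    """Minimize f(w) = (w - 4)^2 with plain gradient descent."""
    w = 0.0
    for _ in range(epochs):
        grad = 2 * (w - 4)  # analytic df/dw
        w -= step_size * grad
    return w

# A moderate step size converges toward the minimum at w = 4;
# too large a step (e.g. step_size=1.0 here) oscillates and never settles.
print(train(step_size=0.1, epochs=100))
```

The same trade-off shows up in the full model: too small a step size needs many epochs, while too large a one can prevent convergence entirely.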

Evaluating the model

Model performance can be evaluated as a function of:

  1. number of layers
  2. nodes per layer
  3. loss function

Separately, given a configuration of the above parameters, training efficiency can be evaluated as a function of:

  1. step size
  2. (maximum) training epochs
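All of the evaluations above rest on the same core mechanism: a scalar value that records the operations applied to it and backpropagates gradients through them. The following is a minimal sketch of such a micrograd-style autograd value, written for illustration rather than taken from this repo (class and method names are assumptions, not this codebase's API):

```python
import math

class Value:
    """A scalar carrying its data, gradient, and a backward rule."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# z = x*y + y, so dz/dx = y = 3 and dz/dy = x + 1 = 3
x, y = Value(2.0), Value(3.0)
z = x * y + y
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

A network built from such values can then be resized (layers, nodes) or given a different loss, and the same backward() call produces the gradients needed for each configuration being evaluated.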

TODO
