TinyAutoGrad

TinyAutoGrad is an automatic differentiation engine and neural network library inspired by Micrograd, with optional CUDA support.


πŸš€ Features

  • βš™οΈ Automatic differentiation (forward & backward)
  • 🧱 Simple Tensor class with NumPy-style operations
  • 🧠 Neural network support with MLP example
  • πŸ’» CPU and ⚑ CUDA backend support
  • πŸ§ͺ Built-in testing with pytest
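To give a feel for what reverse-mode automatic differentiation does, here is a Micrograd-style scalar sketch. This is an illustration only, not TinyAutoGrad's actual API; the `Value` class and its methods are invented for the example:

```python
# Minimal scalar reverse-mode autodiff, in the spirit of Micrograd.
class Value:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x      # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)      # 7.0
```

TinyAutoGrad applies the same idea at the tensor level, with the CPU or CUDA backend carrying out the actual array operations.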

πŸ› οΈ Getting Started

CPU (Default)

Run the MNIST example on CPU:

```bash
python -m samples.mnist.mnist
```

CUDA (Experimental)

Ensure the CUDA toolkit (`nvcc`) is installed:

```bash
# Compile CUDA backend
nvcc -shared -o libops.so tinyautograd/ops.cu -Xcompiler -fPIC -lcublas

# Run test.py
python test.py

# Run MNIST with CUDA backend
python -m samples.mnist.mnist_cuda
```

⚠️ Make sure libops.so is in the current directory or on the Python load path.
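If the CUDA backend fails to load, you can check where Python will find the shared library. A small diagnostic sketch (the `ctypes` usage is standard; the `find_libops` helper is invented for this example and makes no assumptions about the symbols exported by libops.so):

```python
import ctypes
import os
import sys

def find_libops(name="libops.so"):
    """Search the current directory and sys.path for the compiled backend."""
    for d in [os.getcwd()] + sys.path:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate):
            return candidate
    return None

path = find_libops()
if path is None:
    print("libops.so not found; CUDA backend unavailable")
else:
    lib = ctypes.CDLL(path)  # load the compiled CUDA kernels
    print("loaded CUDA backend from", path)
```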


πŸ§ͺ Running Tests

Run all unit tests:

```bash
pytest tinyautograd/ -s -v --cache-clear
```

Or a specific test:

```bash
pytest tinyautograd/test_rawtensor.py -s -v --cache-clear
```
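A common pattern in autodiff test suites is to compare analytic gradients against a finite-difference approximation. A generic, self-contained sketch of that idea (not taken from the repo's test files):

```python
def numerical_grad(f, x, eps=1e-6):
    """Central-difference approximation of df/dx."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def test_square_gradient():
    f = lambda x: x * x          # analytic gradient: 2x
    x = 3.0
    analytic = 2 * x
    numeric = numerical_grad(f, x)
    assert abs(analytic - numeric) < 1e-4

test_square_gradient()
print("gradient check passed")
```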

πŸ“‚ Project Structure

```
.
├── README.md
├── libops.so
├── test.py
├── samples
│   └── mnist
│       ├── mnist.py
│       ├── mnist_cuda.py
│       ├── train-images-idx3-ubyte.gz
│       └── train-labels-idx1-ubyte.gz
├── tinyautograd
│   ├── __init__.py
│   ├── functional.py
│   ├── nn.py
│   ├── ops.cu
│   ├── optim.py
│   ├── rawtensor.py
│   ├── tensor.py
│   ├── test_nn.py
│   ├── test_rawtensor.py
│   └── test_tensor.py
```

πŸ“„ License

MIT
