A from-scratch Multilayer Perceptron (MLP) implementation in pure Python, with forward propagation, backpropagation, and a selection of activation functions.
Features:
- Forward propagation
- Backpropagation for supervised learning
- Multiple activation functions (Sigmoid, ReLU, Softmax)
- Optimization with SGD (with momentum)
- Loss functions (MSE)
- Training and evaluation utilities
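The momentum variant of SGD listed above can be sketched as a single update step. This is a minimal standalone version, not the repository's code; the function name and signature are illustrative:

```python
def sgd_momentum_step(weights, grads, velocity, lr=0.01, momentum=0.9):
    # Classical momentum: keep a decaying running sum of past gradients
    # and move the weights along that accumulated direction.
    for i in range(len(weights)):
        velocity[i] = momentum * velocity[i] - lr * grads[i]
        weights[i] += velocity[i]
    return weights, velocity
```

With `momentum=0`, this reduces to plain SGD; values near 0.9 smooth the trajectory across noisy mini-batch gradients.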
Planned:
- More activation functions (Tanh, Leaky ReLU, etc.)
- Additional optimization algorithms (Adam, RMSProp, etc.)
- Regularization techniques (Dropout, L1/L2 regularization)
- Batch normalization
- Weight initialization strategies
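For reference, the three activations already supported admit short pure-Python definitions. This is an illustrative sketch, not necessarily what `activation.py` contains:

```python
import math

def sigmoid(x):
    # Numerically stable logistic function: avoid exp overflow for large |x|
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def relu(x):
    # Rectified linear unit: identity for positive inputs, zero otherwise
    return x if x > 0.0 else 0.0

def softmax(xs):
    # Shift by the max before exponentiating for numerical stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

Note that softmax operates on a whole vector (its outputs sum to 1), while sigmoid and ReLU are applied element-wise.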
Project layout:

src/
├── __init__.py
├── activation.py
├── layer.py
├── network.py
└── utils.py
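To illustrate how the pieces above fit together (forward propagation, backpropagation through sigmoid units, MSE loss, and a gradient-descent update), here is a self-contained toy training loop on XOR. It is an independent sketch under assumed details (2-3-1 architecture, plain SGD), not the repository's API:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# Toy dataset: XOR
X = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
Y = [0.0, 1.0, 1.0, 0.0]

# 2-3-1 network stored in plain Python lists
n_hidden = 3
w1 = [[random.uniform(-1.0, 1.0) for _ in range(2)] for _ in range(n_hidden)]
b1 = [0.0] * n_hidden
w2 = [random.uniform(-1.0, 1.0) for _ in range(n_hidden)]
b2 = 0.0
lr = 0.5

def mse():
    # Mean squared error over the whole dataset
    total = 0.0
    for x, y in zip(X, Y):
        h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
             for j in range(n_hidden)]
        o = sigmoid(sum(w2[j] * h[j] for j in range(n_hidden)) + b2)
        total += (o - y) ** 2
    return total / len(X)

loss_before = mse()

for epoch in range(5000):
    for x, y in zip(X, Y):
        # Forward pass
        h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
             for j in range(n_hidden)]
        o = sigmoid(sum(w2[j] * h[j] for j in range(n_hidden)) + b2)
        # Backward pass: dL/d(pre-activation), using sigmoid' = s * (1 - s)
        do = (o - y) * o * (1.0 - o)
        dh = [do * w2[j] * h[j] * (1.0 - h[j]) for j in range(n_hidden)]
        # Plain SGD update
        for j in range(n_hidden):
            w2[j] -= lr * do * h[j]
            b1[j] -= lr * dh[j]
            for i in range(2):
                w1[j][i] -= lr * dh[j] * x[i]
        b2 -= lr * do

loss_after = mse()
```

After training, `loss_after` should be lower than `loss_before`; the same loop structure generalizes to deeper networks by chaining the delta computation layer by layer.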