MNIST Neural Network

This project implements a simple neural network from scratch to classify handwritten digits from the MNIST dataset. It does not rely on deep learning frameworks like TensorFlow or PyTorch, using only basic Python libraries.

Project Overview

The goal of this project is to build and train a neural network to recognize digits in the MNIST dataset. MNIST consists of 60,000 training images and 10,000 testing images of handwritten digits (0-9), each sized 28x28 pixels.
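As a rough sketch of the preprocessing this implies, each 28x28 image is typically flattened into a 784-dimensional vector, scaled to [0, 1], and the labels one-hot encoded for the 10 classes. The `images` and `labels` arrays below are hypothetical stand-ins for the loaded MNIST data, not the notebook's actual loading code:

```python
import numpy as np

# Hypothetical stand-in for the loaded MNIST arrays: in the real dataset,
# `images` would have shape (60000, 28, 28) with uint8 pixel values and
# `labels` would hold the digit classes 0-9.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(100, 28, 28), dtype=np.uint8)
labels = rng.integers(0, 10, size=100)

# Flatten each 28x28 image into a 784-dimensional vector and scale to [0, 1].
X = images.reshape(len(images), 784).astype(np.float32) / 255.0

# One-hot encode the labels for the 10 output classes.
Y = np.eye(10, dtype=np.float32)[labels]
```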

Key Features

  • Neural Network Built From Scratch: Implemented using Python libraries without deep learning frameworks.
  • Training with Backpropagation: Utilizes backpropagation and stochastic gradient descent (SGD) for weight updates.
  • MNIST Dataset: Trained and tested on the standard MNIST dataset, widely used for benchmarking machine learning algorithms.

Installation

To get started, clone this repository:

git clone https://github.com/ShrinivasanT/NN-from-Scratch-using-MNIST.git
cd NN-from-Scratch-using-MNIST

Usage

To train and test the neural network, open and run the NeuralNetwork.ipynb notebook:

jupyter notebook NeuralNetwork.ipynb

The notebook loads the MNIST dataset, trains the neural network, reports accuracy on the test set after training, and makes predictions on a few examples from the test set.

Architecture

The neural network architecture consists of:

  • Input Layer: 784 input neurons (one for each pixel of the 28x28 images).
  • Hidden Layer(s): Customizable number of hidden layers with ReLU activation functions.
  • Output Layer: 10 output neurons, one for each possible digit (0-9), with a softmax activation function.
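A minimal forward pass through this architecture can be sketched as follows. The layer sizes and weight initialization here are illustrative assumptions (one hidden layer of 128 units), not the notebook's actual configuration:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
# Assumed sizes: 784 inputs, one hidden layer of 128 units, 10 outputs.
W1 = rng.normal(0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0, 0.01, (128, 10)); b2 = np.zeros(10)

def forward(X):
    h = relu(X @ W1 + b1)        # hidden layer with ReLU activation
    return softmax(h @ W2 + b2)  # output probabilities over the 10 digits

X = rng.random((5, 784))  # a batch of 5 flattened images
probs = forward(X)
```

Each row of `probs` sums to 1, giving a probability distribution over the digits 0-9.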

The network is trained using stochastic gradient descent (SGD) with backpropagation to minimize cross-entropy loss.
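To illustrate one such SGD step in isolation, the sketch below uses a single softmax layer: with cross-entropy loss, the gradient of the loss with respect to the logits is simply the predicted probabilities minus the one-hot targets. The batch size, learning rate, and single-layer setup are assumptions for the example, not the notebook's actual hyperparameters:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
W = rng.normal(0, 0.01, (784, 10)); b = np.zeros(10)
X = rng.random((32, 784))                # one hypothetical mini-batch
Y = np.eye(10)[rng.integers(0, 10, 32)]  # one-hot targets

lr = 0.1
probs = softmax(X @ W + b)
dZ = (probs - Y) / len(X)  # loss gradient at the logits, averaged over the batch
W -= lr * X.T @ dZ         # SGD weight update
b -= lr * dZ.sum(axis=0)   # SGD bias update
```

Repeating this update over many mini-batches drives down the cross-entropy loss; in the full network, backpropagation carries the same `dZ` term backward through the hidden layers.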

Results

After training, the neural network reaches a classification accuracy of up to 82.7% on the MNIST training set, showing that even a simple network built from scratch can learn the task.
