- The underlying principles of neural-network backpropagation, based on Andrej's teachings.
- An “under the hood” knowledge of deep learning: layer details, loss functions, optimization, etc.
- Implement basic operations: addition, subtraction, multiplication, and division
- Compute gradients
- Apply the chain rule in backpropagation
- Build Neurons, Layers, and a Multi-Layer Perceptron (MLP)
- Create activation functions
- Implement the forward and backward passes
- Collect parameters for gradient updates
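The steps above can be sketched end to end in a single micrograd-style scalar autograd engine. This is a minimal illustration, not the repository's actual code; all class and method names (`Value`, `Neuron`, `Layer`, `MLP`, `parameters`) are my own, chosen to mirror the list:

```python
import math
import random

class Value:
    """A scalar that records the operations producing it, so gradients
    can flow backward through the computation graph (micrograd-style sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def __pow__(self, k):  # k is a plain number
        out = Value(self.data ** k, (self,))
        def _backward():
            self.grad += k * self.data ** (k - 1) * out.grad
        out._backward = _backward
        return out

    def __neg__(self): return self * -1
    def __sub__(self, other): return self + (-other)       # subtraction via add
    def __truediv__(self, other): return self * other ** -1  # division via pow

    def tanh(self):  # activation function
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1.0 - t * t) * out.grad  # d tanh/dx = 1 - tanh^2
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output node back to every input.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

class Neuron:
    def __init__(self, nin):
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(0.0)
    def __call__(self, x):  # w·x + b, squashed by tanh
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.tanh()
    def parameters(self):
        return self.w + [self.b]

class Layer:
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]
    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs
    def parameters(self):
        return [p for n in self.neurons for p in n.parameters()]

class MLP:
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]
    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x
    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]
```

After a forward pass, calling `.backward()` on the loss fills every parameter's `.grad`, and `p.data -= lr * p.grad` over `net.parameters()` performs one gradient-descent update.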
💡 Opening the notebooks in Colab makes the directory structure easier to see.