Thank you for taking our course. Completing the following tasks will prepare you for the exercise sessions in the coming week. The sessions use workstations running Ubuntu Linux, so we highly recommend working on a Linux system instead of Windows.
If you are unfamiliar with GitHub or our exercise setup, please follow Tasks 1 to 5 here.
This exercise studies the implementation of an algorithmic differentiation engine via operator overloading.
Python supports overloading plus (`+`) and times (`*`) via the magic methods `__add__` and `__mul__`. Both are vital for this project.
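To make the mechanism concrete, here is a small, self-contained illustration of how these magic methods hook into Python's infix operators. The `Vec2` class is purely hypothetical and not part of the exercise code:

```python
class Vec2:
    """Toy 2-D vector demonstrating operator overloading via magic methods."""

    def __init__(self, x: float, y: float):
        self.x = x
        self.y = y

    def __add__(self, other: "Vec2") -> "Vec2":
        # Invoked for `self + other`.
        return Vec2(self.x + other.x, self.y + other.y)

    def __mul__(self, scalar: float) -> "Vec2":
        # Invoked for `self * scalar`.
        return Vec2(self.x * scalar, self.y * scalar)

    def __repr__(self) -> str:
        return f"Vec2({self.x}, {self.y})"


print(Vec2(1.0, 2.0) + Vec2(3.0, 4.0))  # Vec2(4.0, 6.0)
print(Vec2(1.0, 2.0) * 2.0)             # Vec2(2.0, 4.0)
```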
Navigate to the `src` folder and open `src/autograd.py`. The `TODO`s mark parts of the code that require your attention.
Run `nox -s test` to check your code after implementing the class `ADiffFloat`. If all checks pass, move on to the `src/fit_neuron.py` module.
When overloading `__add__`, please consider

$$ c = a + b, $$

with the corresponding derivative

$$ c' = a' + b'. $$
When overloading `__mul__`, please consider the product rule

$$ c = a \cdot b, \qquad c' = a' \, b + a \, b'. $$
Finally, for element-wise functions $f$, the chain rule applies:

$$ c = f(a), \qquad c' = f'(a) \, a'. $$
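The three rules above translate almost directly into code. Below is a minimal sketch of a dual-number class, assuming a forward-mode design where every value carries its derivative; the names `Dual`, `value`, and `deriv` are illustrative and need not match the `ADiffFloat` interface in `src/autograd.py`:

```python
import math


class Dual:
    """Minimal forward-mode AD value: a float paired with its derivative."""

    def __init__(self, value: float, deriv: float = 0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other: "Dual") -> "Dual":
        # c = a + b,  c' = a' + b'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other: "Dual") -> "Dual":
        # c = a * b,  c' = a' * b + a * b'  (product rule)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)


def dual_sigmoid(a: Dual) -> Dual:
    # Element-wise function: c = f(a),  c' = f'(a) * a'
    s = 1.0 / (1.0 + math.exp(-a.value))
    return Dual(s, s * (1.0 - s) * a.deriv)


# Seed the input derivative with 1 to differentiate with respect to x.
x = Dual(2.0, 1.0)
y = x * x
print(y.value, y.deriv)  # 4.0, 4.0  (i.e. x**2 and 2x at x = 2)
print(dual_sigmoid(x).deriv)  # sigmoid'(2) ~ 0.105
```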
Now we want to use the autograd engine from the previous exercise to solve a simple optimisation problem using gradient descent. Move on to `src/fit_neuron.py` and resolve all `TODO`s.
To do so, recall that the multivariate chain rule requires us to sum up the contributions from each path. More formally, for an input $x$ that reaches a function $f$ through intermediate values $u_1, \dots, u_n$,

$$ \frac{\partial f}{\partial x} = \sum_{i=1}^{n} \frac{\partial f}{\partial u_i} \frac{\partial u_i}{\partial x}. $$
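As a quick illustration (a worked example, not part of the exercise code): for $f(x) = x \cdot x$, the input $x$ reaches the output along two paths, once per factor, and the path contributions add up to the familiar result

$$ \frac{\partial f}{\partial x} = \frac{\partial f}{\partial u_1}\frac{\partial u_1}{\partial x} + \frac{\partial f}{\partial u_2}\frac{\partial u_2}{\partial x} = x \cdot 1 + x \cdot 1 = 2x, $$

where $u_1 = u_2 = x$ denote the two factor slots of the product. This matches what the `__mul__` rule from above yields when both operands carry the same seed derivative.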
You can now execute the script with `python src/fit_neuron.py`. If everything is implemented correctly, the training process should result in a test accuracy of 1.0 after 10 epochs.
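For orientation, a gradient-descent step has roughly the following shape. This is a generic sketch reusing the hypothetical `Dual` class from above, not the actual structure of `src/fit_neuron.py`:

```python
# Minimal gradient-descent loop minimising (w * x - t)**2 for a single
# weight, reusing the hypothetical Dual class sketched above.
x_in, target = 3.0, 6.0   # one training sample: we want w * 3 = 6, i.e. w = 2
w = 0.5                   # initial weight
lr = 0.05                 # learning rate

for step in range(50):
    # Seed the derivative of w with 1 to read d(loss)/dw off the forward pass.
    w_dual = Dual(w, 1.0)
    residual = w_dual * Dual(x_in) + Dual(-target)
    loss = residual * residual
    # Gradient-descent update: move against the gradient.
    w -= lr * loss.deriv

print(w)  # approaches 2.0
```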
- Andreas Griewank and Andrea Walther, *Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation*, SIAM.
- Autograd via operator overloading:
- Autograd via source transformation:
- Wikipedia's article on Automatic Differentiation