This project will analyze the representations learned by networks with different levels and types of sparsity across a range of continual learning (CL) tasks. The goal of the project is to identify which types of representations are useful for remembering and transferring knowledge across tasks. We want to compare gating at different levels of sparsity, as well as against other standard CL techniques such as EWC and SI.
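To make the sparsity-gating idea concrete, here is a minimal sketch of a k-winners-take-all activation, one common way to enforce a fixed level of representational sparsity. This is an illustrative example, not this repository's implementation; the function name and interface are assumptions.

```python
import numpy as np

def kwta(x, k):
    """k-winners-take-all: keep the k largest activations, zero the rest.

    Illustrative sketch only -- the repo's actual gating mechanism may differ.
    """
    out = np.zeros_like(x)
    idx = np.argsort(x)[-k:]   # indices of the k largest activations
    out[idx] = x[idx]
    return out

x = np.array([0.2, 1.5, -0.3, 0.9, 0.1])
print(kwta(x, 2))   # only 1.5 and 0.9 survive; the rest are gated to zero
```

Varying `k` directly controls the sparsity level being compared across tasks.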
In addition, we will compare these techniques in the supervised learning (SL) and reinforcement learning (RL) regimes to see whether the learning rule impacts the representation.
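As a reference point for the regularization-based baselines mentioned above, the EWC loss adds a quadratic penalty that anchors parameters important to previous tasks: L = L_task + (λ/2) Σᵢ Fᵢ (θᵢ − θᵢ*)². The sketch below shows that penalty in NumPy; it is a generic illustration of EWC, not code from this repository, and the variable names are assumptions.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    theta      -- current parameters (flat array)
    theta_star -- parameters saved after the previous task
    fisher     -- diagonal Fisher information estimate (importance weights)
    lam        -- regularization strength (hyperparameter)
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

theta_star = np.array([1.0, -2.0, 0.5])
fisher = np.array([10.0, 0.1, 1.0])

# Zero penalty at the old optimum; drifting costs more where Fisher is large.
print(ewc_penalty(theta_star, theta_star, fisher))
print(ewc_penalty(theta_star + 0.1, theta_star, fisher))
```

During training on a new task, this penalty is added to the task loss so that parameters the Fisher estimate marks as important stay close to their old values.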
## Installation

Clone this repository to your local machine and install the required packages:

```bash
git clone <repository-url>
cd <repository-name>
pip install -r requirements.txt
pip install -e .
```
## Running
Experiments can be run from the experiment folder.