This project implements time series forecasting with a vanilla Transformer, using Hugging Face's `TimeSeriesTransformer`.
What we provide:
- Inference helper classes to help manage the model in a production setting.
- Monitoring of the model's forecasting loss to detect shifts in the underlying data distribution.
- An attention classifier that maps a detected distribution shift to a known data profile using the Transformer's latent space.
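The loss-monitoring idea can be sketched generically as follows. This is a simplified illustration only (the class name and thresholding rule here are placeholders, not the repo's `monitor.py`, which uses ensemble forecast uncertainty weighting): keep a baseline window of forecast losses and flag a shift when the mean of recent losses drifts above a threshold.

```python
from collections import deque


class LossShiftDetector:
    """Illustrative shift detector: alarm when recent mean forecast loss
    exceeds `ratio` times the baseline mean loss. Not the repo's method."""

    def __init__(self, window=50, ratio=1.5):
        self.baseline = deque(maxlen=window)  # losses observed before monitoring
        self.recent = deque(maxlen=window)    # most recent losses at inference time
        self.ratio = ratio                    # alarm threshold multiplier

    def warmup(self, loss):
        # Record losses on data known to match the training distribution.
        self.baseline.append(loss)

    def update(self, loss):
        # Record a new inference-time loss; return True if a shift is flagged.
        self.recent.append(loss)
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough evidence yet
        base = sum(self.baseline) / len(self.baseline)
        curr = sum(self.recent) / len(self.recent)
        return curr > self.ratio * base


# Baseline losses around 1.0, then inference losses around 2.0.
detector = LossShiftDetector(window=5, ratio=1.5)
for loss in [1.0, 1.1, 0.9, 1.0, 1.0]:
    detector.warmup(loss)
shifted = [detector.update(loss) for loss in [2.0, 2.1, 1.9, 2.0, 2.0]]
# shifted -> [False, False, False, False, True]
```

The windowed comparison keeps the detector cheap enough to run on every prediction; the repo's uncertainty-weighted approach refines this by discounting losses on inputs the model is already unsure about.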
The model expects each dataset entry to be structured as follows:
- `target`: The values to be predicted.
- `start`: The start time of the series.
- `feat_static_cat`: Static categorical features associated with the time series (list with dimension equal to the number of features).
- `feat_static_real`: Static real-valued features associated with the time series (list with dimension equal to the number of features).
- `feat_dynamic_real`: Dynamic real-valued features that change over time (currently not supported); an array with shape (number of features, target length).
- `item_id`: Identifier for each time series item.
Ensure your dataset aligns with this structure.
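A single entry matching this structure might look like the following. The values are illustrative only (a synthetic sine-wave series and placeholder feature values):

```python
import math

# One illustrative dataset entry following the structure above.
entry = {
    "target": [math.sin(t / 10) for t in range(100)],  # values to predict
    "start": "2021-01-01 00:00:00",                    # start time of the series
    "feat_static_cat": [0],      # one static categorical feature
    "feat_static_real": [42.0],  # one static real-valued feature
    # "feat_dynamic_real" would have shape (num_features, target_length),
    # but is currently not supported by TimeSeriesTransformerForPrediction.
    "item_id": "series_0",       # identifier for this time series
}
```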
To set up the project environment, follow these steps:

1. Clone the repository:

   ```shell
   git clone https://github.com/zxnga/Transformer-TSF.git
   cd Transformer-TSF
   ```

2. Create a virtual environment (optional but recommended):

   ```shell
   python3 -m venv env
   source env/bin/activate  # On Windows, use 'env\Scripts\activate'
   ```

3. Install the required dependencies:

   ```shell
   pip install -r requirements.txt
   ```
To train and evaluate the TimeSeriesTransformer model:

1. Prepare your dataset: ensure it follows the dataset structure described above.

2. Define a configuration for the Transformer: update any hyperparameters as needed; see https://huggingface.co/docs/transformers/v4.49.0/en/model_doc/time_series_transformer#transformers.TimeSeriesTransformerConfig.

3. Follow the examples in `examples/`:
   - `train_test.ipynb`: train and evaluate the model.
   - `loss_monitoring.ipynb`: monitor the forecast loss during inference to detect distribution shift.
   - `classifier.ipynb`: train a classifier on the Transformer's latent space.

4. Use the functions in `inference.py` as an inference helper class in production settings.
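A configuration for step 2 can be sketched as a dictionary of keyword arguments. The hyperparameter values below are placeholders to tune for your dataset; the keyword names follow the `TimeSeriesTransformerConfig` documentation linked above:

```python
# Illustrative hyperparameters for transformers.TimeSeriesTransformerConfig.
# The values are placeholders -- tune them for your own dataset.
config_kwargs = {
    "prediction_length": 24,           # forecast horizon
    "context_length": 48,              # conditioning window seen by the encoder
    "lags_sequence": [1, 2, 3, 7],     # lagged target values added as inputs
    "num_static_categorical_features": 1,
    "cardinality": [10],               # category counts for feat_static_cat
    "embedding_dimension": [4],        # embedding size per categorical feature
    "num_static_real_features": 1,
    "encoder_layers": 2,
    "decoder_layers": 2,
    "d_model": 32,                     # Transformer hidden dimension
}

# With transformers installed, the model would then be built as:
# from transformers import (TimeSeriesTransformerConfig,
#                           TimeSeriesTransformerForPrediction)
# config = TimeSeriesTransformerConfig(**config_kwargs)
# model = TimeSeriesTransformerForPrediction(config)
```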
Note: `TimeSeriesTransformerForPrediction` does not support `feat_dynamic_real`.
```
Transformer-TSF
├── examples
│   ├── classifier.ipynb
│   ├── loss_monitoring.ipynb
│   ├── train_test.ipynb
├── src (core code)
│   ├── inference (to simplify running the model in production)
│   │   ├── data
│   │   │   ├── buffer.py (buffers to store context, true values and predictions)
│   │   │   ├── helper.py (helper to manage all the data needed by the model)
│   │   ├── monitor.py (loss monitoring via ensemble forecast uncertainty weighting)
│   │   ├── wrapper.py (model wrapper for inference)
│   ├── networks
│   │   ├── classifier.py (classifier trained on the Transformer's latent space)
│   │   ├── projection.py (networks to project the encoder's 2D latent space to 1D)
│   ├── plotting.py
│   ├── ts_transformer.py (functions to initialize, train and test the model)
│   ├── utils.py
├── README.md
├── requirements.txt
```