This is the official repository for the paper "Turning Tabular Foundation Models into Graph Foundation Models". It provides code for reproducing our experiments with G2T-FM, GNNs, and LightGBM (including ablations). Code for reproducing our experiments with prior GFMs is coming soon.
Note
Our work is largely based on TabPFN and the paper "On Finetuning Tabular Foundation Models"; please consider checking them out too!
Prerequisites
- Install uv
- Install dependencies
uv sync
- Download TabPFNv2 checkpoints
mkdir -p checkpoints  # create the target directory if it does not already exist
wget "https://huggingface.co/Prior-Labs/TabPFN-v2-reg/resolve/main/tabpfn-v2-regressor.ckpt?download=true" -O checkpoints/tabpfn-v2-regressor.ckpt
wget "https://huggingface.co/Prior-Labs/TabPFN-v2-clf/resolve/main/tabpfn-v2-classifier.ckpt?download=true" -O checkpoints/tabpfn-v2-classifier.ckpt
- For experiments on GraphLand, download the datasets and place them in the data/ directory
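The exact contents of each dataset archive are defined by GraphLand, not by this sketch; as an assumption for illustration, each downloaded dataset is expected to sit in its own subdirectory under data/, for example:

```
data/
└── tolokers-2/
    └── ...  # files from the downloaded GraphLand archive, kept as-is
```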
Running the code
You can execute a minimal run with the following command:
uv run bin/go.py exp/g2t_fm/finetune/tolokers-2/tuning.toml --force
Repository structure
- bin/ - Training and evaluation scripts
- exp/ - Experiment configurations and results
- data/ - Dataset directory (created after download)
- lib/ - Common utilities and tools
Experiments are configured using TOML files located in the exp/ directory. Each configuration specifies (see the sketch after this list):
- Dataset path and preprocessing
- Model hyperparameters
- Training settings
- Evaluation metrics
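As a purely illustrative sketch of how these pieces might be laid out in a config, one could imagine something like the following. The section and key names below are assumptions for illustration only; the authoritative schema is whatever the shipped configs (e.g. exp/g2t_fm/finetune/tolokers-2/tuning.toml) actually use.

```toml
# Illustrative sketch only: section and key names are hypothetical,
# see the real configs under exp/ for the actual schema.
[data]
path = "data/tolokers-2"  # dataset path and preprocessing options

[model]
checkpoint = "checkpoints/tabpfn-v2-classifier.ckpt"  # model hyperparameters

[training]
lr = 1e-4       # training settings
n_epochs = 100

[evaluation]
metric = "accuracy"  # evaluation metric
```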
After training, results are saved in the same directory as the configuration file:
- report.json - Evaluation metrics
- Model checkpoints
- Training logs
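To inspect the metrics programmatically, a minimal sketch along these lines should work, assuming report.json is plain JSON written next to the config; the exact fields inside it are determined by the scripts in bin/ and may differ between runs:

```python
import json
from pathlib import Path

# Results are saved in the same directory as the configuration file,
# so for the minimal run above that directory is:
run_dir = Path("exp/g2t_fm/finetune/tolokers-2/tuning.toml").parent

# report.json is assumed to be plain JSON; its exact structure
# is defined by the training scripts in bin/.
with open(run_dir / "report.json") as f:
    report = json.load(f)

# Show which metrics and metadata were recorded.
print(list(report.keys()))
```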