Progressive Tempering Sampler with Diffusion (PTSD)

Official implementation of "Progressive Tempering Sampler with Diffusion" (ICML 2025)

🔬 This repository contains the code for parallel tempering (PT) simulation, PT + diffusion model (DM) training, and PTSD training.

🚧 Coming Soon! Full documentation will be released shortly.


Installation

Create the conda environment and install the required Python dependencies:

conda env create -f environment.yaml
pip install -r requirements.txt

Running Experiments

We provide an example for the GMM task, which consists of three main stages:

1. Simulate PT

Run parallel tempering to generate initial samples, saved to data/pt/pt_gmm.pt:

python main.py --config-name=pt_gmm
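
To sanity-check the PT output before moving on, you can load the saved file directly. This is a minimal sketch only: the exact layout of data/pt/pt_gmm.pt (a single tensor vs. a dict of tensors per temperature) is an assumption, so adapt it to whatever torch.load actually returns.

# Hypothetical inspection of the PT output; the file layout is an assumption
# (it may be a raw tensor or a dict keyed by temperature level).
import torch

pt_data = torch.load("data/pt/pt_gmm.pt", map_location="cpu")

if isinstance(pt_data, dict):
    for key, value in pt_data.items():
        shape = tuple(value.shape) if torch.is_tensor(value) else type(value)
        print(f"{key}: {shape}")
else:
    # A single tensor, e.g. (num_chains, num_samples, d) or (num_samples, d).
    print(type(pt_data), tuple(pt_data.shape))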

2. Train PT+DM

Train a diffusion model using the PT samples:

python main.py --config-name=gmm +prefix="ptdm"
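
The +prefix="ptdm" argument is a Hydra command-line override: the leading + appends a key that is not present in the base config. The sketch below is only a hypothetical illustration of how such an entry point might consume the override; the config path, config names, and the meaning of prefix are assumptions, not the repository's actual main.py.

# Hypothetical Hydra entry point; config path/name and the role of `prefix`
# are assumptions for illustration only.
import hydra
from omegaconf import DictConfig, OmegaConf


@hydra.main(config_path="configs", config_name="gmm", version_base=None)
def main(cfg: DictConfig) -> None:
    # `+prefix="ptdm"` adds a key absent from the base config;
    # OmegaConf.select returns the default when the override is not given.
    prefix = OmegaConf.select(cfg, "prefix", default=None)
    if prefix == "ptdm":
        print("Running the PT+DM training stage")
    else:
        print("Running the full PTSD training stage")


if __name__ == "__main__":
    main()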

3. Train PTSD

Train the full PTSD framework using both PT and DM:

python main.py --config-name=gmm

Tasks

We provide code for the following tasks:

  • gmm: Mixture of Gaussians with 40 modes (d=2); see the sketch after this list
  • mw32: Many-Well potential (d=32)
  • lj55: Lennard-Jones potential with 55 particles (d=165)
  • aldp: Alanine dipeptide in internal coordinates (d=60)
  • aldp_cart: Alanine dipeptide in Cartesian coordinates (d=66)
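
As a rough reference for the gmm target, here is a minimal sketch of a 40-mode Gaussian mixture log-density in 2D built with torch.distributions. The mode locations, scale, and uniform weights are placeholders, not the exact parameters used in the repository's configs.

# Illustrative 40-mode 2D Gaussian mixture; means, scale, and uniform weights
# below are placeholders, not the repository's exact GMM parameters.
import torch
from torch import distributions as D

torch.manual_seed(0)
num_modes, dim, scale = 40, 2, 1.0

means = (torch.rand(num_modes, dim) - 0.5) * 80.0    # modes spread over a box
mix = D.Categorical(torch.ones(num_modes))           # uniform mixture weights
comp = D.Independent(D.Normal(means, scale * torch.ones(num_modes, dim)), 1)
gmm = D.MixtureSameFamily(mix, comp)

x = gmm.sample((5,))
print(x.shape, gmm.log_prob(x))  # target log-density, i.e. log pi(x)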

Citation

If you find this work useful, please consider citing us:

@inproceedings{rissanen2025progressive,
  title={Progressive Tempering Sampler with Diffusion},
  author={Severi Rissanen and RuiKang OuYang and Jiajun He and Wenlin Chen and Markus Heinonen and Arno Solin and Jos{\'e} Miguel Hern{\'a}ndez-Lobato},
  booktitle={International Conference on Machine Learning},
  year={2025},
  organization={PMLR}
}

Contact

For questions or feedback, feel free to open an issue or contact the corresponding authors.

