pbnn

Uncertainty quantification for neural networks: Bayesian inference algorithms, deep ensembles, MC Dropout, amongst others.

Introduction

This repository gathers the algorithms and numerical experiments presented in the benchmark paper Benchmarking Bayesian neural networks and evaluation metrics for regression tasks.

Please consider citing our paper if you find this code useful in your work:

@article{staber2023benchmarking,
      title={Benchmarking Bayesian neural networks and evaluation metrics for regression tasks},
      author={Brian Staber and Sébastien Da Veiga},
      year={2023},
      eprint={2206.06779},
      archivePrefix={arXiv},
}

This repository is a fork of the original pbnn project, which was developed on GitLab and has been migrated to GitHub for further development. The original repository remains available on GitLab.

The code is licensed under the MIT License; see the LICENSE file for details.

Getting started

Install guide

You can install this package using pip:

pip install pbnn

Note that pbnn relies on JAX, which is installed through BlackJAX, the main dependency of this package. The code will run on CPU only unless you install JAX with GPU support.

For example, for CUDA 12:

pip install --upgrade "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html

For more details, see the official JAX installation instructions.

JAX was chosen mainly for its composable function transformations (such as grad, jit, and scan), which make MCMC methods for neural networks computationally tractable.
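As an illustration of what these transformations buy you (this snippet is not part of pbnn's API), the sketch below composes grad and jit on a toy log-density, the kind of building block an MCMC kernel repeatedly evaluates:

```python
import jax
import jax.numpy as jnp

def log_density(theta):
    # Standard normal log-density up to an additive constant: -0.5 * ||theta||^2
    return -0.5 * jnp.sum(theta ** 2)

# grad derives the score function automatically; jit compiles it with XLA,
# so the gradient can be called thousands of times cheaply inside a sampler.
score = jax.jit(jax.grad(log_density))

theta = jnp.array([1.0, -2.0])
g = score(theta)  # gradient of -0.5 * ||theta||^2 is -theta
```

The same pattern scales to neural-network log-posteriors: the log-density becomes a Flax model's loss plus a prior term, and the compiled gradient drives the (SG)MCMC kernels.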

By relying on Flax and BlackJAX, pbnn gives access to (SG)MCMC methods (most of them simplified user interfaces built on top of BlackJAX), as well as deep ensembles, Monte Carlo dropout, stochastic weight averaging Gaussian (SWAG), and classical MAP estimation.
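To give a concrete sense of one of these methods, here is a minimal, library-free sketch (not pbnn's actual API) of how a deep ensemble turns the Gaussian predictions of several independently trained networks into a single predictive mean and variance via the law of total variance:

```python
def ensemble_predict(member_means, member_vars):
    """Combine per-member Gaussian predictions (mean_i, var_i) at one input
    into an ensemble mean and total variance."""
    m = len(member_means)
    mean = sum(member_means) / m
    # Aleatoric part: average of the members' predicted variances.
    aleatoric = sum(member_vars) / m
    # Epistemic part: spread of the members' means around the ensemble mean.
    epistemic = sum((mu - mean) ** 2 for mu in member_means) / m
    return mean, aleatoric + epistemic

# Five hypothetical ensemble members predicting at a single input point.
means = [1.0, 1.2, 0.9, 1.1, 0.8]
vars_ = [0.05, 0.04, 0.06, 0.05, 0.05]
mu, var = ensemble_predict(means, vars_)
```

In pbnn the member predictions come from Flax networks trained from different random initializations; the aggregation step is the same decomposition into aleatoric and epistemic uncertainty shown here.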

The remaining algorithms tested in the accompanying paper are taken from MAPIE for conformal prediction and from laplace for efficient Laplace approximation in PyTorch. These are therefore not accessible via pbnn; the interested reader is referred to the benchmark folder or to the packages' official documentation.

Documentation

API documentation and several examples of usage can be found in the online documentation.
