Project "Optimization operators as evidence estimators"

Motivation

"Classical" evidence lower bound approaches allows researcher to perform a simplified Bayesian inference over quite complex models, like deep learning models. This approach involves MC-like sampling at each optimization iteration. Alternative approach is to consider parameters W as a sample from unknown distribution that changes under action of optimization operator (like SGD) at each optimization step. From the researcher perspective, this approach is useufl because doesn't need to change the optimization at all.

Algorithms to implement

ELBO with SGD

ELBO with preconditioned SGLD

Stochastic Gradient Fisher Scoring from paper

Constant SGD as Variational EM from paper (see the sketch below)
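
A minimal, hedged Python sketch of the last idea, in the spirit of Mandt et al.'s "Stochastic Gradient Descent as Approximate Bayesian Inference": run SGD with a constant step size, treat post-burn-in iterates as posterior samples, and moment-match a Gaussian to them. The function names here (run_constant_sgd, fit_gaussian) are illustrative and not part of swaglib's API.

    # Hedged sketch of "constant SGD as variational EM"; names are
    # illustrative, not swaglib's actual interface.
    import numpy as np

    def run_constant_sgd(grad, w0, lr=1e-2, n_steps=5000, burn_in=1000, rng=None):
        """Run constant-step-size SGD and collect post-burn-in iterates."""
        rng = np.random.default_rng() if rng is None else rng
        w = np.asarray(w0, dtype=float)
        samples = []
        for t in range(n_steps):
            w = w - lr * grad(w, rng)      # one stochastic gradient step
            if t >= burn_in:
                samples.append(w.copy())   # iterate treated as a posterior sample
        return np.stack(samples)

    def fit_gaussian(samples):
        """Moment-match a Gaussian to the SGD iterates (the variational q)."""
        mu = samples.mean(axis=0)
        cov = np.cov(samples, rowvar=False)
        return mu, cov

    # Toy usage: noisy gradients of the quadratic loss 0.5 * ||w||^2.
    if __name__ == "__main__":
        noisy_grad = lambda w, rng: w + 0.1 * rng.standard_normal(w.shape)
        iters = run_constant_sgd(noisy_grad, w0=np.ones(2))
        mu, cov = fit_gaussian(iters)
        print("posterior mean ~", mu)

The point of the constant step size is that SGD then never converges but keeps wandering around an optimum; it is this stationary distribution of iterates that the moment-matching step turns into the variational approximation.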

Team members

  1. Bylinkin Dmitry (Project wrapping, Final demo, Algorithms)
  2. Semenov Andrei (Tests writing, Library writing, Algorithms)
  3. Shestakov Alexander (Project planning, Blog post, Algorithms)
  4. Solodkin Vladimir (Basic code writing, Documentation writing, Algorithms)

Code organisation

The main folder for the library is swaglib. It has the following structure:

(TODO)

swaglib/
├── __init__.py
├── helpers.py
└── optimizers/
    ├── __init__.py
    ├── optimizer.py
    ├── elbo/
    └── sgfs/
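
For orientation, here is a hypothetical sketch of the kind of base class optimizer.py could hold; the actual interface in the repository may differ.

    # Hypothetical sketch only; not the interface actually defined in
    # swaglib/optimizers/optimizer.py.
    class Optimizer:
        """An optimization operator that also records its iterates,
        so the trajectory can be reused for evidence estimation."""

        def __init__(self, lr=1e-2):
            self.lr = lr
            self.trajectory = []  # parameter iterates collected during training

        def step(self, w, grad):
            """Apply one update W <- W - lr * grad and record the iterate."""
            w = w - self.lr * grad
            self.trajectory.append(w)
            return w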
