Commit 09426e0

Merge pull request #1 from intsystems/docs
Docs
2 parents 68f3ce4 + 80e55cd commit 09426e0

File tree

5 files changed: +73 additions, -3 deletions


README.md

Lines changed: 35 additions & 3 deletions
@@ -19,9 +19,41 @@
 This repository implements an educational project for the Bayesian Multimodeling course. It implements algorithms for sampling from various distributions, using the implicit reparameterization trick.

-## Description
-
-This repository implements an educational project for the Bayesian Multimodeling course. It implements sampling algorithms for various distributions using the implicit reparameterization trick.
+## Scope
+
+We plan to implement the following distributions in our library:
+- Gaussian normal distribution (\*)
+- Dirichlet distribution (Beta distributions) (\*)
+- Sampling from a mixture of distributions
+- Sampling from the Student's t-distribution (\*\*) (\*)
+- Sampling from an arbitrary factorized distribution (\*\*\*)
+
+(\*) This distribution is already implemented in torch using the explicit reparameterization trick; we will implement it for comparison.
+
+(\*\*) This distribution is added as a backup; its inclusion is questionable.
+
+(\*\*\*) The implementation of this distribution is unclear; its inclusion is questionable.
+
+## Stack
+
+We plan to inherit from the torch.distributions.Distribution class, so we need to implement all the methods present in that class.
+
+## Usage
+In this example, we demonstrate the application of our library with a Variational Autoencoder (VAE) model, where the latent layer is modeled by a normal distribution.
+```
+>>> import torch.distributions.implicit as irt
+>>> params = Encoder(inputs)
+>>> gauss = irt.Normal(*params)
+>>> deviated = gauss.rsample()
+>>> outputs = Decoder(deviated)
+```
+In this example, we demonstrate the use of a mixture of distributions with our library.
+```
+>>> import irt
+>>> params = Encoder(inputs)
+>>> mix = irt.Mixture([irt.Normal(*params), irt.Dirichlet(*params)])
+>>> deviated = mix.rsample()
+>>> outputs = Decoder(deviated)
+```

 ## Links
- [LinkReview](https://github.com/intsystems/implitic-reparametrization-trick/blob/main/linkreview.md)
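For background, the gradient that the implicit reparameterization trick provides can be sketched in a few lines of torch. This is an illustrative sketch of the technique (differentiating the CDF at a fixed sample), not the library's API; all variable names below are our own.

```python
import torch

# Sketch of the implicit reparameterization trick for a univariate Normal.
mu = torch.tensor(1.0, requires_grad=True)
sigma = torch.tensor(2.0, requires_grad=True)

dist = torch.distributions.Normal(mu, sigma)
z = dist.sample()  # sample() returns z detached from the graph

# Implicit gradient: dz/dtheta = -(dF/dtheta) / q(z), where F is the CDF
# and q the density, both differentiated with the sample z held fixed.
F = dist.cdf(z)
q = dist.log_prob(z).exp()
dF_dmu, dF_dsigma = torch.autograd.grad(F, (mu, sigma))
dz_dmu = -dF_dmu / q
dz_dsigma = -dF_dsigma / q

# For the Normal these agree with the explicit form z = mu + sigma * eps:
# dz/dmu = 1 and dz/dsigma = (z - mu) / sigma.
```

The point of the implicit form is that it only needs a differentiable CDF (or standardization function), which is what makes a pathwise `rsample` possible for distributions such as the Dirichlet or Student's t, whose samples are not a simple transform of parameter-free noise.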

doc/source/conf.py

Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
+import os
+import sys
+sys.path.insert(0, os.path.abspath('../..'))
+
+project = 'Implicit Reparametrization Trick'
+author = 'Matvei Kreinin, Maria Nikitina, Petr Babkin, Irina Zaboryanskaya'
+release = '0.1'
+
+extensions = [
+    'sphinx.ext.autodoc',
+    'sphinx.ext.napoleon',
+    'sphinx.ext.viewcode',
+]
+
+templates_path = ['_templates']
+exclude_patterns = []
+
+html_theme = 'sphinx_rtd_theme'
+html_static_path = ['_static']

doc/source/index.rst

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+.. Implicit Reparametrization Trick documentation master file, created by
+   sphinx-quickstart on Mon Oct 10 10:00:00 2022.
+   You can adapt this file completely to your liking, but it should at least
+   contain the root `toctree` directive.
+
+Welcome to Implicit Reparametrization Trick's documentation!
+============================================================
+
+.. toctree::
+   :maxdepth: 2
+   :caption: Contents:
+
+   modules

doc/source/info.rst

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+.. include:: ../../README.md

doc/source/modules.rst

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+Modules
+=======
+
+.. automodule:: torch.distributions.implicit
+   :members:
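As a sketch of what `automodule` with `:members:` would pick up, here is a minimal hypothetical subclass of torch.distributions.Distribution. The class name and docstrings are ours, not the library's, and for brevity `rsample` uses explicit reparameterization rather than the implicit trick the library targets.

```python
import math
import torch
from torch.distributions import Distribution, constraints

class ImplicitNormal(Distribution):
    """Hypothetical example class; not the library's actual implementation.

    Illustrates the methods a torch.distributions.Distribution subclass
    overrides (rsample, log_prob), which sphinx.ext.autodoc then documents.
    """

    arg_constraints = {"loc": constraints.real, "scale": constraints.positive}
    support = constraints.real
    has_rsample = True

    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale
        super().__init__(batch_shape=loc.shape, validate_args=False)

    def rsample(self, sample_shape=torch.Size()):
        """Differentiable sample (explicit reparameterization for brevity)."""
        eps = torch.randn(torch.Size(sample_shape) + self.loc.shape)
        return self.loc + self.scale * eps

    def log_prob(self, value):
        """Log-density of a Normal(loc, scale) at ``value``."""
        return (
            -((value - self.loc) ** 2) / (2 * self.scale ** 2)
            - torch.log(self.scale)
            - 0.5 * math.log(2 * math.pi)
        )
```

With `sphinx.ext.napoleon` enabled in conf.py, docstrings in this Google style render cleanly in the generated HTML.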
