Commit 75854ed ("Create README.rst", parent 59fb307): README.rst, 64 additions, 0 deletions.

Implicit Reparametrization Trick
================================

|test| |codecov| |docs|

.. |test| image:: https://github.com/intsystems/ProjectTemplate/workflows/test/badge.svg
    :target: https://github.com/intsystems/ProjectTemplate/tree/master
    :alt: Test status

.. |codecov| image:: https://img.shields.io/codecov/c/github/intsystems/ProjectTemplate/master
    :target: https://app.codecov.io/gh/intsystems/ProjectTemplate
    :alt: Test coverage

.. |docs| image:: https://github.com/intsystems/ProjectTemplate/workflows/docs/badge.svg
    :target: https://intsystems.github.io/implicit-reparameterization-trick/
    :alt: Docs status
18+
Description
19+
==========
20+
21+
This repository implements an educational project for the Bayesian Multimodeling course. It implements algorithms for sampling from various distributions, using the implicit reparameterization trick.
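The trick itself is easy to state: a sample :math:`z = F^{-1}(u; \theta)` is differentiated without ever inverting :math:`F`, since implicit differentiation of :math:`F(z; \theta) = u` gives :math:`dz/d\theta = -(\partial F/\partial\theta) / (\partial F/\partial z)`. Below is a minimal numeric sketch for the Gaussian case; the helper functions are ours and purely illustrative, not part of the library.

```python
import math

def normal_cdf(z, mu, sigma):
    # F(z; mu, sigma) = Phi((z - mu) / sigma)
    return 0.5 * (1.0 + math.erf((z - mu) / (sigma * math.sqrt(2.0))))

def normal_pdf(z, mu, sigma):
    t = (z - mu) / sigma
    return math.exp(-0.5 * t * t) / (sigma * math.sqrt(2.0 * math.pi))

def implicit_grads(z, mu, sigma, eps=1e-6):
    """Implicit reparameterization: dz/dtheta = -(dF/dtheta) / (dF/dz).

    dF/dtheta is estimated by central finite differences with z held
    fixed, and dF/dz is just the density, so F is never inverted.
    """
    dF_dz = normal_pdf(z, mu, sigma)
    dF_dmu = (normal_cdf(z, mu + eps, sigma)
              - normal_cdf(z, mu - eps, sigma)) / (2.0 * eps)
    dF_dsigma = (normal_cdf(z, mu, sigma + eps)
                 - normal_cdf(z, mu, sigma - eps)) / (2.0 * eps)
    return -dF_dmu / dF_dz, -dF_dsigma / dF_dz

# The explicit trick z = mu + sigma * u gives dz/dmu = 1 and
# dz/dsigma = (z - mu) / sigma, so for z=1.3, mu=0.5, sigma=2.0 we
# expect approximately 1.0 and 0.4.
dmu, dsigma = implicit_grads(1.3, 0.5, 2.0)
print(round(dmu, 4), round(dsigma, 4))
```

For the Gaussian both tricks agree, which makes it a convenient correctness check; the implicit form is what generalizes to the Dirichlet, Student's t, and mixture cases below, where :math:`F^{-1}` has no closed form.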

Scope
=====

We plan to implement the following distributions in our library:

- Gaussian normal distribution (\*)
- Dirichlet distribution (Beta distributions) (\*)
- Sampling from a mixture of distributions
- Sampling from Student's t-distribution (\*\*) (\*)
- Sampling from an arbitrary factorized distribution (\*\*\*)

(\*) - this distribution is already implemented in torch using the explicit reparameterization trick; we will implement it for comparison

(\*\*) - this distribution is added as a backup; its inclusion is questionable

(\*\*\*) - the implementation of this distribution is not entirely clear; its inclusion is questionable

Stack
=====

We plan to inherit from the ``torch.distributions.Distribution`` class, so we need to implement all the methods that this class defines.
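As a rough sketch of that surface, the snippet below lists the methods such a subclass has to supply. It uses a standalone pure-Python stand-in so it runs without torch; the method names mirror ``torch.distributions.Distribution``, but the real implementation would subclass it and route ``rsample`` gradients through the implicit formula.

```python
import math
import random

class ImplicitNormal:
    """Illustrative stand-in for the interface we would implement when
    subclassing torch.distributions.Distribution (names mirror that
    class; this sketch avoids the torch dependency entirely)."""

    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale

    def sample(self):
        # Plain draw, no gradient tracking.
        return random.gauss(self.loc, self.scale)

    def rsample(self):
        # Reparameterized draw; in the real subclass this is where the
        # implicit-gradient machinery would attach to autograd.
        return self.loc + self.scale * random.gauss(0.0, 1.0)

    def log_prob(self, value):
        t = (value - self.loc) / self.scale
        return -0.5 * t * t - math.log(self.scale * math.sqrt(2.0 * math.pi))

    def cdf(self, value):
        return 0.5 * (1.0 + math.erf(
            (value - self.loc) / (self.scale * math.sqrt(2.0))))

    @property
    def mean(self):
        return self.loc

    @property
    def variance(self):
        return self.scale ** 2
```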

Usage
=====

In this example, we demonstrate the application of our library with a Variational Autoencoder (VAE) model whose latent layer is sampled from a normal distribution.

.. code-block:: python

    >>> import torch.distributions.implicit as irt
    >>> params = Encoder(inputs)
    >>> gauss = irt.Normal(*params)
    >>> deviated = gauss.rsample()
    >>> outputs = Decoder(deviated)

In this example, we demonstrate the use of a mixture of distributions with our library.

.. code-block:: python

    >>> import irt
    >>> params = Encoder(inputs)
    >>> mix = irt.Mixture([irt.Normal(*params), irt.Dirichlet(*params)])
    >>> deviated = mix.rsample()
    >>> outputs = Decoder(deviated)
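The mixture case is where the implicit trick is essential: a mixture CDF has no closed-form inverse, yet its parameter gradients stay cheap. As a numeric sketch (helper names are ours, not the planned ``irt`` API): for a Gaussian mixture, :math:`dz/d\mu_k = -(\partial F/\partial\mu_k) / f(z)` works out to the posterior responsibility of component :math:`k` at :math:`z`.

```python
import math

def phi(t):
    # standard normal pdf
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def mixture_dz_dmu(z, weights, mus, sigmas, k):
    """Implicit gradient dz/dmu_k for a Gaussian mixture sample z.

    F(z) = sum_j w_j * Phi((z - mu_j) / sigma_j) has no explicit
    inverse, but dz/dmu_k = -(dF/dmu_k) / f(z) needs only F's partials.
    """
    f = sum(w * phi((z - m) / s) / s
            for w, m, s in zip(weights, mus, sigmas))
    dF_dmu_k = -weights[k] * phi((z - mus[k]) / sigmas[k]) / sigmas[k]
    return -dF_dmu_k / f

# For a two-component mixture the gradient equals the posterior
# responsibility of component 0 at z, a handy sanity check.
g = mixture_dz_dmu(1.0, [0.3, 0.7], [0.0, 2.0], [1.0, 1.0], 0)
r = 0.3 * phi(1.0) / (0.3 * phi(1.0) + 0.7 * phi(-1.0))
print(abs(g - r) < 1e-12)  # prints True
```

This is exactly the quantity an ``irt.Mixture.rsample`` would have to backpropagate, which is why no explicit inverse-CDF sampler is required.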

Links
=====

- `LinkReview <https://github.com/intsystems/implitic-reparametrization-trick/blob/main/linkreview.md>`_
- `Plan of project <https://github.com/intsystems/implitic-reparametrization-trick/blob/main/planning.md>`_
