This repository contains an educational project for the Bayesian Multimodeling course: a library of algorithms for sampling from various distributions using the implicit reparameterization trick.
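As a rough illustration of the idea (this is not the library's code): for a sample `z` with CDF `F(z; θ)`, the implicit reparameterization gradient is `dz/dθ = -(∂F/∂θ) / q(z; θ)`, where `q` is the density. For a Gaussian this recovers the familiar explicit reparameterization gradients, which can be checked with stock PyTorch:

```python
import torch
from torch.distributions import Normal

# Sketch of the implicit reparameterization trick for a Gaussian.
# dz/dtheta = -(dF/dtheta) / q(z; theta), differentiating the CDF
# at a *fixed* sample z.
mu = torch.tensor(0.5, requires_grad=True)
sigma = torch.tensor(2.0, requires_grad=True)
z = torch.tensor(1.3)  # a fixed draw, used only for illustration

dist = Normal(mu, sigma)
cdf = dist.cdf(z)
dF_dmu, dF_dsigma = torch.autograd.grad(cdf, (mu, sigma))
density = dist.log_prob(z).exp()

dz_dmu = -dF_dmu / density
dz_dsigma = -dF_dsigma / density
# For the Gaussian z = mu + sigma * eps, so these should equal
# dz/dmu = 1 and dz/dsigma = (z - mu) / sigma.
```

For the Gaussian the explicit trick is simpler, but the implicit form needs only a differentiable CDF, which is what makes distributions like the Dirichlet or Gamma tractable.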
## 🗃 Scope
We plan to implement the following distributions in our library:
- [x] Gaussian normal distribution (\*)
- [x] Dirichlet distribution (Beta distributions) (\*)
(\*\*\*) - the implementation of this distribution is unclear, so its inclusion is tentative
## 📚 Stack
We plan to inherit from the torch.distributions.Distribution class, so we need to implement all of the methods that class defines.
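A minimal sketch of this plan (not the project's actual code): subclass `torch.distributions.Distribution` and fill in its members. Here `rsample()` uses the explicit trick for brevity; the library would swap in an implicitly reparameterized sampler.

```python
import math
import torch
from torch.distributions import Distribution, constraints

class SketchNormal(Distribution):
    """Illustrative Normal showing which members a subclass must define."""
    arg_constraints = {"loc": constraints.real, "scale": constraints.positive}
    support = constraints.real
    has_rsample = True  # advertises reparameterized sampling

    def __init__(self, loc, scale, validate_args=None):
        self.loc = torch.as_tensor(loc)
        self.scale = torch.as_tensor(scale)
        super().__init__(batch_shape=self.loc.shape, validate_args=validate_args)

    def rsample(self, sample_shape=torch.Size()):
        shape = self._extended_shape(sample_shape)
        eps = torch.randn(shape)
        # Explicit trick for brevity; gradients flow through loc and scale.
        return self.loc + self.scale * eps

    def log_prob(self, value):
        return (-((value - self.loc) ** 2) / (2 * self.scale ** 2)
                - self.scale.log() - 0.5 * math.log(2 * math.pi))
```

Stock `torch.distributions.Normal` already provides all of this; the sketch only shows the shape of the interface a new distribution has to satisfy.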
## 👨💻 Usage
In this example, we demonstrate the application of our library with a Variational Autoencoder (VAE) model whose latent layer samples from a normal distribution.
```python
>>> import torch.distributions.implicit as irt
```

In this example, we demonstrate the use of a mixture of distributions using our library.
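The library's own mixture example is not reproduced here; as a rough stand-in, the same kind of model can be expressed with stock PyTorch (the names below are standard `torch.distributions` classes, not this library's API):

```python
import torch
from torch.distributions import Categorical, Normal, MixtureSameFamily

# Two-component Gaussian mixture: mixing weights plus per-component
# parameters, wrapped by MixtureSameFamily.
mix = Categorical(probs=torch.tensor([0.3, 0.7]))
comp = Normal(loc=torch.tensor([-2.0, 2.0]), scale=torch.tensor([0.5, 1.0]))
gmm = MixtureSameFamily(mix, comp)

samples = gmm.sample(torch.Size([1000]))
log_p = gmm.log_prob(samples)
```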