This repository contains an educational project for the Bayesian Multimodeling course: a library of algorithms for sampling from various distributions using the implicit reparameterization trick.
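As background, the implicit reparameterization trick (Figurnov et al., 2018) differentiates a sample `z ~ q(z; θ)` through the CDF `F(z; θ)` via `dz/dθ = -(∂F/∂θ) / (∂F/∂z)`, without inverting the CDF. The following is a minimal sketch of that identity, not part of this library's API, using a Normal distribution so the result can be checked against the explicit pathwise form `z = μ + σε`:

```python
import torch

mu = torch.tensor(0.5, requires_grad=True)
sigma = torch.tensor(2.0, requires_grad=True)

# Plain draw: sample() gives no gradient path to mu or sigma.
z = torch.distributions.Normal(mu, sigma).sample()

# Differentiate the CDF at the drawn point with autograd.
z_ = z.clone().requires_grad_(True)
F = torch.distributions.Normal(mu, sigma).cdf(z_)
dF_dmu, dF_dsigma, dF_dz = torch.autograd.grad(F, (mu, sigma, z_))

# Implicit reparameterization gradients: dz/dtheta = -(dF/dtheta) / (dF/dz).
dz_dmu = -dF_dmu / dF_dz        # analytically 1 for the Normal
dz_dsigma = -dF_dsigma / dF_dz  # analytically (z - mu) / sigma
```

For the Normal these gradients coincide with the explicit reparameterization; the value of the implicit form is for distributions (Gamma, Dirichlet, von Mises, ...) whose inverse CDF is intractable.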
## 🗃 Scope
We plan to implement the following distributions in our library:
- [x] Gaussian (normal) distribution (\*)
- [x] Dirichlet distribution (including the Beta distribution) (\*)
(\*\*\*) - the implementation of this distribution is not yet clear, so its inclusion is tentative
## 📚 Stack
We plan to inherit from the torch.distributions.Distribution class, so we need to implement all of the methods defined by that class.
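A minimal sketch of what such a subclass could look like. The class name and details here are illustrative assumptions, not the library's actual API; for a Normal the explicit pathwise `rsample()` suffices, while distributions without a tractable inverse CDF would implement `rsample()` via the implicit gradient instead:

```python
import math
import torch
from torch.distributions import Distribution, constraints

# Hypothetical example subclass of torch.distributions.Distribution.
class ImplicitNormal(Distribution):
    arg_constraints = {"loc": constraints.real, "scale": constraints.positive}
    support = constraints.real
    has_rsample = True  # advertises differentiable sampling

    def __init__(self, loc, scale, validate_args=None):
        self.loc = torch.as_tensor(loc)
        self.scale = torch.as_tensor(scale)
        super().__init__(batch_shape=self.loc.shape, validate_args=validate_args)

    def rsample(self, sample_shape=torch.Size()):
        # Reparameterized draw: gradients flow through loc and scale.
        shape = self._extended_shape(sample_shape)
        eps = torch.randn(shape)
        return self.loc + self.scale * eps

    def log_prob(self, value):
        # Normal log-density, written out explicitly.
        return (-((value - self.loc) ** 2) / (2 * self.scale ** 2)
                - self.scale.log() - 0.5 * math.log(2 * math.pi))
```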
## 👨‍💻 Usage
In this example, we demonstrate the application of our library in a Variational Autoencoder (VAE) model whose latent layer is parameterized by a normal distribution.
```
>>> import torch.distributions.implicit as irt
```

In this example, we demonstrate the use of a mixture of distributions using our library.
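Since the library's own example above is shown only in part, here is a sketch of the underlying pattern using the stock `torch.distributions.Normal` (all tensor names are illustrative): the VAE's latent layer draws `z` with `rsample()`, so the KL term of the ELBO is differentiable with respect to the encoder outputs.

```python
import torch
from torch.distributions import Normal, kl_divergence

# Illustrative encoder outputs: batch of 4, latent dimension 8.
mu = torch.zeros(4, 8, requires_grad=True)
log_sigma = torch.zeros(4, 8, requires_grad=True)

q = Normal(mu, log_sigma.exp())   # approximate posterior q(z | x)
z = q.rsample()                   # differentiable sample fed to the decoder

# KL(q || N(0, 1)) term of the ELBO, summed over the latent dimension.
kl = kl_divergence(q, Normal(0.0, 1.0)).sum(-1)
kl.sum().backward()               # gradients reach mu and log_sigma
```

A library distribution with `has_rsample = True` could be dropped into this pattern in place of `Normal`.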