Scope
==========

We plan to implement the following distributions in our library:

- `Gaussian normal distribution`
- `Dirichlet distribution (Beta distributions)`
- `Sampling from a mixture of distributions`

We plan to inherit from the torch.distributions.Distribution class, so we need to
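As background for the inheritance plan above, the shape of the ``torch.distributions.Distribution`` interface (``sample``/``rsample``/``log_prob``) can be mocked in plain Python. This is an illustrative sketch only, not the library's actual code; the pure-Python ``Normal`` below simply shows where explicit reparameterized sampling lives in such a class:

```python
import math
import random

class Normal:
    """Illustrative stand-in for a Distribution subclass.

    Mirrors the sample/rsample/log_prob shape of
    torch.distributions.Distribution; the internals here are
    plain Python, not the library's implementation.
    """

    def __init__(self, loc, scale):
        self.loc = loc
        self.scale = scale

    def rsample(self):
        # Reparameterization trick: draw parameter-free noise
        # eps ~ N(0, 1), then transform it deterministically so that
        # gradients could flow through loc and scale in a real autodiff
        # framework.
        eps = random.gauss(0.0, 1.0)
        return self.loc + self.scale * eps

    def log_prob(self, value):
        # Log-density of N(loc, scale^2) at `value`.
        z = (value - self.loc) / self.scale
        return -0.5 * z * z - math.log(self.scale) - 0.5 * math.log(2.0 * math.pi)

gauss = Normal(loc=2.0, scale=0.5)
sample = gauss.rsample()
print(round(gauss.log_prob(2.0), 4))  # → -0.2258, the log-density at the mean
```

Subclassing the real ``torch.distributions.Distribution`` additionally involves registering ``arg_constraints`` and the event/batch shapes, which the sketch omits.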

Usage
==========

In this example, we demonstrate the application of our library using a Variational Autoencoder (VAE) model, where the latent layer is sampled from a normal distribution::

    import torch.distributions.implicit as irt

    params = Encoder(inputs)
    gauss = irt.Normal(*params)
    deviated = gauss.rsample()
    outputs = Decoder(deviated)

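The key step in the VAE snippet above is ``rsample()``: instead of drawing the latent code directly, it draws parameter-free noise and transforms it with the encoder's outputs, so gradients can reach the encoder. A minimal plain-Python sketch of that one step (the names ``mu``, ``sigma``, and ``eps`` are illustrative, not part of the library):

```python
import random

random.seed(0)

# Encoder outputs: mean and standard deviation of the latent Gaussian.
mu, sigma = 0.5, 1.2

# rsample() first draws noise that does not depend on mu or sigma...
eps = random.gauss(0.0, 1.0)

# ...then transforms it deterministically: z = mu + sigma * eps.
# Because this transform is differentiable in mu and sigma, the decoder
# loss can backpropagate into the encoder despite the sampling step.
z = mu + sigma * eps
print(z)
```

A plain ``sample()`` call, by contrast, would treat the draw as a black box and cut the gradient path at the latent layer.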
In this example, we demonstrate sampling from a mixture of distributions with our library::
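The library's mixture API is not shown in this excerpt, so as background here is what sampling from a mixture means: first pick a component at random according to the mixture weights, then sample from the chosen component. A plain-Python sketch with a hypothetical two-component Gaussian mixture (all numbers are illustrative):

```python
import random

random.seed(42)

# A two-component Gaussian mixture: weights, means, standard deviations.
weights = [0.3, 0.7]
means = [-2.0, 3.0]
stds = [0.5, 1.0]

def sample_mixture():
    # Step 1: pick a component index with probability given by its weight.
    k = random.choices(range(len(weights)), weights=weights)[0]
    # Step 2: draw from the chosen Gaussian component.
    return random.gauss(means[k], stds[k])

samples = [sample_mixture() for _ in range(5)]
print(samples)
```

Note that the discrete draw in step 1 is exactly what blocks the standard reparameterization trick for mixtures, which is one motivation for implicit reparameterization gradients.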