
Commit a4f8ebc

Revise Web pages
1 parent ac566ab commit a4f8ebc

17 files changed, +55 -55 lines changed

bayesml/autoregressive/__init__.py

Lines changed: 2 additions & 2 deletions
@@ -85,7 +85,7 @@
 .. math::
 \mathrm{St}(x_{n+1}|m_\mathrm{p}, \lambda_\mathrm{p}, \nu_\mathrm{p})
 = \frac{\Gamma (\nu_\mathrm{p}/2 + 1/2)}{\Gamma (\nu_\mathrm{p}/2)}
-\left( \frac{m_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2}
+\left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2}
 \left[ 1 + \frac{\lambda_\mathrm{p}(x_{n+1}-m_\mathrm{p})^2}{\nu_\mathrm{p}} \right]^{-\nu_\mathrm{p}/2 - 1/2}.

 .. math::
@@ -95,7 +95,7 @@
 where the parameters are obtained from the hyperparameters of the posterior distribution as follows.

 .. math::
-m_\mathrm{p} &= \mu_n^\top \boldsymbol{x}'_n,\\
+m_\mathrm{p} &= \boldsymbol{\mu}_n^\top \boldsymbol{x}'_n,\\
 \lambda_\mathrm{p} &= \frac{\alpha_n}{\beta_n} (1 + (\boldsymbol{x}'_n)^\top \boldsymbol{\Lambda}_n^{-1} \boldsymbol{x}'_n)^{-1},\\
 \nu_\mathrm{p} &= 2 \alpha_n.
 """

bayesml/bernoulli/__init__.py

Lines changed: 6 additions & 6 deletions
@@ -23,7 +23,7 @@
 * :math:`B(\cdot,\cdot): \mathbb{R}_{>0} \times \mathbb{R}_{>0} \to \mathbb{R}_{>0}`: the Beta function

 .. math::
-p(\theta) = \mathrm{Beta}(\theta|\alpha_0,\beta_0) = \frac{1}{B(\alpha_0, \beta_0)} \theta^{\alpha_0} (1-\theta)^{\beta_0}.
+p(\theta) = \mathrm{Beta}(\theta|\alpha_0,\beta_0) = \frac{1}{B(\alpha_0, \beta_0)} \theta^{\alpha_0 - 1} (1-\theta)^{\beta_0 - 1}.

 .. math::
 \mathbb{E}[\theta] &= \frac{\alpha_0}{\alpha_0 + \beta_0}, \\
@@ -36,11 +36,11 @@
 * :math:`\beta_n \in \mathbb{R}_{>0}`: a hyperparameter

 .. math::
-p(\theta | x^n) = \mathrm{Beta}(\theta|\alpha_n,\beta_n) = \frac{1}{B(\alpha_n, \beta_n)} \theta^{\alpha_n} (1-\theta)^{\beta_n},
+p(\theta | x^n) = \mathrm{Beta}(\theta|\alpha_n,\beta_n) = \frac{1}{B(\alpha_n, \beta_n)} \theta^{\alpha_n - 1} (1-\theta)^{\beta_n - 1},

 .. math::
 \mathbb{E}[\theta | x^n] &= \frac{\alpha_n}{\alpha_n + \beta_n}, \\
-\mathbb{V}[\theta | x^n] &= \frac{\alpha_n \beta_n}{(\alpha_n + \beta_n)^2 (\alpha_n + \beta_n + 1)}.
+\mathbb{V}[\theta | x^n] &= \frac{\alpha_n \beta_n}{(\alpha_n + \beta_n)^2 (\alpha_n + \beta_n + 1)},

 where the updating rule of the hyperparameters is

@@ -56,16 +56,16 @@
 * :math:`\theta_\mathrm{p} \in [0,1]`: a parameter

 .. math::
-p(x_{n+1} | x^n) = \mathrm{Bern}(x_{n+1}|\theta_\mathrm{p}) =\theta_\mathrm{p}^{x_{n+1}}(1-\theta_\mathrm{p})^{1-x_{n+1}}
+p(x_{n+1} | x^n) = \mathrm{Bern}(x_{n+1}|\theta_\mathrm{p}) =\theta_\mathrm{p}^{x_{n+1}}(1-\theta_\mathrm{p})^{1-x_{n+1}},

 .. math::
 \mathbb{E}[x_{n+1} | x^n] &= \theta_\mathrm{p}, \\
-\mathbb{V}[x_{n+1} | x^n] &= \theta_\mathrm{p} (1 - \theta_\mathrm{p}).
+\mathbb{V}[x_{n+1} | x^n] &= \theta_\mathrm{p} (1 - \theta_\mathrm{p}),

 where the parameters are obtained from the hyperparameters of the posterior distribution as follows.

 .. math::
-\theta_\mathrm{p} = \frac{\alpha_n}{\alpha_n + \beta_n}
+\theta_\mathrm{p} = \frac{\alpha_n}{\alpha_n + \beta_n}.
 """

 from ._bernoulli import GenModel
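
The Beta-Bernoulli formulas fixed above can be checked numerically. The sketch below is illustrative only, not the BayesML API; the data and prior values are assumed, and the beta_n update is taken as the symmetric counterpart of the alpha_n rule shown in the docs.

    # Minimal sketch (assumed values, not the BayesML API): Beta-Bernoulli update
    # and the predictive parameter theta_p = alpha_n / (alpha_n + beta_n).
    import numpy as np

    alpha_0, beta_0 = 0.5, 0.5              # prior hyperparameters (assumed)
    x = np.array([1, 0, 1, 1, 0, 1])        # observed binary data (assumed)

    alpha_n = alpha_0 + np.sum(x == 1)      # alpha_n = alpha_0 + sum_i I{x_i = 1}
    beta_n = beta_0 + np.sum(x == 0)        # beta_n = beta_0 + sum_i I{x_i = 0} (assumed counterpart)
    theta_p = alpha_n / (alpha_n + beta_n)  # predictive Bernoulli parameter
    print(alpha_n, beta_n, theta_p)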

bayesml/categorical/__init__.py

Lines changed: 3 additions & 3 deletions
@@ -23,7 +23,7 @@

 The prior distribution is as follows:

-* :math:`\boldsymbol{\alpha}_0 \in \mathbb{R}_{>0}`: a hyperparameter
+* :math:`\boldsymbol{\alpha}_0 \in \mathbb{R}_{>0}^d`: a hyperparameter
 * :math:`\Gamma (\cdot)`: the gamma function
 * :math:`\tilde{\alpha}_0 = \sum_{k=1}^d \alpha_{0,k}`
 * :math:`C(\boldsymbol{\alpha}_0)=\frac{\Gamma(\tilde{\alpha}_0)}{\Gamma(\alpha_{0,1})\cdots\Gamma(\alpha_{0,d})}`
@@ -58,7 +58,7 @@

 The predictive distribution is as follows:

-* :math:`x_{n+1} \in \{ 0, 1\}^d`: a new data point
+* :math:`\boldsymbol{x}_{n+1} \in \{ 0, 1\}^d`: a new data point
 * :math:`\boldsymbol{\theta}_\mathrm{p} \in [0, 1]^d`: the hyperparameter of the posterior (:math:`\sum_{k=1}^d \theta_{\mathrm{p},k} = 1`)

 .. math::
@@ -72,7 +72,7 @@
 where the parameters are obtained from the hyperparameters of the posterior distribution as follows:

 .. math::
-\boldsymbol{\theta}_{\mathrm{p},k} = \frac{\alpha_{n,k}}{\sum_{k=1}^d \alpha_{n,k}}, \quad (k \in \{ 1, 2, \dots , d \}).
+\theta_{\mathrm{p},k} = \frac{\alpha_{n,k}}{\sum_{k=1}^d \alpha_{n,k}}, \quad (k \in \{ 1, 2, \dots , d \}).
 """

 from ._categorical import GenModel
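
The corrected predictive parameter for the categorical model is a simple normalization of the posterior Dirichlet hyperparameters. The sketch below is illustrative only, not the BayesML API; the values of alpha_n are assumed.

    # Minimal sketch (assumed values, not the BayesML API):
    # theta_{p,k} = alpha_{n,k} / sum_k alpha_{n,k}.
    import numpy as np

    alpha_n = np.array([2.5, 1.0, 4.5])  # posterior Dirichlet hyperparameters (assumed)
    theta_p = alpha_n / alpha_n.sum()    # one scalar per category, sums to 1
    print(theta_p)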

bayesml/exponential/__init__.py

Lines changed: 4 additions & 4 deletions
@@ -42,13 +42,13 @@

 .. math::
 \mathbb{E}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n}, \\
-\mathbb{V}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n^2}.
+\mathbb{V}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n^2},

 where the updating rule of the hyperparameters is

 .. math::
-\alpha_n &= \alpha_0 + n\\
-\beta_n &= \beta_0 + \sum_{i=1}^n x_i
+\alpha_n &= \alpha_0 + n,\\
+\beta_n &= \beta_0 + \sum_{i=1}^n x_i.


 The predictive distribution is as follows:
@@ -58,7 +58,7 @@
 * :math:`\eta_\mathrm{p} \in \mathbb{R}_{>0}`: the hyperparameter of the posterior

 .. math::
-p(x_{n+1}|x^n)=\mathrm{Lomax}(x_{n+1}|\alpha_\mathrm{p},\eta_\mathrm{p}) = \frac{\alpha_\mathrm{p}}{\eta_\mathrm{p}}\left(1+\frac{x}{\eta_\mathrm{p}}\right)^{-(\alpha_\mathrm{p}+1)},
+p(x_{n+1}|x^n)=\mathrm{Lomax}(x_{n+1}|\alpha_\mathrm{p},\eta_\mathrm{p}) = \frac{\alpha_\mathrm{p}}{\eta_\mathrm{p}}\left(1+\frac{x_{n+1}}{\eta_\mathrm{p}}\right)^{-(\alpha_\mathrm{p}+1)},

 .. math::
 \mathbb{E}[x_{n+1} | x^n] &=
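
The updating rule and the corrected Lomax predictive density can be exercised as follows. This is an illustrative sketch, not the BayesML API; the data are assumed, and identifying alpha_p = alpha_n and eta_p = beta_n is an assumption not stated in the diff.

    # Minimal sketch (assumed values, not the BayesML API): Gamma update for the
    # exponential model and the Lomax predictive density at a new point x_{n+1}.
    import numpy as np

    alpha_0, beta_0 = 1.0, 1.0          # prior hyperparameters (assumed)
    x = np.array([0.8, 2.1, 0.3, 1.4])  # observed data (assumed)

    alpha_n = alpha_0 + len(x)          # alpha_n = alpha_0 + n
    beta_n = beta_0 + x.sum()           # beta_n = beta_0 + sum_i x_i

    alpha_p, eta_p = alpha_n, beta_n    # assumed identification with the posterior
    x_new = 1.0
    p = (alpha_p / eta_p) * (1.0 + x_new / eta_p) ** (-(alpha_p + 1.0))
    print(alpha_n, beta_n, p)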

bayesml/linearregression/__init__.py

Lines changed: 3 additions & 3 deletions
@@ -10,7 +10,7 @@
 * :math:`d \in \mathbb N`: a dimension
 * :math:`\boldsymbol{x} = [x_1, x_2, \dots , x_d] \in \mathbb{R}^d`: an explanatory variable. If you consider an intercept term, it should be included as one of the elements of :math:`\boldsymbol{x}`.
 * :math:`y\in\mathbb{R}`: an objective variable
-* :math:`\tau \in\mathbb{R}`: a parameter
+* :math:`\tau \in\mathbb{R}_{>0}`: a parameter
 * :math:`\boldsymbol{\theta}\in\mathbb{R}^{d}`: a parameter

 .. math::
@@ -25,7 +25,7 @@
 The prior distribution is as follows:

 * :math:`\boldsymbol{\mu_0} \in \mathbb{R}^d`: a hyperparameter
-* :math:`\boldsymbol{\Lambda_0} \in \mathbb{R}^{d\times d}`: a hyperparameter
+* :math:`\boldsymbol{\Lambda_0} \in \mathbb{R}^{d\times d}`: a hyperparameter (a positive definite matrix)
 * :math:`\alpha_0\in \mathbb{R}_{>0}`: a hyperparameter
 * :math:`\beta_0\in \mathbb{R}_{>0}`: a hyperparameter

@@ -78,7 +78,7 @@

 .. math::
 p(y_{n+1} | \boldsymbol{X}, \boldsymbol{y}, \boldsymbol{x}_{n+1} ) &= \mathrm{St}\left(y_{n+1} \mid m_\mathrm{p}, \lambda_\mathrm{p}, \nu_\mathrm{p}\right) \\
-&= \frac{\Gamma (\nu_\mathrm{p} / 2) + 1/2}{\Gamma (\nu_\mathrm{p} / 2)} \left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2} \left( 1 + \frac{\lambda_\mathrm{p} (y_{n+1} - m_\mathrm{p})^2}{\nu_\mathrm{p}} \right)^{-\nu_\mathrm{p}/2 - 1/2},
+&= \frac{\Gamma (\nu_\mathrm{p} / 2 + 1/2 )}{\Gamma (\nu_\mathrm{p} / 2)} \left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2} \left( 1 + \frac{\lambda_\mathrm{p} (y_{n+1} - m_\mathrm{p})^2}{\nu_\mathrm{p}} \right)^{-\nu_\mathrm{p}/2 - 1/2},

 .. math::
 \mathbb{E}[y_{n+1} | \boldsymbol{X}, \boldsymbol{y}, \boldsymbol{x}_{n+1}] &= m_\mathrm{p} & (\nu_\mathrm{p} > 1), \\
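
The corrected normalizing constant Gamma(nu_p/2 + 1/2) / Gamma(nu_p/2) is easy to verify by evaluating the density directly. This is an illustrative sketch, not the BayesML API; the parameter values passed in are assumed.

    # Minimal sketch (assumed values, not the BayesML API): the Student's t
    # predictive density as written in the corrected docstring.
    from math import gamma, pi

    def student_t_pdf(y, m_p, lambda_p, nu_p):
        coef = gamma(nu_p / 2 + 0.5) / gamma(nu_p / 2) * (lambda_p / (pi * nu_p)) ** 0.5
        return coef * (1 + lambda_p * (y - m_p) ** 2 / nu_p) ** (-(nu_p + 1) / 2)

    print(student_t_pdf(0.3, m_p=0.0, lambda_p=2.0, nu_p=4.0))  # assumed values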

bayesml/multivariate_normal/__init__.py

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@
 * :math:`\boldsymbol{x} \in \mathbb{R}^D`: a data point
 * :math:`\boldsymbol{\mu} \in \mathbb{R}^D`: a parameter
 * :math:`\boldsymbol{\Lambda} \in \mathbb{R}^{D\times D}` : a parameter (a positive definite matrix)
-* :math:`| \boldsymbol{\Lambda} | \in \mathbb{R}`: the determinant of :math:`\boldsymbol{\Lambda}_0`
+* :math:`| \boldsymbol{\Lambda} | \in \mathbb{R}`: the determinant of :math:`\boldsymbol{\Lambda}`

 .. math::
 p(\boldsymbol{x} | \boldsymbol{\mu}, \boldsymbol{\Lambda}) &= \mathcal{N}(\boldsymbol{x}|\boldsymbol{\mu},\boldsymbol{\Lambda}^{-1}) \\
@@ -51,7 +51,7 @@
 * :math:`\boldsymbol{x}^n = (\boldsymbol{x}_1, \boldsymbol{x}_2, \dots , \boldsymbol{x}_n) \in \mathbb{R}^{D\times n}`: given data
 * :math:`\boldsymbol{m}_n \in \mathbb{R}^{D}`: a hyperparameter
 * :math:`\kappa_n \in \mathbb{R}_{>0}`: a hyperparameter
-* :math:`\nu_n \in \mathbb{R}`: a hyperparameter :math:`(\nu_0 > D-1)`
+* :math:`\nu_n \in \mathbb{R}`: a hyperparameter :math:`(\nu_n > D-1)`
 * :math:`\boldsymbol{W}_n \in \mathbb{R}^{D\times D}`: a hyperparameter (a positive definite matrix)

 .. math::
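
The notational fix above concerns |Lambda|, the determinant of the precision matrix itself, which enters the density N(x | mu, Lambda^{-1}) shown in the context lines. The sketch below is illustrative only, not the BayesML API; the parameter values are assumed.

    # Minimal sketch (assumed values, not the BayesML API): evaluating
    # N(x | mu, Lambda^{-1}) using det(Lambda).
    import numpy as np

    D = 2
    mu = np.zeros(D)                          # mean vector (assumed)
    Lam = np.array([[2.0, 0.3], [0.3, 1.0]])  # precision matrix, positive definite (assumed)
    x = np.array([0.5, -0.2])                 # data point (assumed)

    diff = x - mu
    density = np.linalg.det(Lam) ** 0.5 / (2 * np.pi) ** (D / 2) * np.exp(-0.5 * diff @ Lam @ diff)
    print(density)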

bayesml/normal/__init__.py

Lines changed: 3 additions & 3 deletions
@@ -57,7 +57,7 @@
 \mathbb{E}[\mu | x^n] &= m_n & \left( \alpha_n > \frac{1}{2} \right), \\
 \mathbb{V}[\mu | x^n] &= \frac{\beta_n \alpha_n}{\alpha_n (\alpha_n - 1)} & (\alpha_n > 1), \\
 \mathbb{E}[\tau | x^n] &= \frac{\alpha_n}{\beta_n}, \\
-\mathbb{V}[\tau | x^n] &= \frac{\alpha_n}{\beta_n^2}.
+\mathbb{V}[\tau | x^n] &= \frac{\alpha_n}{\beta_n^2},

 where the updating rule of the hyperparameters is

@@ -66,7 +66,7 @@
 m_n &= \frac{\kappa_0 m_0 + n \bar{x}}{\kappa_0 + n}, \\
 \kappa_n &= \kappa_0 + n, \\
 \alpha_n &= \alpha_0 + \frac{n}{2}, \\
-\beta_n &= \beta_0 + \frac{1}{2} \left( \sum_{i=0}^n (x_i - \bar{x})^2 + \frac{\kappa_0 n}{\kappa_n + n} (\bar{x} - m_0)^2 \right).
+\beta_n &= \beta_0 + \frac{1}{2} \left( \sum_{i=1}^n (x_i - \bar{x})^2 + \frac{\kappa_0 n}{\kappa_n + n} (\bar{x} - m_0)^2 \right).

 The predictive distribution is as follows:

@@ -77,7 +77,7 @@

 .. math::
 p(x_{n+1} | x^{n} ) &= \mathrm{St}(x_{n+1} | \mu_\mathrm{p}, \lambda_\mathrm{p}, \nu_\mathrm{p}) \\
-&= \frac{\Gamma (\nu_\mathrm{p} / 2) + 1/2}{\Gamma (\nu_\mathrm{p} / 2)} \left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2} \left( 1 + \frac{\lambda_\mathrm{p} (x_{n+1} - \mu_\mathrm{p})^2}{\nu_\mathrm{p}} \right)^{-\nu_\mathrm{p}/2 - 1/2},
+&= \frac{\Gamma (\nu_\mathrm{p} / 2 + 1/2 )}{\Gamma (\nu_\mathrm{p} / 2)} \left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2} \left( 1 + \frac{\lambda_\mathrm{p} (x_{n+1} - \mu_\mathrm{p})^2}{\nu_\mathrm{p}} \right)^{-\nu_\mathrm{p}/2 - 1/2},

 .. math::
 \mathbb{E}[x_{n+1} | x^n] &= \mu_\mathrm{p} & (\nu_\mathrm{p} > 1), \\
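
The corrected hyperparameter update (sum starting at i = 1) can be reproduced directly. This is an illustrative sketch, not the BayesML API; the prior values and data are assumed, and the kappa_0 + n denominator in the last term is the standard Normal-Gamma form, used here as an assumption.

    # Minimal sketch (assumed values, not the BayesML API): Normal-Gamma
    # hyperparameter update for the normal model.
    import numpy as np

    m_0, kappa_0, alpha_0, beta_0 = 0.0, 1.0, 2.0, 2.0  # prior hyperparameters (assumed)
    x = np.array([1.2, 0.7, 1.9, 1.1])                  # observed data (assumed)
    n, x_bar = len(x), x.mean()

    m_n = (kappa_0 * m_0 + n * x_bar) / (kappa_0 + n)
    kappa_n = kappa_0 + n
    alpha_n = alpha_0 + n / 2
    beta_n = beta_0 + 0.5 * (np.sum((x - x_bar) ** 2)
                             + kappa_0 * n / (kappa_0 + n) * (x_bar - m_0) ** 2)
    print(m_n, kappa_n, alpha_n, beta_n)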

bayesml/poisson/__init__.py

Lines changed: 4 additions & 4 deletions
@@ -12,7 +12,7 @@
 * :math:`\lambda \in \mathbb{R}_{>0}`: a parameter

 .. math::
-p(x | \lambda) = \mathrm{Po}(x|\lambda) = \frac{ \lambda^{x} }{x!}\exp \{ -\lambda \}
+p(x | \lambda) = \mathrm{Po}(x|\lambda) = \frac{ \lambda^{x} }{x!}\exp \{ -\lambda \}.

 The prior distribution is as follows:

@@ -38,13 +38,13 @@

 .. math::
 \mathbb{E}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n}, \\
-\mathbb{V}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n^2}.
+\mathbb{V}[\lambda | x^n] &= \frac{\alpha_n}{\beta_n^2},

 where the updating rule of the hyperparameters is

 .. math::
-\alpha_n &= \alpha_0 + \sum_{i=1}^n x_i\\
-\beta_n &= \beta_0 + n
+\alpha_n &= \alpha_0 + \sum_{i=1}^n x_i,\\
+\beta_n &= \beta_0 + n.

 The predictive distribution is as follows:

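The punctuation-corrected updating rule for the Poisson model is a one-liner per hyperparameter. The sketch below is illustrative only, not the BayesML API; the data and prior values are assumed.

    # Minimal sketch (assumed values, not the BayesML API): Gamma-Poisson update
    # plus the posterior mean and variance of lambda.
    import numpy as np

    alpha_0, beta_0 = 1.0, 1.0                      # prior hyperparameters (assumed)
    x = np.array([3, 1, 4, 2, 2])                   # observed counts (assumed)

    alpha_n = alpha_0 + x.sum()                     # alpha_n = alpha_0 + sum_i x_i
    beta_n = beta_0 + len(x)                        # beta_n = beta_0 + n
    print(alpha_n / beta_n, alpha_n / beta_n ** 2)  # E[lambda | x^n], V[lambda | x^n]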

docs/bayesml.autoregressive.html

Lines changed: 2 additions & 2 deletions
@@ -336,14 +336,14 @@ <h1>bayesml.autoregressive package<a class="headerlink" href="#bayesml-autoregre
 <div class="math notranslate nohighlight">
 \[\mathrm{St}(x_{n+1}|m_\mathrm{p}, \lambda_\mathrm{p}, \nu_\mathrm{p})
 = \frac{\Gamma (\nu_\mathrm{p}/2 + 1/2)}{\Gamma (\nu_\mathrm{p}/2)}
-\left( \frac{m_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2}
+\left( \frac{\lambda_\mathrm{p}}{\pi \nu_\mathrm{p}} \right)^{1/2}
 \left[ 1 + \frac{\lambda_\mathrm{p}(x_{n+1}-m_\mathrm{p})^2}{\nu_\mathrm{p}} \right]^{-\nu_\mathrm{p}/2 - 1/2}.\]</div>
 <div class="math notranslate nohighlight">
 \[\begin{split}\mathbb{E}[x_{n+1} | x^n] &amp;= m_\mathrm{p} &amp; (\nu_\mathrm{p} &gt; 1), \\
 \mathbb{V}[x_{n+1} | x^n] &amp;= \frac{1}{\lambda_\mathrm{p}} \frac{\nu_\mathrm{p}}{\nu_\mathrm{p}-2} &amp; (\nu_\mathrm{p} &gt; 2),\end{split}\]</div>
 <p>where the parameters are obtained from the hyperparameters of the posterior distribution as follows.</p>
 <div class="math notranslate nohighlight">
-\[\begin{split}m_\mathrm{p} &amp;= \mu_n^\top \boldsymbol{x}'_n,\\
+\[\begin{split}m_\mathrm{p} &amp;= \boldsymbol{\mu}_n^\top \boldsymbol{x}'_n,\\
 \lambda_\mathrm{p} &amp;= \frac{\alpha_n}{\beta_n} (1 + (\boldsymbol{x}'_n)^\top \boldsymbol{\Lambda}_n^{-1} \boldsymbol{x}'_n)^{-1},\\
 \nu_\mathrm{p} &amp;= 2 \alpha_n.\end{split}\]</div>
 <dl class="py class">

docs/bayesml.bernoulli.html

Lines changed: 6 additions & 6 deletions
@@ -281,7 +281,7 @@ <h1>bayesml.bernoulli package<a class="headerlink" href="#bayesml-bernoulli-pack
 <li><p><span class="math notranslate nohighlight">\(B(\cdot,\cdot): \mathbb{R}_{&gt;0} \times \mathbb{R}_{&gt;0} \to \mathbb{R}_{&gt;0}\)</span>: the Beta function</p></li>
 </ul>
 <div class="math notranslate nohighlight">
-\[p(\theta) = \mathrm{Beta}(\theta|\alpha_0,\beta_0) = \frac{1}{B(\alpha_0, \beta_0)} \theta^{\alpha_0} (1-\theta)^{\beta_0}.\]</div>
+\[p(\theta) = \mathrm{Beta}(\theta|\alpha_0,\beta_0) = \frac{1}{B(\alpha_0, \beta_0)} \theta^{\alpha_0 - 1} (1-\theta)^{\beta_0 - 1}.\]</div>
 <div class="math notranslate nohighlight">
 \[\begin{split}\mathbb{E}[\theta] &amp;= \frac{\alpha_0}{\alpha_0 + \beta_0}, \\
 \mathbb{V}[\theta] &amp;= \frac{\alpha_0 \beta_0}{(\alpha_0 + \beta_0)^2 (\alpha_0 + \beta_0 + 1)}.\end{split}\]</div>
@@ -292,10 +292,10 @@ <h1>bayesml.bernoulli package<a class="headerlink" href="#bayesml-bernoulli-pack
 <li><p><span class="math notranslate nohighlight">\(\beta_n \in \mathbb{R}_{&gt;0}\)</span>: a hyperparameter</p></li>
 </ul>
 <div class="math notranslate nohighlight">
-\[p(\theta | x^n) = \mathrm{Beta}(\theta|\alpha_n,\beta_n) = \frac{1}{B(\alpha_n, \beta_n)} \theta^{\alpha_n} (1-\theta)^{\beta_n},\]</div>
+\[p(\theta | x^n) = \mathrm{Beta}(\theta|\alpha_n,\beta_n) = \frac{1}{B(\alpha_n, \beta_n)} \theta^{\alpha_n - 1} (1-\theta)^{\beta_n - 1},\]</div>
 <div class="math notranslate nohighlight">
 \[\begin{split}\mathbb{E}[\theta | x^n] &amp;= \frac{\alpha_n}{\alpha_n + \beta_n}, \\
-\mathbb{V}[\theta | x^n] &amp;= \frac{\alpha_n \beta_n}{(\alpha_n + \beta_n)^2 (\alpha_n + \beta_n + 1)}.\end{split}\]</div>
+\mathbb{V}[\theta | x^n] &amp;= \frac{\alpha_n \beta_n}{(\alpha_n + \beta_n)^2 (\alpha_n + \beta_n + 1)},\end{split}\]</div>
 <p>where the updating rule of the hyperparameters is</p>
 <div class="math notranslate nohighlight">
 \[\begin{split}\alpha_n = \alpha_0 + \sum_{i=1}^n I \{ x_i = 1 \},\\
@@ -308,13 +308,13 @@ <h1>bayesml.bernoulli package<a class="headerlink" href="#bayesml-bernoulli-pack
 <li><p><span class="math notranslate nohighlight">\(\theta_\mathrm{p} \in [0,1]\)</span>: a parameter</p></li>
 </ul>
 <div class="math notranslate nohighlight">
-\[p(x_{n+1} | x^n) = \mathrm{Bern}(x_{n+1}|\theta_\mathrm{p}) =\theta_\mathrm{p}^{x_{n+1}}(1-\theta_\mathrm{p})^{1-x_{n+1}}\]</div>
+\[p(x_{n+1} | x^n) = \mathrm{Bern}(x_{n+1}|\theta_\mathrm{p}) =\theta_\mathrm{p}^{x_{n+1}}(1-\theta_\mathrm{p})^{1-x_{n+1}},\]</div>
 <div class="math notranslate nohighlight">
 \[\begin{split}\mathbb{E}[x_{n+1} | x^n] &amp;= \theta_\mathrm{p}, \\
-\mathbb{V}[x_{n+1} | x^n] &amp;= \theta_\mathrm{p} (1 - \theta_\mathrm{p}).\end{split}\]</div>
+\mathbb{V}[x_{n+1} | x^n] &amp;= \theta_\mathrm{p} (1 - \theta_\mathrm{p}),\end{split}\]</div>
 <p>where the parameters are obtained from the hyperparameters of the posterior distribution as follows.</p>
 <div class="math notranslate nohighlight">
-\[\theta_\mathrm{p} = \frac{\alpha_n}{\alpha_n + \beta_n}\]</div>
+\[\theta_\mathrm{p} = \frac{\alpha_n}{\alpha_n + \beta_n}.\]</div>
 <dl class="py class">
 <dt class="sig sig-object py" id="bayesml.bernoulli.GenModel">
 <em class="property"><span class="pre">class</span> </em><span class="sig-prename descclassname"><span class="pre">bayesml.bernoulli.</span></span><span class="sig-name descname"><span class="pre">GenModel</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">theta</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">0.5</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">h_alpha</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">0.5</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">h_beta</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">0.5</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">seed</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#bayesml.bernoulli.GenModel" title="Permalink to this definition"></a></dt>
