Commit de4d369
Merge pull request #48 from yuta-nakahara/release
Release 0.2.0
2 parents: cb49b4c + e1033b3

86 files changed (+5709 / -485 lines)


README.md

Lines changed: 18 additions & 18 deletions

@@ -23,7 +23,7 @@

 Please use the following commands to install BayesML.

-``` shell
+``` bash
 pip install bayesml
 ```

@@ -33,6 +33,7 @@ The following are required.
 * NumPy (>= 1.20)
 * SciPy (>= 1.7)
 * MatplotLib (>= 3.5)
+* Scikit-learn (>= 1.1)

 ## Example

@@ -53,11 +54,11 @@ gen_model.visualize_model()
 ```

 >theta:0.7
->x0:[1 1 1 1 1 0 1 0 0 1 1 1 1 0 1 1 0 1 1 1]
->x1:[1 0 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1]
->x2:[0 0 1 1 0 1 0 1 1 1 1 1 1 0 1 0 1 1 1 1]
->x3:[1 0 1 1 1 1 1 0 0 0 1 0 0 1 0 1 1 0 1 0]
->x4:[1 1 0 1 0 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1]
+>x0:[1 1 1 0 1 1 1 0 1 1 1 1 1 1 0 1 1 1 0 1]
+>x1:[1 1 0 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 0]
+>x2:[1 0 1 1 0 1 1 1 0 1 1 1 1 1 0 0 1 1 1 1]
+>x3:[1 1 1 0 1 1 0 1 0 0 0 0 1 0 1 1 1 1 1 1]
+>x4:[0 0 1 0 0 0 1 1 1 1 1 1 1 1 0 0 1 1 1 1]
 >![bernoulli_example1](./doc/images/README_ex_img1.png)

 After confirming that the frequency of occurrence of 1 is around `theta=0.7`, we generate a sample and store it to variable `x`.

@@ -99,9 +100,9 @@ print(learn_model.estimate_params(loss='abs'))
 print(learn_model.estimate_params(loss='0-1'))
 ```

->0.6428571428571429
->0.6474720009710451
->0.6578947368421053
+>0.7380952380952381
+>0.7457656349087012
+>0.7631578947368421

 Different settings of a loss function yield different optimal estimates.

@@ -115,8 +116,12 @@
 * [Normal model](https://yuta-nakahara.github.io/BayesML/bayesml.normal.html "BayesML Normal Model")
 * [Multivariate normal model](https://yuta-nakahara.github.io/BayesML/bayesml.multivariate_normal.html "BayesML Multivariate Normal Model")
 * [Exponential model](https://yuta-nakahara.github.io/BayesML/bayesml.exponential.html "BayesML Exponential Model")
+* [Gaussian mixture model](https://yuta-nakahara.github.io/BayesML/bayesml.gaussianmixture.html "BayesML Gaussian Mixture Model")
 * [Linear regression model](https://yuta-nakahara.github.io/BayesML/bayesml.linearregression.html "BayesML Linear Regression Model")
+* [Meta-tree model](https://yuta-nakahara.github.io/BayesML/bayesml.metatree.html "BayesML Meta-tree Model")
 * [Autoregressive model](https://yuta-nakahara.github.io/BayesML/bayesml.autoregressive.html "BayesML Autoregressive Model")
+* [Hidden Markov normal model](https://yuta-nakahara.github.io/BayesML/bayesml.hiddenmarkovnormal.html "BayesML Hidden Markov Normal Model")
+* [Context tree model](https://yuta-nakahara.github.io/BayesML/bayesml.contexttree.html "BayesML Context Tree Model")

 In the future, we will add packages to deal with a mixture normal model and a hidden Markov model, which are difficult to perform exact Bayesian inference, by using variational Bayes methods.

@@ -131,11 +136,8 @@
 Plain text

 ```
-Y. Nakahara, N. Ichijo, K. Shimada,
-K. Tajima, K. Horinouchi, L. Ruan,
-N. Namegaya, R. Maniwa, T. Ishiwatari,
-W. Yu, Y. Iikubo, S. Saito,
-K. Kazama, T. Matsushima, ``BayesML,''
+Y. Nakahara, N. Ichijo, K. Shimada, Y. Iikubo,
+S. Saito, K. Kazama, T. Matsushima, ``BayesML 0.2.0,''
 [Online] https://github.com/yuta-nakahara/BayesML
 ```

@@ -144,11 +146,9 @@ BibTeX
 ``` bibtex
 @misc{bayesml,
   author = {Nakahara Yuta and Ichijo Naoki and Shimada Koshi and
-            Tajima Keito and Horinouchi Kohei and Ruan Luyu and
-            Namegaya Noboru and Maniwa Ryota and Ishiwatari Taisuke and
-            Yu Wenbin and Iikubo Yuji and Saito Shota and Kazama Koki and
+            Iikubo Yuji and Saito Shota and Kazama Koki and
             Matsushima Toshiyasu}
-  title = {BayesML},
+  title = {BayesML 0.2.0},
   howpublished = {\url{https://github.com/yuta-nakahara/BayesML}},
   year = {2022}
 }
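The three numbers printed above are point estimates from the same posterior under different losses. As a hedged, self-contained sketch (not BayesML's implementation; the counts and the uniform prior below are illustrative assumptions, not the values behind the README's output): for a Beta(a, b) posterior over `theta`, squared-error loss is minimized by the posterior mean, absolute-error loss by the posterior median, and 0-1 loss by the posterior mode (MAP).

``` python
# Bayes-optimal point estimates for a Beta(a, b) posterior under three losses.
# Hypothetical counts and a uniform prior, for illustration only.

def beta_mean(a, b):
    # minimizes posterior expected squared error
    return a / (a + b)

def beta_mode(a, b):
    # minimizes posterior expected 0-1 loss (MAP); requires a > 1 and b > 1
    return (a - 1) / (a + b - 2)

def beta_median_approx(a, b):
    # closed-form approximation of the median (good for a, b > 1);
    # the exact median minimizes posterior expected absolute error
    return (a - 1/3) / (a + b - 2/3)

# e.g. 30 ones and 10 zeros observed under a uniform Beta(1, 1) prior
a, b = 1 + 30, 1 + 10
print(beta_mean(a, b))           # posterior mean, ~0.738
print(beta_median_approx(a, b))  # approximate median, ~0.742
print(beta_mode(a, b))           # MAP, 0.75
```

For a right-skewed posterior like this one, mean < median < mode, which is the same qualitative spread the README's three outputs show.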

README_jp.md

Lines changed: 17 additions & 17 deletions

@@ -22,7 +22,7 @@ BayesML has the following features.

 It can be installed with the following command.

-``` shell
+``` bash
 pip install bayesml
 ```

@@ -32,6 +32,7 @@ Running BayesML requires the following.
 * NumPy (>= 1.20)
 * SciPy (>= 1.7)
 * MatplotLib (>= 3.5)
+* Scikit-learn (>= 1.1)

 ## Example

@@ -52,11 +53,11 @@ gen_model.visualize_model()
 ```

 >theta:0.7
->x0:[1 1 1 1 1 0 1 0 0 1 1 1 1 0 1 1 0 1 1 1]
->x1:[1 0 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1]
->x2:[0 0 1 1 0 1 0 1 1 1 1 1 1 0 1 0 1 1 1 1]
->x3:[1 0 1 1 1 1 1 0 0 0 1 0 0 1 0 1 1 0 1 0]
->x4:[1 1 0 1 0 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1]
+>x0:[1 1 1 0 1 1 1 0 1 1 1 1 1 1 0 1 1 1 0 1]
+>x1:[1 1 0 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 0]
+>x2:[1 0 1 1 0 1 1 1 0 1 1 1 1 1 0 0 1 1 1 1]
+>x3:[1 1 1 0 1 1 0 1 0 0 0 0 1 0 1 1 1 1 1 1]
+>x4:[0 0 1 0 0 0 1 1 1 1 1 1 1 1 0 0 1 1 1 1]
 >![bernoulli_example1](./doc/images/README_ex_img1.png)

 After confirming that the frequency of 1s is around `theta=0.7`, we generate a sample and store it in the variable `x`.

@@ -96,9 +97,9 @@ print(learn_model.estimate_params(loss='abs'))
 print(learn_model.estimate_params(loss='0-1'))
 ```

->0.6428571428571429
->0.6474720009710451
->0.6578947368421053
+>0.7380952380952381
+>0.7457656349087012
+>0.7631578947368421

 Different settings of the loss function yield different optimal estimates.

@@ -112,8 +113,12 @@
 * [Normal model](https://yuta-nakahara.github.io/BayesML/bayesml.normal.html "BayesML Normal Model")
 * [Multivariate normal model](https://yuta-nakahara.github.io/BayesML/bayesml.multivariate_normal.html "BayesML Multivariate Normal Model")
 * [Exponential model](https://yuta-nakahara.github.io/BayesML/bayesml.exponential.html "BayesML Exponential Model")
+* [Gaussian mixture model](https://yuta-nakahara.github.io/BayesML/bayesml.gaussianmixture.html "BayesML Gaussian Mixture Model")
 * [Linear regression model](https://yuta-nakahara.github.io/BayesML/bayesml.linearregression.html "BayesML Linear Regression Model")
+* [Meta-tree model](https://yuta-nakahara.github.io/BayesML/bayesml.metatree.html "BayesML Meta-tree Model")
 * [Autoregressive model](https://yuta-nakahara.github.io/BayesML/bayesml.autoregressive.html "BayesML Autoregressive Model")
+* [Hidden Markov normal model](https://yuta-nakahara.github.io/BayesML/bayesml.hiddenmarkovnormal.html "BayesML Hidden Markov Normal Model")
+* [Context tree model](https://yuta-nakahara.github.io/BayesML/bayesml.contexttree.html "BayesML Context Tree Model")

 In the future, packages that use variational Bayes methods to learn models for which exact Bayesian inference is difficult, such as Gaussian mixture models and hidden Markov models, will be added.

@@ -128,11 +133,8 @@
 Plain text

 ```
-Y. Nakahara, N. Ichijo, K. Shimada,
-K. Tajima, K. Horinouchi, L. Ruan,
-N. Namegaya, R. Maniwa, T. Ishiwatari,
-W. Yu, Y. Iikubo, S. Saito,
-K. Kazama, T. Matsushima, ``BayesML,''
+Y. Nakahara, N. Ichijo, K. Shimada, Y. Iikubo,
+S. Saito, K. Kazama, T. Matsushima, ``BayesML,''
 [Online] https://github.com/yuta-nakahara/BayesML
 ```

@@ -141,9 +143,7 @@ BibTeX
 ``` bibtex
 @misc{bayesml,
   author = {Nakahara Yuta and Ichijo Naoki and Shimada Koshi and
-            Tajima Keito and Horinouchi Kohei and Ruan Luyu and
-            Namegaya Noboru and Maniwa Ryota and Ishiwatari Taisuke and
-            Yu Wenbin and Iikubo Yuji and Saito Shota and Kazama Koki and
+            Iikubo Yuji and Saito Shota and Kazama Koki and
             Matsushima Toshiyasu}
   title = {BayesML},
   howpublished = {\url{https://github.com/yuta-nakahara/BayesML}},

bayesml/__init__.py

Lines changed: 7 additions & 1 deletion

@@ -7,6 +7,9 @@
 from . import normal
 from . import poisson
 from . import metatree
+from . import contexttree
+from . import gaussianmixture
+from . import hiddenmarkovnormal

 __all__ = ['bernoulli',
            'categorical',
@@ -16,5 +19,8 @@
            'multivariate_normal',
            'normal',
            'poisson',
-           'metatree'
+           'metatree',
+           'contexttree',
+           'gaussianmixture',
+           'hiddenmarkovnormal',
            ]

bayesml/bernoulli/_bernoulli.py

Lines changed: 1 addition & 1 deletion

@@ -368,7 +368,7 @@ def visualize_posterior(self):
     p_range = np.linspace(0,1,100,endpoint=False)
     fig, ax = plt.subplots()
     ax.plot(p_range,self.estimate_params(loss="KL").pdf(p_range))
-    ax.set_xlabel("p_theta")
+    ax.set_xlabel("theta")
     ax.set_ylabel("posterior")
     plt.show()

bayesml/categorical/_categorical.py

Lines changed: 1 addition & 1 deletion

@@ -349,7 +349,7 @@ def update_posterior(self, x):
         2-dimensional array whose shape is ``(sample_size,degree)`` whose rows are one-hot vectors.
     """
     _check.onehot_vecs(x,'x',DataFormatError)
-    if self.degree > 1 and x.shape[-1] != self.degree:
+    if x.shape[-1] != self.degree:
         raise(DataFormatError(f"x.shape[-1] must be degree:{self.degree}"))
     x = x.reshape(-1,self.degree)
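The old guard `self.degree > 1 and x.shape[-1] != self.degree` short-circuited whenever `degree == 1`, so a mismatched last axis could slip through unchecked; the new guard validates unconditionally. A hedged pure-Python sketch of the behavior change (hypothetical helper, not the library's `_check` module):

``` python
# Sketch of the old vs. new shape guard from update_posterior.
# check_last_axis and its signature are illustrative, not BayesML API.

class DataFormatError(Exception):
    pass

def check_last_axis(shape, degree, old=False):
    if old:
        # old condition: check is skipped entirely when degree == 1
        ok = not (degree > 1 and shape[-1] != degree)
    else:
        # new condition: last axis must always equal degree
        ok = shape[-1] == degree
    if not ok:
        raise DataFormatError(f"x.shape[-1] must be degree:{degree}")

check_last_axis((5, 3), 3)            # valid under both guards
check_last_axis((5, 3), 1, old=True)  # old guard silently accepted this
try:
    check_last_axis((5, 3), 1)        # new guard rejects it
except DataFormatError:
    print("rejected")
```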

bayesml/contexttree/__init__.py

Lines changed: 5 additions & 5 deletions

@@ -54,7 +54,7 @@
 where the updating rule of the hyperparameter is as follows:

 .. math::
-    \beta_n(k|s) = \beta_0(k|s) + \sum_{i=1}^n I \left\{ \text{:math:`s` is the ancestor of :math:`s_{T_\mathrm{max}}(x^{i-1})` and :math:`x_i=k` } \right\}.
+    \beta_n(k|s) = \beta_0(k|s) + \sum_{i=1}^n I \left\{ s \ \mathrm{is \ the \ ancestor \ of} \ s_{T_\mathrm{max}}(x^{i-1}) \ \mathrm{and} \ x_i=k \right\}.

 For :math:`T \in \mathcal{T}`,

@@ -66,18 +66,18 @@
 .. math::
     g_{n,s} =
     \begin{cases}
-        g_{0,s} & \text{if :math:`n=0`}, \\
+        g_{0,s}, & n=0, \\
         \frac{ g_{n-1,s} \tilde{q}_{s_{\mathrm{child}}} (x_n|x^{n-1}) }
-        { \tilde{q}_s(x_n|x^{n-1}) } & \text{otherwise},
+        { \tilde{q}_s(x_n|x^{n-1}) } & \mathrm{otherwise},
     \end{cases}

 where :math:`s_{\mathrm{child}}` is the child node of :math:`s` on the path from :math:`s_\lambda` to :math:`s_{T_\mathrm{max}}(x^n)` and

 .. math::
     \tilde{q}_s(x_n|x^{n-1}) =
     \begin{cases}
-        q_s(x_n|x^{n-1}) & \text{if :math:`s\in\mathcal{L}(T_\mathrm{max})`}, \\
-        (1-g_{n-1,s}) q_s(x_n|x^{n-1}) + g_{n-1,s} \tilde{q}_{s_{\mathrm{child}}}(x_n|x^{n-1}) & \text{otherwise}.
+        q_s(x_n|x^{n-1}) & s\in\mathcal{L}(T_\mathrm{max}), \\
+        (1-g_{n-1,s}) q_s(x_n|x^{n-1}) + g_{n-1,s} \tilde{q}_{s_{\mathrm{child}}}(x_n|x^{n-1}) & \mathrm{otherwise}.
     \end{cases}

 Here,
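The first update rule in this docstring is a counting rule: :math:`\beta_n(k|s)` accumulates, over :math:`i = 1, \dots, n`, the number of times symbol :math:`k` was emitted while node :math:`s` lay on the context path of :math:`x^{i-1}`. A hedged pure-Python sketch, not the library's internal representation (encoding a context node as a tuple of recent symbols, most recent first, is an assumption for illustration):

``` python
# Sketch of the hyperparameter update
#   beta_n(k|s) = beta_0(k|s) + #{ i : s is an ancestor of the deepest
#                                  context of x^{i-1} and x_i = k }.
from collections import defaultdict

def update_beta(x, d_max=3, beta0=0.5):
    # beta[s][k]: s is a tuple of past symbols (most recent first)
    beta = defaultdict(lambda: defaultdict(lambda: beta0))
    for i, k in enumerate(x):
        # the deepest admissible context of x^{i-1} and all its ancestors,
        # i.e. suffixes of length 0, 1, ..., min(d_max, i)
        for d in range(min(d_max, i) + 1):
            s = tuple(x[i - d:i][::-1])  # ancestor context at depth d
            beta[s][k] += 1
    return beta

beta = update_beta([1, 0, 1, 1, 0])
# the root context () is an ancestor of every context, so it counts
# every symbol: beta0 + 3 ones and beta0 + 2 zeros
print(beta[()][1], beta[()][0])  # 3.5 2.5
```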

bayesml/contexttree/_contexttree.py

Lines changed: 8 additions & 8 deletions

@@ -191,7 +191,7 @@
     tmp_p_v = p_v

     # add node information
-    label_string = f'h_g={node.h_g:.2f}\\lp_v={tmp_p_v:.2f}\\ltheta_vec='
+    label_string = f'h_g={node.h_g:.2f}\\lp_v={tmp_p_v:.2f}\\ltheta_vec\\l='
     if node.leaf:
         label_string += '['
         for i in range(self.c_k):

@@ -388,16 +388,16 @@ def visualize_model(self,filename=None,format=None,sample_length=10):
     Examples
     --------
     >>> from bayesml import contexttree
-    >>> model = contexttree.GenModel(c_k=2,c_d_max=3,h_g=0.75)
+    >>> gen_model = contexttree.GenModel(c_k=2,c_d_max=3,h_g=0.75)
     >>> gen_model.gen_params()
-    >>> model.visualize_model()
-    [1 1 1 1 1 1 0 0 0 1]
+    >>> gen_model.visualize_model()
+    [1 0 1 0 0 0 1 0 0 0]

     .. image:: ./images/contexttree_example.png

     See Also
     --------
-    graphbiz.Digraph
+    graphviz.Digraph
     """
     # exception handling
     _check.pos_int(sample_length,'sample_length',DataFormatError)

@@ -781,7 +781,7 @@ def estimate_params(self,loss="0-1",visualize=True,filename=None,format=None):

     See Also
     --------
-    graphbiz.Digraph
+    graphviz.Digraph
     """

     if loss == "0-1":

@@ -820,7 +820,7 @@
     tmp_p_v = p_v

     # add node information
-    label_string = f'hn_g={node.hn_g:.2f}\\lp_v={tmp_p_v:.2f}\\ltheta_vec='
+    label_string = f'hn_g={node.hn_g:.2f}\\lp_v={tmp_p_v:.2f}\\ltheta_vec\\l='
     label_string += '['
     for i in range(self.c_k):
         theta_vec_hat = node.hn_beta_vec / node.hn_beta_vec.sum()

@@ -869,7 +869,7 @@ def visualize_posterior(self,filename=None,format=None):

     See Also
     --------
-    graphbiz.Digraph
+    graphviz.Digraph
     """
     try:
         import graphviz
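The `label_string` changes above concern Graphviz DOT escape sequences: `\l` terminates a left-justified line in a node label, so moving it after `theta_vec` puts the vector on its own line in the rendered tree node. A minimal sketch of the two strings (the node values are hypothetical; graphviz itself is not needed to see the difference):

``` python
# '\\l' in the f-string is a literal backslash-l, Graphviz's left-justified
# line break. The fix breaks the label line after 'theta_vec' so the vector
# renders below it instead of overflowing the same line.
h_g, p_v = 0.75, 0.50  # hypothetical node values
old = f'h_g={h_g:.2f}\\lp_v={p_v:.2f}\\ltheta_vec='
new = f'h_g={h_g:.2f}\\lp_v={p_v:.2f}\\ltheta_vec\\l='
print(old)  # h_g=0.75\lp_v=0.50\ltheta_vec=
print(new)  # h_g=0.75\lp_v=0.50\ltheta_vec\l=
```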
