docs/api/wangchanberta.rst (5 additions, 5 deletions)
@@ -2,11 +2,11 @@
 
 pythainlp.wangchanberta
 =======================
-The `pythainlp.wangchanberta` module is built upon the WangchanBERTa base model, specifically the `wangchanberta-base-att-spm-uncased` model, as detailed in the paper by Lowphansirikul et al. [^Lowphansirikul_2021].
+The `pythainlp.wangchanberta` module is built upon the WangchanBERTa base model, specifically the `wangchanberta-base-att-spm-uncased` model, as detailed in the paper by Lowphansirikul et al. [#Lowphansirikul_2021]_.
 
 This base model is utilized for various natural language processing tasks in the Thai language, including named entity recognition, part-of-speech tagging, and subword tokenization.
 
-If you intend to fine-tune the model or explore its capabilities further, please refer to the [thai2transformers repository](https://github.com/vistec-AI/thai2transformers).
+If you intend to fine-tune the model or explore its capabilities further, please refer to the `thai2transformers repository <https://github.com/vistec-AI/thai2transformers>`_.
 
 **Speed Benchmark**
 
@@ -24,8 +24,8 @@ For a comprehensive performance benchmark, the following notebooks are available
 - `PyThaiNLP basic function and pythainlp.wangchanberta CPU at Google Colab`_
 - `pythainlp.wangchanberta GPU`_
 
 .. _PyThaiNLP basic function and pythainlp.wangchanberta CPU at Google Colab: https://colab.research.google.com/drive/1ymTVB1UESXAyZlSpjknCb72xpdcZ86Db?usp=sharing
-[^Lowphansirikul_2021] Lowphansirikul L, Polpanumas C, Jantrakulchai N, Nutanong S. WangchanBERTa: Pretraining transformer-based Thai Language Models. [ArXiv:2101.09635](http://arxiv.org/abs/2101.09635) [Internet]. 2021 Jan 23 [cited 2021 Feb 27].
+.. [#Lowphansirikul_2021] Lowphansirikul L, Polpanumas C, Jantrakulchai N, Nutanong S. WangchanBERTa: Pretraining transformer-based Thai Language Models. `ArXiv:2101.09635 <http://arxiv.org/abs/2101.09635>`_ [Internet]. 2021 Jan 23 [cited 2021 Feb 27].
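
The diff above documents the module's named entity recognition, part-of-speech tagging, and subword tokenization roles. A minimal sketch of exercising two of those tasks follows, assuming the `ThaiNameTagger` class and `segment` function exposed by the PyThaiNLP 2.x API (names and signatures may differ between releases):

```python
# Minimal sketch of Thai NER and subword tokenization via
# pythainlp.wangchanberta. ThaiNameTagger and segment are assumed from
# the PyThaiNLP 2.x API; exact names may vary between releases.
# Requires: pip install pythainlp transformers sentencepiece
from pythainlp.wangchanberta import ThaiNameTagger, segment

text = "ผมทำงานที่กรุงเทพฯ"  # "I work in Bangkok."

# Named entity recognition backed by the WangchanBERTa base model
ner = ThaiNameTagger()
print(ner.get_ner(text))

# Subword tokenization with the model's SentencePiece tokenizer
print(segment(text))
```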
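For the fine-tuning path the diff points at, the base checkpoint can also be loaded straight from the Hugging Face Hub; the thai2transformers repository holds the full recipes. A sketch, assuming the public `airesearch/wangchanberta-base-att-spm-uncased` model ID (`CamembertTokenizer` matches the checkpoint's SentencePiece vocabulary):

```python
# Sketch: load the wangchanberta-base-att-spm-uncased checkpoint as a
# starting point for fine-tuning. The model ID below is assumed to be
# the public Hugging Face release of this checkpoint.
# Requires: pip install transformers sentencepiece torch
from transformers import AutoModelForMaskedLM, CamembertTokenizer

model_id = "airesearch/wangchanberta-base-att-spm-uncased"
tokenizer = CamembertTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("สวัสดีครับ", return_tensors="pt")  # "Hello"
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence length, vocab size)
```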