Commit 29c875e
Update wangchanberta.rst
1 parent 2ac1374

File tree

1 file changed: +5 -5 lines changed

docs/api/wangchanberta.rst

Lines changed: 5 additions & 5 deletions
@@ -2,11 +2,11 @@
 
 pythainlp.wangchanberta
 =======================
-The `pythainlp.wangchanberta` module is built upon the WangchanBERTa base model, specifically the `wangchanberta-base-att-spm-uncased` model, as detailed in the paper by Lowphansirikul et al. [^Lowphansirikul_2021].
+The `pythainlp.wangchanberta` module is built upon the WangchanBERTa base model, specifically the `wangchanberta-base-att-spm-uncased` model, as detailed in the paper by Lowphansirikul et al. [#Lowphansirikul_2021]_.
 
 This base model is utilized for various natural language processing tasks in the Thai language, including named entity recognition, part-of-speech tagging, and subword tokenization.
 
-If you intend to fine-tune the model or explore its capabilities further, please refer to the [thai2transformers repository](https://github.com/vistec-AI/thai2transformers).
+If you intend to fine-tune the model or explore its capabilities further, please refer to the `thai2transformers repository <https://github.com/vistec-AI/thai2transformers>`_.
 
 **Speed Benchmark**
 

@@ -24,8 +24,8 @@ For a comprehensive performance benchmark, the following notebooks are available
   Colab`_
 - `pythainlp.wangchanberta GPU`_
 
-.. _PyThaiNLP basic function and pythainlp.wangchanberta CPU at Google Colab: https://colab.research.google.com/drive/1ymTVB1UESXAyZlSpjknCb72xpdcZ86Db?usp=sharing
-.. _pythainlp.wangchanberta GPU: https://colab.research.google.com/drive/1AtkFT1HMGL2GO7O2tM_hi_7mExKwmhMw?usp=sharing
+.. _PyThaiNLP basic function and pythainlp.wangchanberta CPU at Google Colab: `https://colab.research.google.com/drive/1ymTVB1UESXAyZlSpjknCb72xpdcZ86Db?usp=sharing <https://colab.research.google.com/drive/1ymTVB1UESXAyZlSpjknCb72xpdcZ86Db?usp=sharing>`_
+.. _pythainlp.wangchanberta GPU: `https://colab.research.google.com/drive/1AtkFT1HMGL2GO7O2tM_hi_7mExKwmhMw?usp=sharing <https://colab.research.google.com/drive/1AtkFT1HMGL2GO7O2tM_hi_7mExKwmhMw?usp=sharing>`_
 
 Modules
 -------
@@ -47,4 +47,4 @@ Modules
 References
 ----------
 
-[^Lowphansirikul_2021] Lowphansirikul L, Polpanumas C, Jantrakulchai N, Nutanong S. WangchanBERTa: Pretraining transformer-based Thai Language Models. [ArXiv:2101.09635](http://arxiv.org/abs/2101.09635) [Internet]. 2021 Jan 23 [cited 2021 Feb 27].
+.. [#Lowphansirikul_2021] Lowphansirikul L, Polpanumas C, Jantrakulchai N, Nutanong S. WangchanBERTa: Pretraining transformer-based Thai Language Models. `ArXiv:2101.09635 <http://arxiv.org/abs/2101.09635>`_ [Internet]. 2021 Jan 23 [cited 2021 Feb 27].
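All five changed lines in this commit follow the same pattern: Markdown link and footnote syntax replaced with the reStructuredText equivalents, which is what Sphinx expects in an `.rst` file. As a general side-by-side of the two notations (standard rST syntax, shown here for orientation rather than taken verbatim from the commit):

```rst
.. Inline hyperlink
   Markdown:  [thai2transformers repository](https://github.com/vistec-AI/thai2transformers)
   rST:       `thai2transformers repository <https://github.com/vistec-AI/thai2transformers>`_

.. Labeled (autonumbered) footnote
   Markdown extension:  [^Lowphansirikul_2021]        (reference in text)
                        [^Lowphansirikul_2021]: ...   (definition)
   rST:                 [#Lowphansirikul_2021]_       (reference in text)
                        .. [#Lowphansirikul_2021] ... (definition)
```

In rST, the trailing underscore marks a hyperlink or footnote reference, and `.. [#label]` opens the matching footnote directive, which is why both the in-text citation and the References entry had to change together.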
