
Commit 3298304

Conchylicultor authored and copybara-github committed
Automated documentation update
PiperOrigin-RevId: 278923614
1 parent da6993c commit 3298304

File tree

1 file changed, +0 -2 lines changed


docs/catalog/wikipedia.md

Lines changed: 0 additions & 2 deletions
@@ -2,14 +2,12 @@
 <div itemscope itemprop="includedInDataCatalog" itemtype="http://schema.org/DataCatalog">
 <meta itemprop="name" content="TensorFlow Datasets" />
 </div>
-
 <meta itemprop="name" content="wikipedia" />
 <meta itemprop="description" content="Wikipedia dataset containing cleaned articles of all languages.&#10;The datasets are built from the Wikipedia dump&#10;(https://dumps.wikimedia.org/) with one split per language. Each example&#10;contains the content of one full Wikipedia article with cleaning to strip&#10;markdown and unwanted sections (references, etc.).&#10;&#10;&#10;To use this dataset:&#10;&#10;```python&#10;import tensorflow_datasets as tfds&#10;&#10;ds = tfds.load('wikipedia', split='train')&#10;for ex in ds.take(4):&#10; print(ex)&#10;```&#10;&#10;See [the guide](https://www.tensorflow.org/datasets/overview) for more&#10;informations on [tensorflow_datasets](https://www.tensorflow.org/datasets).&#10;&#10;" />
 <meta itemprop="url" content="https://www.tensorflow.org/datasets/catalog/wikipedia" />
 <meta itemprop="sameAs" content="https://dumps.wikimedia.org" />
 <meta itemprop="citation" content="@ONLINE {wikidump,&#10; author = &quot;Wikimedia Foundation&quot;,&#10; title = &quot;Wikimedia Downloads&quot;,&#10; url = &quot;https://dumps.wikimedia.org&quot;&#10;}&#10;" />
 </div>
-
 # `wikipedia`
 
 Wikipedia dataset containing cleaned articles of all languages. The datasets are
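The meta attributes in the diff above store multi-line text inside single HTML attributes by encoding newlines and quotes as character references (`&#10;`, `&quot;`). A minimal sketch of how such an escaped value decodes back to plain text, using Python's standard `html` module (the excerpt below is shortened from the description attribute; the variable names are illustrative):

```python
import html

# Shortened excerpt of the escaped itemprop="description" attribute value.
escaped = (
    "Wikipedia dataset containing cleaned articles of all languages."
    "&#10;The datasets are built from the Wikipedia dump&#10;"
    "(https://dumps.wikimedia.org/) with one split per language."
)

# html.unescape resolves character references: &#10; -> newline, &quot; -> ".
decoded = html.unescape(escaped)
print(decoded)
```

Each `&#10;` becomes a literal newline, which is how the multi-line description, including its embedded `python` code fence, survives inside a single-line `content="..."` attribute.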

0 commit comments
