docs/doc/21-load-data/04-http.md (3 additions, 3 deletions)
@@ -7,7 +7,7 @@ description:

This tutorial explains how to load data into a table from remote files.

-The [COPY INTO `<table>` FROM REMOTE FILES](../30-reference/30-sql/10-dml/dml-copy-into-table-url.md) command allows you to load data into a table from one or more remote files by their URL. The supported file types include CSV, JSON, NDJSON, and PARQUET.
+The [COPY INTO `<table>` FROM REMOTE FILES](../30-reference/30-sql/10-dml/dml-copy-into-table.md) command allows you to load data into a table from one or more remote files by their URL. The supported file types include CSV, JSON, NDJSON, and PARQUET.

### Before You Begin
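For a concrete feel of the command described above, here is a minimal sketch; the table schema and URL are illustrative placeholders, not part of this file:

```sql
-- Create a target table, then load a remote CSV file into it by URL.
-- 'https://example.com/data/books.csv' is a hypothetical location.
CREATE TABLE books (title VARCHAR, author VARCHAR, date VARCHAR);

COPY INTO books
  FROM 'https://example.com/data/books.csv'
  FILE_FORMAT = (type = 'CSV');
```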
@@ -38,7 +38,7 @@ COPY INTO books FROM 'https://datafuse-1253727613.cos.ap-hongkong.myqcloud.com/d

:::tip

-The command can also load data from multiple files that are sequentially named. See [COPY INTO `<table>` FROM REMOTE FILES](../30-reference/30-sql/10-dml/dml-copy-into-table-url.md) for details.
+The command can also load data from multiple files that are sequentially named. See [COPY INTO `<table>`](../30-reference/30-sql/10-dml/dml-copy-into-table.md) for details.

:::
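The sequential naming the tip mentions works through glob patterns in the URL. A sketch with hypothetical file names:

```sql
-- Loads books_2006.csv, books_2007.csv, and books_2008.csv in one command.
-- The host and path are placeholders.
COPY INTO books
  FROM 'https://example.com/data/books_200{6,7,8}.csv'
  FILE_FORMAT = (type = 'CSV');
```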
@@ -52,4 +52,4 @@ SELECT * FROM books;

| Transaction Processing | Jim Gray | 1992 |
| Readings in Database Systems | Michael Stonebraker | 2004 |
docs/doc/30-reference/30-sql/10-dml/dml-copy-into-table.md (63 additions, 9 deletions)
@@ -1,8 +1,6 @@
---
-title: 'COPY INTO <table> FROM STAGED FILES'
-sidebar_label: 'COPY INTO <table> FROM STAGED FILES'
-description:
-  'Loads data from staged files'
+title: 'COPY INTO <table>'
+sidebar_label: 'COPY INTO <table>'
---

`COPY` moves data between Databend tables and object storage systems (AWS S3 compatible object storage services and Azure Blob storage).
@@ -11,9 +9,7 @@ This command loads data into a table from files staged in one of the following l

* Named internal stage, files can be staged using the [PUT to Stage](../../00-api/10-put-to-stage.md).
* Named external stage that references an external location (AWS S3 compatible object storage services and Azure Blob storage).
-* External location. This includes AWS S3 compatible object storage services and Azure Blob storage.
-
-`COPY` can also load data into a table from one or more remote files by their URL. See [COPY INTO \<table\> FROM REMOTE FILES](dml-copy-into-table-url.md).
+* External location. This includes AWS S3 compatible object storage services, Azure Blob storage, Google Cloud Storage, and Huawei OBS.
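As a sketch of the first of these options, a load from a named internal stage might look like this; the stage and table names are hypothetical, and the files are assumed to have been uploaded with PUT to Stage first:

```sql
-- Load the staged CSV files from the named internal stage into the table.
-- @my_internal_stage and mytable are illustrative names.
COPY INTO mytable
  FROM @my_internal_stage
  FILE_FORMAT = (type = 'CSV');
```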
| REGION | AWS region name. For example, us-east-1. | Optional |
| ENABLE_VIRTUAL_HOST_STYLE | If you use virtual hosting to address the bucket, set it to "true". | Optional |

-Azure Blob storage:
+**Azure Blob storage**

```sql
externalLocation ::=
@@ -84,6 +80,53 @@ externalLocation ::=

| ACCOUNT_NAME | Your account name for connecting the Azure Blob storage. If not provided, Databend will access the container anonymously. | Optional |
| ACCOUNT_KEY | Your account key for connecting the Azure Blob storage. | Optional |

+| `gcs://<bucket>[<path>]` | External files located at the Google Cloud Storage | Required |
+| ENDPOINT_URL | The container endpoint URL starting with "https://". To use a URL starting with "http://", set `allow_insecure` to `true` in the [storage] block of the file `databend-query-node.toml`. | Optional |
+| CREDENTIAL | Your credential for connecting the GCS. If not provided, Databend will access the container anonymously. | Optional |

+| `obs://<bucket>[<path>]` | External files located at the OBS | Required |
+| ENDPOINT_URL | The container endpoint URL starting with "https://". To use a URL starting with "http://", set `allow_insecure` to `true` in the [storage] block of the file `databend-query-node.toml`. | Optional |
+| ACCESS_KEY_ID | Your access key ID for connecting the OBS. If not provided, Databend will access the bucket anonymously. | Optional |
+| SECRET_ACCESS_KEY | Your secret access key for connecting the OBS. | Optional |
+
+**HTTP**
+
+```sql
+externalLocation ::=
+  'https://<url>'
+```
+
+In particular, HTTP locations support glob patterns. For example, use
+
+- `ontime_200{6,7,8}.csv` to represent `ontime_2006.csv`, `ontime_2007.csv`, and `ontime_2008.csv`.
+- `ontime_200[6-8].csv` to represent `ontime_2006.csv`, `ontime_2007.csv`, and `ontime_2008.csv`.

### FILES = ( 'file_name' [ , 'file_name' ... ] )
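A sketch of a COPY that uses one of these glob patterns over an HTTP location; the host, path, and table name are placeholders:

```sql
-- The pattern expands to ontime_2006.csv, ontime_2007.csv, and ontime_2008.csv.
COPY INTO ontime
  FROM 'https://example.com/dataset/ontime_200[6-8].csv'
  FILE_FORMAT = (type = 'CSV');
```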
@@ -246,4 +289,15 @@ COPY INTO mytable

ACCOUNT_NAME ='<account_name>'
ACCOUNT_KEY ='<account_key>'
)
+FILE_FORMAT = (type ='CSV');
+```
+
+**HTTP**
+
+This example reads data from a CSV file and inserts it into a table:
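A minimal sketch of such a statement, consistent with the syntax above; the table name and URL are placeholders, not the file's actual example:

```sql
-- Load a single remote CSV file into a table by its HTTP URL.
COPY INTO mytable
  FROM 'https://example.com/data/mydata.csv'
  FILE_FORMAT = (type = 'CSV');
```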
0 commit comments