Commit de9c458

DOC-4832 RS: Added REST API examples to import databases (#1545)
1 parent 532727b commit de9c458

File tree

1 file changed: +135 −0 lines

content/operate/rs/databases/import-export/import-data.md

@@ -23,6 +23,8 @@ Importing data erases all existing content in the database.
## Import data into a database

### Cluster Manager UI method

To import data into a database using the Cluster Manager UI:

1. On the **Databases** screen, select the database from the list, then select **Configuration**.
@@ -33,6 +35,41 @@ To import data into a database using the Cluster Manager UI:
    See [Supported storage locations](#supported-storage-services) for more information about each storage location type.

1. Select **Import**.

### REST API method

To import data into a database using the REST API, send an [import database request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "<location-type>",
            // additional fields, depending on the location type
        },
        {
            "type": "<location-type>",
            // additional fields, depending on the location type
        }
    ]
}
```

- Replace `<database-id>` with the destination database's ID.

- Replace each data source's `<location-type>` with the relevant value from the following table:

| Location type | `"type"` value |
|---------------|----------------|
| FTPS | `"url"` |
| SFTP | `"sftp"` |
| Amazon S3 | `"s3"` |
| Google Cloud Storage | `"gs"` |
| Microsoft Azure Storage | `"abs"` |
| NAS/Local Storage | `"mount_point"` |

See the following storage location sections for REST API request examples for each location type.
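These requests can be sent with any HTTP client. As a sketch, using `curl` with placeholder values throughout (this assumes the REST API's default port 9443 and basic authentication; `-k` skips certificate verification, which you may need if the cluster still uses its self-signed certificate):

```sh
curl -k -u "<username>:<password>" \
     -H "Content-Type: application/json" \
     -X POST "https://<cluster-fqdn>:9443/v1/bdbs/<database-id>/actions/import" \
     -d '{
           "dataset_import_sources": [
             {
               "type": "mount_point",
               "path": "/<path>/<filename>.rdb.gz"
             }
           ]
         }'
```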
## Supported storage locations {#supported-storage-services}

Data can be imported from a local mount point, transferred to [a URI](https://en.wikipedia.org/wiki/Uniform_Resource_Identifier) using FTP/SFTP, or stored on cloud provider storage.
@@ -70,6 +107,20 @@ Example: `ftp://username:password@10.1.1.1/home/backups/<filename>.rdb`
Select **Add path** to add another import file path.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "url",
            "url": "ftp://<ftp_user>:<ftp_password>@example.com/<path>/<filename>.rdb.gz"
        }
    ]
}
```
### Local mount point

Before importing data from a local mount point, make sure that:
@@ -100,6 +151,20 @@ As of version 6.2.12, Redis Enterprise reads files directly from the mount point
Select **Add path** to add another import file path.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "mount_point",
            "path": "/<path>/<filename>.rdb.gz"
        }
    ]
}
```
### SFTP server

Before importing data from an SFTP server, make sure that:
@@ -138,6 +203,20 @@ Example: `sftp://username:password@10.1.1.1/home/backups/[filename].rdb`
Select **Add path** to add another import file path.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "sftp",
            "sftp_url": "sftp://<sftp_user>@example.com/<path>/<filename>.rdb"
        }
    ]
}
```
### AWS Simple Storage Service {#aws-s3}

Before you choose to import data from an [Amazon Web Services](https://aws.amazon.com/) (AWS) Simple Storage Service (S3) bucket, make sure you have:
@@ -175,6 +254,24 @@ To connect to an S3-compatible storage location:
Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "s3",
            "bucket_name": "backups",
            "subdir": "test-db",
            "filename": "<filename>.rdb",
            "access_key_id": "XXXXXXXXXXXXX",
            "secret_access_key": "XXXXXXXXXXXXXXXX"
        }
    ]
}
```
### Google Cloud Storage

Before you import data from a [Google Cloud](https://developers.google.com/console/) storage bucket, make sure you have:
@@ -198,6 +295,26 @@ In the Redis Enterprise Software Cluster Manager UI, when you enter the import l
- In the **Private key** field, enter the `private_key` from the service account key.
  Replace `\n` with new lines.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "gs",
            "bucket_name": "backups",
            "client_id": "XXXXXXXX",
            "client_email": "cloud-storage-client@my-project-id.iam.gserviceaccount.com",
            "subdir": "test-db",
            "filename": "<filename>.rdb",
            "private_key_id": "XXXXXXXXXXXXX",
            "private_key": "XXXXXXXXXXXXXXXX"
        }
    ]
}
```
### Azure Blob Storage

Before you choose to import from Azure Blob Storage, make sure that you have:
@@ -220,6 +337,24 @@ In the Redis Enterprise Software Cluster Manager UI, when you enter the import l
- In the **Azure Account Key** field, enter the storage account key.

Example [import database REST API request]({{<relref "/operate/rs/references/rest-api/requests/bdbs/actions/import">}}):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "abs",
            "container": "backups",
            "subdir": "test-db",
            "filename": "<filename>.rdb",
            "account_name": "name",
            "account_key": "XXXXXXXXXXXXXXXX" // Or you can use "sas_token": "XXXXXXXXXXXXXXXXXX" instead
        }
    ]
}
```
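`dataset_import_sources` is an array, so a single import request can list several files, and the location types can differ, as the generic example under the REST API method section shows. A hypothetical request that imports one file from a local mount point and another from S3 (placeholder values throughout):

```sh
POST /v1/bdbs/<database-id>/actions/import
{
    "dataset_import_sources": [
        {
            "type": "mount_point",
            "path": "/<path>/<filename1>.rdb.gz"
        },
        {
            "type": "s3",
            "bucket_name": "backups",
            "subdir": "test-db",
            "filename": "<filename2>.rdb",
            "access_key_id": "XXXXXXXXXXXXX",
            "secret_access_key": "XXXXXXXXXXXXXXXX"
        }
    ]
}
```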
## Importing into an Active-Active database

When importing data into an Active-Active database, there are two options:
