diff --git a/api-reference/v2/general/limits.mdx b/api-reference/v2/general/limits.mdx
new file mode 100644
index 0000000..5dc4ac7
--- /dev/null
+++ b/api-reference/v2/general/limits.mdx
@@ -0,0 +1,18 @@
+---
+title: Limits
+description: 'Payload and row limits for the Glide API'
+---
+
+## Payload Limits
+
+You should not send more than 15MB of data in a single request. If you need to work with more data, use [stashing](/api-reference/v2/stashing/introduction) to upload the data in 15MB chunks.
+
+## Row Limits
+
+Even when using stashing, there are limits to the number of rows you can work with in a single request. These limits are approximate and depend on the size of the rows in your dataset.
+
+| Endpoint                                                      | Row Limit |
+|---------------------------------------------------------------|-----------|
+| [Create Table](/api-reference/v2/tables/post-tables)          | 8 million |
+| [Overwrite Table](/api-reference/v2/tables/put-tables)        | 8 million |
+| [Add Rows to Table](/api-reference/v2/tables/post-table-rows) | 250,000   |
\ No newline at end of file
diff --git a/api-reference/v2/general/rate-limits.mdx b/api-reference/v2/general/rate-limits.mdx
deleted file mode 100644
index 335194b..0000000
--- a/api-reference/v2/general/rate-limits.mdx
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: Rate Limits
-description: 'Various rate and operational limits for the Glide API'
----
-
-## Rate Limits
-
-TODO
-
-## Payload Limits
-
-TODO
\ No newline at end of file
diff --git a/api-reference/v2/resources/changelog.mdx b/api-reference/v2/resources/changelog.mdx
index 6aad092..4c3e3db 100644
--- a/api-reference/v2/resources/changelog.mdx
+++ b/api-reference/v2/resources/changelog.mdx
@@ -3,6 +3,12 @@ title: Glide API Changelog
 sidebarTitle: Changelog
 ---
 
+### September 13, 2024
+
+- Introduced a new "Limits" document that outlines payload and row limits for the API.
+- Updated the guidelines for when to use stashing in line with the new document.
+- Fixed the Bulk Import tutorial to use PUT instead of POST for the Stash Data endpoint.
+
 ### September 4, 2024
 
 - Removed "json" as a valid data type in column schemas for now.
diff --git a/api-reference/v2/stashing/introduction.mdx b/api-reference/v2/stashing/introduction.mdx
index ffecf0b..283bdd9 100644
--- a/api-reference/v2/stashing/introduction.mdx
+++ b/api-reference/v2/stashing/introduction.mdx
@@ -13,9 +13,9 @@ Once all data has been uploaded to the stash, the stash can then be referenced i
 
 ## When to Use Stashing
 
-You should use stashing when:
+You should use stashing when both of the following conditions are met:
 
-* You have a large dataset that you want to upload to Glide. Anything larger than 5mb should be broken up into smaller chunks and stashed.
+* You have a large dataset that you want to upload to Glide. Anything larger than [15MB](/api-reference/v2/general/limits) should be broken up into smaller chunks and stashed.
 * You want to perform an atomic operation using a large dataset. For example, you may want to perform an import of data into an existing table but don't want users to see the intermediate state of the import or incremental updates while they're using their application.
 
 ## Stash IDs and Serials
diff --git a/api-reference/v2/tutorials/bulk-import.mdx b/api-reference/v2/tutorials/bulk-import.mdx
index 4b5ce82..cc7b20d 100644
--- a/api-reference/v2/tutorials/bulk-import.mdx
+++ b/api-reference/v2/tutorials/bulk-import.mdx
@@ -36,24 +36,24 @@ You are responsible for ensuring that the stash ID is unique and stable across a
 
 ## Upload Data
 
-Once you have a stable stash ID, you can use the [stash data endpoint](/api-reference/v2/stashing/post-stashes-serial) to upload the data in stages.
+Once you have a stable stash ID, you can use the [stash data endpoint](/api-reference/v2/stashing/put-stashes-serial) to upload the data in chunks.
 
-Upload stages can be run in parallel to speed up the upload of large dataset, just be sure to use the same stash ID across uploads to ensure the final data set is complete.
+Chunks can be sent in parallel to speed up the upload of large datasets. Use the same stash ID across uploads to ensure the final dataset is complete, and use the serial to control the order of the chunks within the stash.
 
-As an example, the following [stash](/api-reference/v2/stashing/post-stashes-serial) requests will create a final dataset consisting of the two rows identified by the stash ID `20240501-import`.
+As an example, the following [stash](/api-reference/v2/stashing/put-stashes-serial) requests will create a final dataset consisting of the two rows identified by the stash ID `20240501-import`. The trailing parameters of `1` and `2` in the request path are the serial IDs. The data in serial `1` will come first in the stash, and the data in serial `2` will come second, even if the requests are processed in a different order.
 
 <CodeGroup>
 
-```json POST /stashes/20240501-import/1
+```json PUT /stashes/20240501-import/1
 [
   {
     "Name": "Alex",
     "Age": 30,
     "Birthday": "2024-07-03T10:24:08.285Z"
   }
 ]
 ```
 
-```json POST /stashes/20240501-import/2
+```json PUT /stashes/20240501-import/2
 [
   {
@@ -67,7 +67,7 @@ As an example, the following [stash](/api-reference/v2/stashing/post-stashes-ser
 
 </CodeGroup>
 
-The trailing parameters of `1` and `2` in the request path are the serial IDs, which distinguish and order the two uploads within the stash.
+The above is just an example. In practice, each stash chunk should contain many rows (up to the 15MB payload limit), and if your complete dataset is only two rows, you do not need to use stashing at all. See [Limits](/api-reference/v2/general/limits) for guidance.
 
 ## Finalize Import
 
diff --git a/mint.json b/mint.json
index a0b445a..4ab87bb 100644
--- a/mint.json
+++ b/mint.json
@@ -27,7 +27,8 @@
       "pages": [
         "api-reference/v2/general/introduction",
         "api-reference/v2/general/authentication",
-        "api-reference/v2/general/errors"
+        "api-reference/v2/general/errors",
+        "api-reference/v2/general/limits"
       ]
     },
     {
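For readers wiring up the chunked upload flow this change documents, here is a minimal TypeScript sketch. It is an illustration, not part of the diff: the base URL `https://api.glideapps.com`, the Bearer-token header, the `GLIDE_TOKEN` environment variable, and the `stashRows`/`toChunks` helpers are assumptions layered on the documented `PUT /stashes/{stashID}/{serial}` path and the 15MB payload limit; consult the endpoint reference for the authoritative request shapes.

```typescript
// Sketch only: upload a large dataset to a stash in chunks that stay under
// the 15MB payload limit, sending the chunks in parallel with ordered serials.
// Assumptions: Node 18+ (global fetch), Bearer auth, base URL as below.

const BASE_URL = "https://api.glideapps.com"; // assumed v2 API host
const TOKEN = process.env.GLIDE_TOKEN ?? ""; // assumed token env var

type Row = Record<string, unknown>;

const MAX_CHUNK_BYTES = 15 * 1024 * 1024; // payload limit from the Limits doc

// Greedily pack rows into chunks whose serialized size stays under the limit.
// JSON.stringify length approximates the wire size, which is acceptable
// because the limit itself is approximate.
function* toChunks(rows: Row[]): Generator<Row[]> {
  let chunk: Row[] = [];
  let bytes = 2; // account for "[" and "]"
  for (const row of rows) {
    const rowBytes = JSON.stringify(row).length + 1; // +1 for the comma
    if (chunk.length > 0 && bytes + rowBytes > MAX_CHUNK_BYTES) {
      yield chunk;
      chunk = [];
      bytes = 2;
    }
    chunk.push(row);
    bytes += rowBytes;
  }
  if (chunk.length > 0) yield chunk;
}

// PUT each chunk to /stashes/{stashID}/{serial}. Requests run in parallel;
// the serial (1, 2, 3, ...) fixes the order of the chunks within the stash,
// so completion order does not matter.
async function stashRows(stashID: string, rows: Row[]): Promise<void> {
  const uploads: Promise<Response>[] = [];
  let serial = 1;
  for (const chunk of toChunks(rows)) {
    uploads.push(
      fetch(`${BASE_URL}/stashes/${stashID}/${serial++}`, {
        method: "PUT",
        headers: {
          Authorization: `Bearer ${TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify(chunk),
      }),
    );
  }
  for (const response of await Promise.all(uploads)) {
    if (!response.ok) {
      throw new Error(`Chunk upload failed with status ${response.status}`);
    }
  }
}
```

Once every chunk has succeeded, the stash ID (for example `20240501-import`) can be referenced from the table creation or overwrite request, as in the tutorial's Finalize Import step, so the table's users never see a partially imported state.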