diff --git a/api-reference/v2/general/errors.mdx b/api-reference/v2/general/errors.mdx
index 00cc789..b9f10a6 100644
--- a/api-reference/v2/general/errors.mdx
+++ b/api-reference/v2/general/errors.mdx
@@ -52,4 +52,43 @@ curl --request PUT \
"message": "Invalid request params: Stash ID must be 256 characters max, alphanumeric with dashes and underscores, no leading dash or underscore"
}
}
- ```
\ No newline at end of file
+ ```
+
+### Invalid Row Data
+
+When adding or updating rows in a table, if the row data does not match the table schema, the API will return a `422` response status.
+
+#### Unknown Column
+
+```json
+{
+ "error": {
+ "type": "column_id_not_found",
+ "message": "Unknown column ID 'foo'"
+ }
+}
+```
+
+#### Invalid Value for Column
+
+```json
+{
+ "error": {
+ "type": "column_has_invalid_value",
+ "message": "Invalid value for column 'foo'"
+ }
+}
+```
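+
+For illustration, a request like the following would be rejected with the `column_id_not_found` error shown above, because it references a column `foo` that is not part of the table's schema (hypothetical table ID and token, assuming the standard `https://api.glideapp.io` base URL):
+
+```bash
+curl --request POST \
+  --url 'https://api.glideapp.io/tables/my-table-id/rows' \
+  --header 'Authorization: Bearer <token>' \
+  --header 'Content-Type: application/json' \
+  --data '{
+    "rows": [
+      { "foo": "some value" }
+    ]
+  }'
+```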
+
+### Row Not Found
+
+When attempting to update a row that does not exist, the API will return a `404` response status.
+
+```json
+{
+ "error": {
+ "type": "row_not_found",
+ "message": "Row with ID 'XHz6kF2XSTGi1ADDbryjqw' not found"
+ }
+}
+```
\ No newline at end of file
diff --git a/api-reference/v2/resources/changelog.mdx b/api-reference/v2/resources/changelog.mdx
index b416e02..eb00ae7 100644
--- a/api-reference/v2/resources/changelog.mdx
+++ b/api-reference/v2/resources/changelog.mdx
@@ -3,6 +3,11 @@ title: Glide API Changelog
sidebarTitle: Changelog
---
+### December 13, 2024
+
+- Clarified that endpoints return row IDs in the same order as the input rows.
+- Clarified the requirements for row data to match the table's schema and what happens if it doesn't.
+
### November 26, 2024
- Added a warning that using the `PUT /tables` endpoint to overwrite a table will clear user-specific columns.
diff --git a/api-reference/v2/tables/delete-table-row.mdx b/api-reference/v2/tables/delete-table-row.mdx
index 1a72f4b..c31a147 100644
--- a/api-reference/v2/tables/delete-table-row.mdx
+++ b/api-reference/v2/tables/delete-table-row.mdx
@@ -3,4 +3,6 @@ title: Delete Row
openapi: delete /tables/{tableID}/rows/{rowID}
---
-Deletes a row in a Big Table. No error is returned if the row does not exist.
+Deletes a row in a Big Table.
+
+No error is returned if the row does not exist.
diff --git a/api-reference/v2/tables/patch-table-row.mdx b/api-reference/v2/tables/patch-table-row.mdx
index 661ff69..777c17a 100644
--- a/api-reference/v2/tables/patch-table-row.mdx
+++ b/api-reference/v2/tables/patch-table-row.mdx
@@ -3,4 +3,6 @@ title: Update Row
openapi: patch /tables/{tableID}/rows/{rowID}
---
-Updates an existing row in a Big Table.
\ No newline at end of file
+Updates an existing row in a Big Table.
+
+If a column is not included in the passed row data, it will not be updated. If a column is passed that does not exist in the table schema, or with a value that does not match the column's type, the default behavior is for no update to be made and the API call to [return an error](/api-reference/v2/general/errors#invalid-row-data). However, you can control this behavior with the `onSchemaError` query parameter.
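+
+As a minimal sketch (hypothetical table ID, token, and column name, assuming the standard `https://api.glideapp.io` base URL and a flat object of column values as the request body), a partial update that only changes the `status` column might look like this; `onSchemaError` can be appended to the same URL as a query parameter:
+
+```bash
+curl --request PATCH \
+  --url 'https://api.glideapp.io/tables/my-table-id/rows/XHz6kF2XSTGi1ADDbryjqw' \
+  --header 'Authorization: Bearer <token>' \
+  --header 'Content-Type: application/json' \
+  --data '{ "status": "Done" }'
+# Columns omitted from the body (everything except "status") are left unchanged.
+```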
\ No newline at end of file
diff --git a/api-reference/v2/tables/post-table-rows.mdx b/api-reference/v2/tables/post-table-rows.mdx
index 47899ca..d55d046 100644
--- a/api-reference/v2/tables/post-table-rows.mdx
+++ b/api-reference/v2/tables/post-table-rows.mdx
@@ -3,9 +3,11 @@ title: Add Rows to Table
openapi: post /tables/{tableID}/rows
---
-Add row data to an existing Big Table.
+Add one or more rows to an existing Big Table.
-Row data may be passed in JSON, CSV, or TSV format.
+Row IDs for the added rows are returned in the response in the same order as the rows appear in the request. Row data may be passed in JSON, CSV, or TSV format.
+
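+For example, a single call that adds two rows returns two row IDs in that same order (hypothetical table ID, token, and column name, assuming the standard `https://api.glideapp.io` base URL):
+
+```bash
+curl --request POST \
+  --url 'https://api.glideapp.io/tables/my-table-id/rows' \
+  --header 'Authorization: Bearer <token>' \
+  --header 'Content-Type: application/json' \
+  --data '{
+    "rows": [
+      { "name": "Alice" },
+      { "name": "Bob" }
+    ]
+  }'
+# The first row ID in the response corresponds to "Alice", the second to "Bob".
+```
+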
+If a column is not included in the passed row data, it will be empty in the added row. If a column is passed that does not exist in the table schema, or with a value that does not match the column's type, the default behavior is for no rows to be added and the API call to [return an error](/api-reference/v2/general/errors#invalid-row-data). However, you can control this behavior with the `onSchemaError` query parameter.
## Examples
@@ -31,7 +33,7 @@ Row data may be passed in JSON, CSV, or TSV format.
- [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable, pieces and [upload them to a single stash ID](/api-reference/v2/stashing/post-stashes-serial).
+ [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable pieces and [upload them to a single stash ID](/api-reference/v2/stashing/put-stashes-serial).
Then, to add all the row data in a stash to the table in a single atomic operation, use the `$stashID` reference in the `rows` field instead of providing the data inline:
diff --git a/api-reference/v2/tables/post-tables.mdx b/api-reference/v2/tables/post-tables.mdx
index 1d2249c..8c40280 100644
--- a/api-reference/v2/tables/post-tables.mdx
+++ b/api-reference/v2/tables/post-tables.mdx
@@ -5,7 +5,11 @@ openapi: post /tables
Create a new Big Table, define its structure, and (optionally) populate it with data.
-When using a CSV or TSV request body, the name of the table must be passed as a query parameter and the schema of the table is inferred from the content. Alternatively, the CSV/TSV content may be [stashed](/api-reference/v2/stashing/introduction), and then the schema and name may be passed in the regular JSON payload.
+Row IDs for any added rows are returned in the response in the same order as the rows appear in the request. Row data may be passed in JSON, CSV, or TSV format.
+
+When using a CSV or TSV request body, the name of the table must be passed as a query parameter and the schema of the table is always inferred from the content. Alternatively, the CSV/TSV content may be [stashed](/api-reference/v2/stashing/introduction), and then the schema and name may be passed in the regular JSON payload.
+
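+As a sketch of the CSV flow (hypothetical table name and file, assuming the table name query parameter is `name`, a `text/csv` content type, and the standard `https://api.glideapp.io` base URL):
+
+```bash
+curl --request POST \
+  --url 'https://api.glideapp.io/tables?name=Invoices' \
+  --header 'Authorization: Bearer <token>' \
+  --header 'Content-Type: text/csv' \
+  --data-binary @invoices.csv
+# The table's schema is inferred from the CSV content; no schema is passed in the request.
+```
+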
+If a schema is passed in the payload, any passed row data must match that schema. If a column is not included in the passed row data, it will be empty in the added row. If a column is passed that does not exist in the schema, or with a value that does not match the column's type, the default behavior is for the table to not be created and the API call to [return an error](/api-reference/v2/general/errors#invalid-row-data). However, you can control this behavior with the `onSchemaError` query parameter.
## Examples
@@ -30,7 +34,7 @@ When using a CSV or TSV request body, the name of the table must be passed as a
However, this is only appropriate for relatively small initial datasets (around a few hundred rows or less, depending on schema complexity). If you need to work with a larger dataset you should utilize stashing.
- [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable, pieces and [upload them to a single stash ID](/api-reference/v2/stashing/post-stashes-serial).
+ [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable pieces and [upload them to a single stash ID](/api-reference/v2/stashing/put-stashes-serial).
Then, to create a table from a stash, you can use the `$stashID` reference in the `rows` field instead of providing the data inline:
diff --git a/api-reference/v2/tables/put-tables.mdx b/api-reference/v2/tables/put-tables.mdx
index 7da79dd..422724c 100644
--- a/api-reference/v2/tables/put-tables.mdx
+++ b/api-reference/v2/tables/put-tables.mdx
@@ -3,7 +3,11 @@ title: Overwrite Table
openapi: put /tables/{tableID}
---
-Overwrite an existing Big Table by clearing all rows and adding new data.
+Overwrite an existing Big Table by clearing all rows and (optionally) adding new data.
+
+Row IDs for any added rows are returned in the response in the same order as the rows appear in the request. Row data may be passed in JSON, CSV, or TSV format.
+
+If a column is not included in the passed row data, it will be empty in the added row. If a column is passed that does not exist in the updated table schema, or with a value that does not match the column's type, the default behavior is for no action to be taken and the API call to [return an error](/api-reference/v2/general/errors#invalid-row-data). However, you can control this behavior with the `onSchemaError` query parameter.
There is currently no way to supply values for user-specific columns in the API. Those columns will be cleared when using this endpoint.
@@ -44,7 +48,7 @@ When using a CSV or TSV request body, you cannot pass a schema. If you need to u
- [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable, pieces and [upload them to a single stash ID](/api-reference/v2/stashing/post-stashes-serial).
+ [Stashing](/api-reference/v2/stashing/introduction) is our process for handling the upload of large datasets. Break down your dataset into smaller, more manageable pieces and [upload them to a single stash ID](/api-reference/v2/stashing/put-stashes-serial).
Then, to reset a table's data from the stash, use the `$stashID` reference in the `rows` field instead of providing the data inline:
diff --git a/openapi/swagger.json b/openapi/swagger.json
index fb4fcc2..e4da6e5 100644
--- a/openapi/swagger.json
+++ b/openapi/swagger.json
@@ -151,12 +151,12 @@
"type": "array",
"items": {
"type": "string",
- "description": "ID of the row, e.g., `2a1bad8b-cf7c-44437-b8c1-e3782df6`",
- "example": "2a1bad8b-cf7c-44437-b8c1-e3782df6"
+ "description": "ID of the row, e.g., `zcJWnyI8Tbam21V34K8MNA`",
+ "example": "zcJWnyI8Tbam21V34K8MNA"
},
- "description": "Row IDs of added rows, e.g., \n\n```json\n[\n\t\"2a1bad8b-cf7c-44437-b8c1-e3782df6\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
+ "description": "Row IDs of added rows, returned in the same order as the input rows, e.g., \n\n```json\n[\n\t\"zcJWnyI8Tbam21V34K8MNA\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
"example": [
- "2a1bad8b-cf7c-44437-b8c1-e3782df6",
+ "zcJWnyI8Tbam21V34K8MNA",
"93a19-cf7c-44437-b8c1-e9acbbb"
]
}
@@ -517,12 +517,12 @@
"type": "array",
"items": {
"type": "string",
- "description": "ID of the row, e.g., `2a1bad8b-cf7c-44437-b8c1-e3782df6`",
- "example": "2a1bad8b-cf7c-44437-b8c1-e3782df6"
+ "description": "ID of the row, e.g., `zcJWnyI8Tbam21V34K8MNA`",
+ "example": "zcJWnyI8Tbam21V34K8MNA"
},
- "description": "Row IDs of added rows, e.g., \n\n```json\n[\n\t\"2a1bad8b-cf7c-44437-b8c1-e3782df6\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
+ "description": "Row IDs of added rows, returned in the same order as the input rows, e.g., \n\n```json\n[\n\t\"zcJWnyI8Tbam21V34K8MNA\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
"example": [
- "2a1bad8b-cf7c-44437-b8c1-e3782df6",
+ "zcJWnyI8Tbam21V34K8MNA",
"93a19-cf7c-44437-b8c1-e9acbbb"
]
}
@@ -1135,12 +1135,12 @@
"type": "array",
"items": {
"type": "string",
- "description": "ID of the row, e.g., `2a1bad8b-cf7c-44437-b8c1-e3782df6`",
- "example": "2a1bad8b-cf7c-44437-b8c1-e3782df6"
+ "description": "ID of the row, e.g., `zcJWnyI8Tbam21V34K8MNA`",
+ "example": "zcJWnyI8Tbam21V34K8MNA"
},
- "description": "Row IDs of added rows, e.g., \n\n```json\n[\n\t\"2a1bad8b-cf7c-44437-b8c1-e3782df6\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
+ "description": "Row IDs of added rows, returned in the same order as the input rows, e.g., \n\n```json\n[\n\t\"zcJWnyI8Tbam21V34K8MNA\",\n\t\"93a19-cf7c-44437-b8c1-e9acbbb\"\n]\n```",
"example": [
- "2a1bad8b-cf7c-44437-b8c1-e3782df6",
+ "zcJWnyI8Tbam21V34K8MNA",
"93a19-cf7c-44437-b8c1-e9acbbb"
]
}
@@ -1513,8 +1513,8 @@
"in": "path",
"schema": {
"type": "string",
- "description": "ID of the row, e.g., `2a1bad8b-cf7c-44437-b8c1-e3782df6`",
- "example": "2a1bad8b-cf7c-44437-b8c1-e3782df6"
+ "description": "ID of the row, e.g., `zcJWnyI8Tbam21V34K8MNA`",
+ "example": "zcJWnyI8Tbam21V34K8MNA"
},
"required": true
},
@@ -1653,8 +1653,8 @@
"in": "path",
"schema": {
"type": "string",
- "description": "ID of the row, e.g., `2a1bad8b-cf7c-44437-b8c1-e3782df6`",
- "example": "2a1bad8b-cf7c-44437-b8c1-e3782df6"
+ "description": "ID of the row, e.g., `zcJWnyI8Tbam21V34K8MNA`",
+ "example": "zcJWnyI8Tbam21V34K8MNA"
},
"required": true
}