Replies: 8 comments 18 replies
-
Thank you for the suggestion! It's a fantastic choice for our translation needs, and I'm excited to implement it. You can create an issue from this discussion to keep track of this requirement.
-
I initially implemented the ChatGPT batch translation feature based on OpenAI's API documentation (commit d90fcbf). Since I do not have access to the actual API, the functionality may contain faults. Ongoing feedback is needed to improve the feature's usability.
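For anyone testing, here is a minimal sketch of the two calls involved, per OpenAI's batch documentation: upload the jsonl request file, then create the batch. I'm using the requests library for illustration; the plugin's actual HTTP layer, names, and error handling may differ.

```python
import requests

def submit_batch(jsonl_payload: bytes, api_key: str) -> str:
    """Upload a prepared JSONL request file and start a batch.
    Error checking is elided for brevity."""
    headers = {"Authorization": f"Bearer {api_key}"}
    # Step 1: upload the request file with purpose="batch".
    file_id = requests.post(
        "https://api.openai.com/v1/files", headers=headers,
        files={"file": ("requests.jsonl", jsonl_payload)},
        data={"purpose": "batch"},
    ).json()["id"]
    # Step 2: create the batch referencing the uploaded file.
    batch = requests.post(
        "https://api.openai.com/v1/batches", headers=headers,
        json={"input_file_id": file_id,
              "endpoint": "/v1/chat/completions",
              "completion_window": "24h"},
    ).json()
    return batch["id"]  # cache this id to poll for results later
```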
-
Thank you for your information and details. I tried to fix the error and uploaded the revised version here. Feel free to give me further feedback.
-
You likely need to create a new batch. I updated the code. Please click the "Cancel Batch Translation" button to remove the old batch identifier from the cache, and then recreate a batch to try again.
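For reference, cancelling goes through the documented cancel endpoint; the cache handling below is a hypothetical stand-in for however the plugin actually stores the batch id.

```python
import requests

def cancel_and_forget_batch(batch_id: str, cache: dict, api_key: str) -> None:
    """Cancel a stale batch and drop its id from the local cache so a
    fresh batch can be created on the next attempt."""
    requests.post(
        f"https://api.openai.com/v1/batches/{batch_id}/cancel",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    cache.pop("batch_id", None)  # "batch_id" key is illustrative
```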
-
I used the wrong endpoint to retrieve the batch translation result. It is now fixed. Please try again.
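The correct flow, per the API reference, is to poll the batch object and then download its output file; here is a rough sketch (requests library, no error handling):

```python
import requests

def fetch_batch_output(batch_id: str, api_key: str):
    headers = {"Authorization": f"Bearer {api_key}"}
    # Poll the batch object first; results are not on the batch itself.
    batch = requests.get(
        f"https://api.openai.com/v1/batches/{batch_id}", headers=headers
    ).json()
    if batch.get("status") != "completed":
        return None  # still in progress (can take up to 24h)
    # The translations live in a separate output file.
    return requests.get(
        f"https://api.openai.com/v1/files/{batch['output_file_id']}/content",
        headers=headers,
    ).text
```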
-
The response was processed incorrectly; that is now fixed. Please try this version.
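For clarity, each line of the batch output file wraps a normal chat-completion response under response.body, keyed by custom_id. A minimal parsing sketch, assuming the raw jsonl text has already been downloaded:

```python
import json

def parse_batch_output(raw_jsonl: str) -> dict:
    """Map each custom_id back to its translated text."""
    results = {}
    for line in raw_jsonl.splitlines():
        if not line.strip():
            continue
        record = json.loads(line)
        body = record["response"]["body"]  # a standard chat completion
        results[record["custom_id"]] = body["choices"][0]["message"]["content"]
    return results
```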
-
Hi!
-
Would it be possible to split a source into several batches? I can only process 90,000 tokens per batch on my current tier (I don't know if more would be possible on a higher tier). If I put more tokens into a batch, it aborts immediately with an error during processing because the TPD limit per batch is exceeded. Or should I use a different model for translations than gpt-4o? Something like the sketch below is what I have in mind.
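A rough sketch of the splitting idea, using tiktoken to estimate prompt tokens; the 90,000 budget and the greedy packing are illustrative, not how the plugin works today:

```python
import json
import tiktoken

ENC = tiktoken.encoding_for_model("gpt-4o")
TOKEN_BUDGET = 90_000  # per-batch budget reported for this tier

def split_into_batches(request_lines):
    """Greedily pack JSONL request lines into batches whose estimated
    input-token totals stay under the budget (prompt side only)."""
    batches, current, used = [], [], 0
    for line in request_lines:
        body = json.loads(line)["body"]
        tokens = sum(len(ENC.encode(m["content"])) for m in body["messages"])
        if current and used + tokens > TOKEN_BUDGET:
            batches.append(current)
            current, used = [], 0
        current.append(line)
        used += tokens
    if current:
        batches.append(current)
    return batches
```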
-
Hi!
OpenAI offers a special batch API that is 50% cheaper than the normal (synchronous) API.
With the batch API, all requests are combined into a single jsonl file (where the JSON requests are newline-delimited). The reason the batch API is cheaper is that the response is not immediate: it can take up to 24h to process the batch.
https://platform.openai.com/docs/guides/batch
https://platform.openai.com/docs/api-reference/batch
Book translations seem to be a perfect fit for the batch API: very high token counts, and the results usually do not need to be immediate.
A single batch can contain up to 50,000 requests or up to 100 MB.
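To make the format concrete, here is what two newline-delimited requests in such a jsonl file could look like; the custom_id values, model, and prompts are purely illustrative:

```python
import json

# Two illustrative batch requests; each JSONL line is one chat-completion
# call, and custom_id is used to match responses back to requests.
requests = [
    {"custom_id": "para-1", "method": "POST", "url": "/v1/chat/completions",
     "body": {"model": "gpt-4o", "messages": [
         {"role": "user", "content": "Translate to French: Call me Ishmael."}]}},
    {"custom_id": "para-2", "method": "POST", "url": "/v1/chat/completions",
     "body": {"model": "gpt-4o", "messages": [
         {"role": "user", "content": "Translate to French: It was a pleasure to burn."}]}},
]
with open("book_requests.jsonl", "w", encoding="utf-8") as f:
    f.write("\n".join(json.dumps(r) for r in requests))
```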