Hi!
OpenAI offers a special batch API that is 50% cheaper than the normal (synchronous) API.
With the batch API, all requests are combined into a single JSONL file (where the JSON requests are newline-delimited). The reason the batch API is cheaper is that the response is not immediate: it can take up to 24 hours to process a batch.

https://platform.openai.com/docs/guides/batch
https://platform.openai.com/docs/api-reference/batch
Book translation seems to be a perfect fit for the batch API: very high token counts, and results usually do not need to be immediate.
A single batch can contain up to 50,000 requests or up to 100 MB.
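For illustration, here is a minimal sketch of what a batch submission could look like with the official `openai` Python SDK. The chapter texts, system prompt, and model name are placeholders, not anything this project does today:

```python
# Minimal sketch: submitting translation requests via the OpenAI Batch API.
import json
from openai import OpenAI

client = OpenAI()

chapters = ["First chapter text...", "Second chapter text..."]  # hypothetical input

# Build the newline-delimited JSON file: one chat-completion request per line.
with open("batch_input.jsonl", "w", encoding="utf-8") as f:
    for i, chapter in enumerate(chapters):
        request = {
            "custom_id": f"chapter-{i}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": "gpt-4o",  # placeholder model
                "messages": [
                    {"role": "system", "content": "Translate the following text to English."},
                    {"role": "user", "content": chapter},
                ],
            },
        }
        f.write(json.dumps(request, ensure_ascii=False) + "\n")

# Upload the file and create the batch; results arrive within the 24h window.
batch_file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
print(batch.id, batch.status)
```

Once the batch completes, the output file can be downloaded and each response matched back to its chapter via the `custom_id` field.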
Discussed in #318