
community: Add cost per 1K tokens for fine-tuned model cached input #29248

Merged
merged 2 commits into langchain-ai:master on Jan 16, 2025

Conversation

LuisGoCity
Contributor

@LuisGoCity LuisGoCity commented Jan 16, 2025

Description

  • The cost per 1K cached input tokens for a fine-tuned version of gpt-4o-mini-2024-07-18 is not available in the OpenAICallbackHandler, so it raises an error when making calls with that model.
  • This PR adds the price to the MODEL_COST_PER_1K_TOKENS dictionary.

cc. @efriis
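For context, a minimal sketch of the lookup pattern this PR extends. This is not the actual langchain-community source; the dictionary keys and prices below are illustrative placeholders, not real OpenAI rates, and `cost_for_tokens` is a hypothetical helper standing in for the callback handler's internal cost lookup.

```python
# Hypothetical, simplified version of the per-1K-token cost table used by
# the OpenAI callback handler. Keys and prices are illustrative only.
MODEL_COST_PER_1K_TOKENS = {
    "gpt-4o-mini-2024-07-18": 0.00015,          # standard input (placeholder)
    "gpt-4o-mini-2024-07-18-cached": 0.000075,  # cached input (placeholder)
    # Entry analogous to the one this PR adds (hypothetical key/price):
    "ft:gpt-4o-mini-2024-07-18-cached": 0.00015,
}

def cost_for_tokens(model_name: str, num_tokens: int) -> float:
    """Return the input cost for num_tokens of the given model.

    Raises ValueError when the model has no entry, mirroring the error
    the PR description reports for the missing fine-tuned cached key.
    """
    if model_name not in MODEL_COST_PER_1K_TOKENS:
        raise ValueError(f"Unknown model: {model_name}")
    return MODEL_COST_PER_1K_TOKENS[model_name] * (num_tokens / 1000)

print(cost_for_tokens("ft:gpt-4o-mini-2024-07-18-cached", 2000))
```

With the entry present, the lookup succeeds; without it, the same call would raise, which is the failure mode the PR fixes.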


vercel bot commented Jan 16, 2025

1 Skipped Deployment: langchain — ⬜️ Ignored (Jan 16, 2025 10:58am)

@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. community Related to langchain-community 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Jan 16, 2025
@ccurme ccurme self-assigned this Jan 16, 2025
@ccurme ccurme merged commit 75663f2 into langchain-ai:master Jan 16, 2025
19 checks passed
@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Jan 16, 2025
2 participants