
Azure OpenAI: Invalid parameter 'response_format' of type 'json_schema' appears after updating to 0.4.0 #7593

Closed
5 tasks done
KoreanThinker opened this issue Jan 25, 2025 · 5 comments · Fixed by #7596
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@KoreanThinker
Contributor

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import * as hub from "langchain/hub";
import azureOpenai from "@/config/azureOpenai"; // AzureChatOpenAI instance from project config

export default async function personalisationEvaluation(
  userProfile,
  question,
  answer,
) {
  // Pull the prompt from LangChain Hub and pipe it into the Azure OpenAI model
  const prompt = await hub.pull("evaluate-personalisation");
  const model = prompt.pipe(azureOpenai);

  const result = await model.invoke(
    {
      user_profile: userProfile,
      user_question: question,
      model_answer: answer,
    },
    { runName: "EvaluatePersonalisation" },
  );

  return {
    personalisationScore: result.score,
    personalisationExplanation: result.explanation,
  };
}
[screenshot]

Error Message and Stack Trace (if applicable)

Error: 400 Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs
at oL.generate (.next/server/app/api/chat-bot/callback/route.js:400:30952)
at uY.makeStatusError (.next/server/app/api/chat-bot/callback/route.js:400:44061)
at uY.makeRequest (.next/server/app/api/chat-bot/callback/route.js:400:45005)
at async (.next/server/app/api/chat-bot/callback/route.js:411:17970)
at async t._fn (.next/server/app/api/chat-bot/callback/route.js:912:254750) {
status: 400,
headers: [Object],
request_id: '3d372b97-b1a9-48bb-93cb-5346235e4f80',
error: [Object],
code: null,
param: null,
type: 'invalid_request_error',
attemptNumber: 1,
retriesLeft: 6
}

Description

When I use @langchain/openai version 0.3.17, everything works perfectly. However, after updating to 0.4.0, I consistently encounter the error: "Invalid parameter: 'response_format' of type 'json_schema'".

It seems likely that this issue is related to changes introduced in #7535.

Interestingly, prompts using the same model with only slight differences in output structure work without any problems. This inconsistency makes the issue even more puzzling.

[screenshot]
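For context on what changed on the wire, the two `response_format` values look roughly like this. This is a sketch following OpenAI's chat-completions API; the schema name and fields below are illustrative (inferred from the return shape in the code above), not taken from the actual request:

```typescript
// Pre-0.4.0 default: JSON mode, accepted by older Azure deployments.
const jsonMode = { type: "json_object" };

// New 0.4.0 default: structured outputs via json_schema, which some older
// Azure deployments reject with the 400 error shown above.
const jsonSchemaMode = {
  type: "json_schema",
  json_schema: {
    name: "personalisation_evaluation", // illustrative name
    strict: true,
    schema: {
      type: "object",
      properties: {
        score: { type: "number" },
        explanation: { type: "string" },
      },
      required: ["score", "explanation"],
      additionalProperties: false,
    },
  },
};

console.log(jsonMode.type);       // "json_object"
console.log(jsonSchemaMode.type); // "json_schema"
```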

System Info

node = v22.11.0
yarn = v1.22.22
"@langchain/community": "^0.3.27",
"@langchain/core": "^0.3.35",
"@langchain/google-genai": "^0.1.6",
"@langchain/openai": "^0.4.0",
"@langchain/pinecone": "^0.1.3",
"langchain": "^0.3.13",
"langsmith": "^0.3.3",


dosubot bot commented Jan 25, 2025

The error "Invalid parameter: 'response_format' of type 'json_schema'" occurs because, in LangChain.js version 0.4.0, the responseFormat parameter is expected to be either "content" or "content_and_artifact" [1]. The use of "json_schema" is no longer supported in this version, which is why you're encountering this error after updating.

To resolve this issue, you should update your code to use one of the supported responseFormat values. If your application relies on structured outputs, you may need to adjust your implementation to align with the new response format handling in version 0.4.0.




dosubot added the auto:bug label Jan 25, 2025
@jacoblee93
Collaborator

Hey @KoreanThinker, thanks for reporting - we switched the default method for structured output to jsonSchema, but it seems some of the older Azure deployments don't support it yet. Will revert it for Azure for now.
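For anyone stuck before the revert ships, one possible workaround is to select the structured-output method explicitly instead of relying on the new default. This is a sketch: it assumes the schema is bound via LangChain.js's `withStructuredOutput(schema, options)`, which is not shown in the original snippet, and the schema name is hypothetical:

```typescript
// Options that pin the structured-output method to function calling, so the
// request never sends `response_format: { type: "json_schema" }`.
// `name` is a hypothetical schema name; "jsonMode" would also avoid json_schema.
const structuredOutputOptions = {
  name: "personalisation_evaluation",
  method: "functionCalling",
};

// Hypothetical usage (azureOpenai and schema as in the issue's setup):
//   const structured = azureOpenai.withStructuredOutput(schema, structuredOutputOptions);
//   const result = await structured.invoke({ ... });

console.log(structuredOutputOptions.method); // "functionCalling"
```

Pinning `"@langchain/openai": "0.3.17"` in package.json is the other obvious stopgap until a fixed release lands.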

@jacoblee93
Collaborator

jacoblee93 commented Jan 25, 2025

And just confirming - you're using gpt-4o as your model right? I see it in your screenshot

@jacoblee93
Collaborator

Just shipped 0.4.1. Let me know if that fixes it for you!

@KoreanThinker
Contributor Author

Thank you for resolving the issue! Your quick response was incredibly helpful. Looking forward to more great updates in the future.
