langchain-openai==0.3.0

langchain-openai==0.3 implements two breaking changes:
Structured output

We update the default method parameter for ChatOpenAI(...).with_structured_output(method=<method>) from method="function_calling" to method="json_schema".

- "json_schema" (0.3 default) uses OpenAI's dedicated structured output feature to get a structured response.
- "function_calling" (0.2 default) uses function calling to get a structured response.

For schemas specified via TypedDict or JSON schema, strict schema validation is disabled by default but can be enabled by specifying strict=True.
Note: conceptually there is a difference between forcing a tool call and forcing a response format. Tool calls may produce more concise arguments than content generated to adhere to a schema. Prompts may need to be adjusted to recover desired behavior.
How to retain the 0.2 with_structured_output behavior after upgrading to 0.3

To restore the previous behavior, pass method="function_calling" to the with_structured_output calls you want to switch back.
Expected errors

- Models that don't support method="json_schema" (e.g., gpt-4 and gpt-3.5-turbo, currently the default model for ChatOpenAI) will raise an error unless method is explicitly specified. To recover the previous default, pass method="function_calling" into with_structured_output.
- Schemas specified via Pydantic BaseModel that have fields with non-null defaults or metadata (like min/max constraints) will raise an error. To recover the previous default, pass method="function_calling" into with_structured_output. See OpenAI's docs for supported schemas.
Optional parameters

We no longer implement non-null defaults for temperature, max_retries, and n, which are optional fields. In particular, we no longer specify a default temperature of 0.7.

The previous defaults can be set by specifying:

- temperature=0.7
- max_retries=2
- n=1