anthropic: `input_json_delta` tokens should not be passed to `handleLLMNewToken` #6936
Comments
Hey, @danielleontiev! I'm here to assist you with any bugs, questions, or contributions you have. Let's work together to solve this issue!

To resolve the issue, LangChain remaps the streaming-only content part type before constructing a final Anthropic message:

```typescript
if (contentPartCopy.type === "input_json_delta") {
  // `input_json_delta` type only represents yielding partial tool inputs
  // and is not a valid type for Anthropic messages.
  contentPartCopy.type = "tool_use";
}
```

This code snippet ensures that any `input_json_delta` content part is rewritten to a valid `tool_use` part.
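The remapping quoted above can be sketched as a standalone function (a minimal illustration; the `ContentPart` shape here is simplified and is not the actual Anthropic SDK or LangChain type):

```typescript
// Simplified stand-in for a streamed Anthropic content part
// (illustrative shape only; the real types are richer).
type ContentPart = {
  type: "text" | "tool_use" | "input_json_delta";
  input?: string;
};

// Remap the streaming-only `input_json_delta` type to `tool_use`,
// which is a valid type for a final Anthropic message.
function normalizeContentPart(part: ContentPart): ContentPart {
  const copy = { ...part };
  if (copy.type === "input_json_delta") {
    copy.type = "tool_use";
  }
  return copy;
}

console.log(normalizeContentPart({ type: "input_json_delta", input: '{"a":' }).type);
// → "tool_use"
```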
CC @bracesproul I think this is unintended
Hi, @danielleontiev. I'm Dosu, and I'm helping the LangChain JS team manage their backlog. I'm marking this issue as stale.

Issue Summary

Next Steps

Thank you for your understanding and contribution!
Keeping this open
Checked other resources
Example Code
Introduced here: #6179 (comment), the Anthropic chat model with tools would pass `input_json_delta` tokens to `handleLLMNewToken`, resulting in the unwanted tokens being propagated to the callbacks.

Output
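The effect on the consumer side can be illustrated with a small filter: a well-behaved `handleLLMNewToken` stream should only ever surface text tokens, never partial tool-input fragments (names and shapes here are illustrative, not the actual chat model internals):

```typescript
type StreamedChunk = {
  token: string;
  // Type of the Anthropic content part this token came from.
  partType: "text" | "input_json_delta";
};

// Collect only the tokens that a callback like handleLLMNewToken
// should receive, i.e. plain text tokens, dropping partial tool inputs.
function tokensForCallback(chunks: StreamedChunk[]): string[] {
  return chunks
    .filter((c) => c.partType !== "input_json_delta")
    .map((c) => c.token);
}

const chunks: StreamedChunk[] = [
  { token: "Hello", partType: "text" },
  { token: '{"location"', partType: "input_json_delta" },
  { token: ': "Paris"}', partType: "input_json_delta" },
];
// Only the plain text token survives the filter.
console.log(tokensForCallback(chunks));
```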
The last three tokens are wrong because they are part of Anthropic's `input_json_delta`. For example, OpenAI behaves differently:

Output
Interestingly enough, the behavior is not reproduced when using `.streamEvents` on the chat model, because argument tokens are returned as part of `input_json_delta` objects, but it is reproduced when using `.streamEvents` from the LangGraph agent, because `content` is returned as a simple `string` in that case.

Output (truncated)

Output (truncated)
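The difference between the two cases can be sketched as a type check on a chunk's `content`: when it is still an array of structured parts, the `input_json_delta` parts are identifiable and can be skipped, but once it has been flattened to a plain string (as in the LangGraph case), tool-argument fragments are indistinguishable from text (shapes simplified for illustration):

```typescript
type Part =
  | { type: "text"; text: string }
  | { type: "input_json_delta"; input: string };
type Content = string | Part[];

// Extract only the human-readable text from a chunk's content.
// With structured parts we can skip input_json_delta; with a plain
// string we cannot tell tool-argument fragments apart from text.
function visibleText(content: Content): string {
  if (typeof content === "string") {
    return content; // may already contain tool-argument fragments
  }
  return content
    .filter((p): p is Extract<Part, { type: "text" }> => p.type === "text")
    .map((p) => p.text)
    .join("");
}

console.log(
  visibleText([
    { type: "text", text: "Hi" },
    { type: "input_json_delta", input: '{"a":' },
  ])
); // → "Hi"
```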
I am not sure about the streaming behavior, but at least the callback behavior could be fixed by excluding `input` tokens here:

langchainjs/libs/langchain-anthropic/src/chat_models.ts, lines 167 to 174 in 660af3e
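The proposed fix amounts to a guard around the token emission, along these lines (a sketch using hypothetical local names, not the actual code in `chat_models.ts`):

```typescript
// Simplified stand-in for a streamed Anthropic delta
// (hypothetical shape for illustration).
type Delta = { type?: string; text?: string; partial_json?: string };

// Decide whether a streamed delta should be surfaced through
// handleLLMNewToken: only plain text deltas qualify; partial tool
// inputs (input_json_delta) are excluded.
function shouldEmitToken(delta: Delta): boolean {
  return delta.type !== "input_json_delta" && typeof delta.text === "string";
}

console.log(shouldEmitToken({ type: "text_delta", text: "Hello" })); // → true
console.log(shouldEmitToken({ type: "input_json_delta", partial_json: '{"x":' })); // → false
```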
Error Message and Stack Trace (if applicable)
No response
Description
"Argumnets" tokens should not appear in the callbacks and when streaming with LangGraph
System Info
`npm info langchain`