While streaming, AgentExecutor does not provide differentiation between LLM thoughts and final response #29405
Unanswered · vinay-netomi asked this question in Q&A
Example Code
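A minimal sketch of the kind of setup that reproduces this (the agent construction, tool, and prompt here are assumptions, not the exact original code):

```python
import asyncio

from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def word_length(word: str) -> int:
    """Return the number of characters in a word."""
    return len(word)


llm = ChatOpenAI(model="gpt-3.5-turbo", streaming=True)
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        ("human", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
agent = create_openai_tools_agent(llm, [word_length], prompt)
executor = AgentExecutor(agent=agent, tools=[word_length])


async def main() -> None:
    # Every token from ChatOpenAI arrives as an on_chat_model_stream event,
    # whether it belongs to an intermediate thought or to the final answer.
    async for event in executor.astream_events(
        {"input": "How many letters are in 'hello'?"}, version="v1"
    ):
        if event["event"] == "on_chat_model_stream":
            print(event["data"]["chunk"].content, end="", flush=True)


asyncio.run(main())
```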
Description
I am trying to stream the response from AgentExecutor using astream_events, but the emitted events carry no identifier that distinguishes an LLM thought from the final response. I have shared both events below. Is there any way to tell them apart and prevent the LLM thoughts from being streamed?
LLM Thought event:
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '822b6ca7-75fd-4279-9628-a9df383993ca', 'tags': ['seq:step:3'], 'metadata': {'user_id': '', 'tags': []}, 'data': {'chunk': AIMessageChunk(content='I')}}
Final Response Event:
System Info
langchain==0.1.9
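For what it's worth, one possible heuristic (a sketch continuing the setup above, and an assumption about OpenAI tools agents rather than a documented guarantee): the intermediate "thought" calls stream tool-call deltas whose chunk.content is empty, while the final answer streams plain text, so filtering on chunk.content suppresses most of the thoughts:

```python
async for event in executor.astream_events({"input": question}, version="v1"):
    if event["event"] == "on_chat_model_stream":
        chunk = event["data"]["chunk"]
        # Tool-call deltas (the "thoughts") usually stream with empty
        # content; plain text content usually belongs to the final answer.
        if chunk.content:
            print(chunk.content, end="", flush=True)
```

This may not hold for ReAct-style agents, where the thoughts themselves are plain text; there the stream would need to be buffered and cut at the output parser's final-answer marker instead.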